Dec 03 11:03:34 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 03 11:03:35 crc restorecon[4687]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to
system_u:object_r:container_file_t:s0:c336,c787 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 
11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 11:03:35 crc 
restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 
11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:35 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 11:03:36 crc restorecon[4687]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 11:03:36 crc restorecon[4687]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 11:03:36 crc restorecon[4687]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 03 11:03:36 crc kubenswrapper[4702]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 11:03:36 crc kubenswrapper[4702]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 03 11:03:36 crc kubenswrapper[4702]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 11:03:36 crc kubenswrapper[4702]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
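[Editor's note] The restorecon records above come in two shapes: "Relabeled PATH from CTX to CTX" means the file's SELinux context was rewritten to the policy default, while "PATH not reset as customized by admin to CTX" means restorecon left the file alone, typically because its current type (here container_file_t, carrying MCS category pairs such as c7,c13) is in SELinux's customizable_types set and is treated as a deliberate customization unless restorecon is forced with -F. A minimal sketch for summarizing a capture like this one; the parser and its usage are illustrative, and only the two message shapes are taken from the log itself:

```python
#!/usr/bin/env python3
"""Illustrative only: tally the two restorecon outcomes seen in this journal.

Assumes records shaped like the ones above:
  restorecon[PID]: Relabeled PATH from CTX to CTX
  restorecon[PID]: PATH not reset as customized by admin to CTX
"""
import re
import sys
from collections import Counter

RELABELED = re.compile(r"restorecon\[\d+\]: Relabeled (\S+) from (\S+) to (\S+)")
NOT_RESET = re.compile(r"restorecon\[\d+\]: (\S+) not reset as customized by admin to (\S+)")

def tally(lines):
    outcomes = Counter()   # relabeled vs. kept
    contexts = Counter()   # which target contexts appear
    for line in lines:
        if m := RELABELED.search(line):
            outcomes["relabeled"] += 1
            contexts[m.group(3)] += 1
        elif m := NOT_RESET.search(line):
            outcomes["kept (customized by admin)"] += 1
            contexts[m.group(2)] += 1
    return outcomes, contexts

if __name__ == "__main__":
    outcomes, contexts = tally(sys.stdin)
    for outcome, n in outcomes.most_common():
        print(f"{outcome}: {n}")
    for ctx, n in contexts.most_common(5):
        print(f"  {ctx}: {n}")
```

Fed a saved journal (for example, the output of journalctl piped into the script), it would show the bulk of these records keeping container_file_t:s0:c7,c13, matching the catalog-content volumes above.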
Dec 03 11:03:36 crc kubenswrapper[4702]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 03 11:03:36 crc kubenswrapper[4702]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.608511 4702 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614452 4702 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614482 4702 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614489 4702 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614497 4702 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614505 4702 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614512 4702 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614520 4702 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614526 4702 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614532 4702 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614538 4702 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614543 4702 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614549 4702 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614554 4702 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614559 4702 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614566 4702 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614571 4702 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614576 4702 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614582 4702 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614588 4702 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614593 4702 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614599 4702 feature_gate.go:330] unrecognized feature gate: 
EtcdBackendQuota Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614605 4702 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614612 4702 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614619 4702 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614625 4702 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614631 4702 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614663 4702 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614671 4702 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614678 4702 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614686 4702 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614692 4702 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614698 4702 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614705 4702 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614711 4702 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614719 4702 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614725 4702 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614732 4702 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614740 4702 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614746 4702 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614777 4702 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614784 4702 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614792 4702 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614799 4702 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614806 4702 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614812 4702 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614819 4702 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614825 4702 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614832 4702 feature_gate.go:330] unrecognized feature gate: Example Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614837 4702 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614843 4702 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614849 4702 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614855 4702 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614861 4702 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614866 4702 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614873 4702 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614878 4702 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614884 4702 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614890 4702 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614896 4702 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614902 4702 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614908 4702 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614913 4702 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614919 4702 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614926 4702 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
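Note: the "unrecognized feature gate" warnings above (feature_gate.go:330, continuing below) are benign in this log: OpenShift passes its full gate list (GatewayAPI, NewOLM, PinnedImages, ...) to a component that only knows the upstream Kubernetes gates, and unknown names are skipped with a warning rather than rejected. The same pass warns when a still-set gate is deprecated (KMSv1, feature_gate.go:351) or already GA (CloudDualStackNodeIPs, ValidatingAdmissionPolicy, DisableKubeletCloudCredentialProviders, feature_gate.go:353). A hedged Go sketch of that tolerant merge, with an invented lifecycle table:

package main

import "fmt"

type lifecycle int

const (
	alpha lifecycle = iota
	ga
	deprecated
)

// known is this binary's gate registry; a real kubelet derives it from the
// compiled-in Kubernetes feature list. Contents here are illustrative.
var known = map[string]lifecycle{
	"KMSv1":                     deprecated,
	"CloudDualStackNodeIPs":     ga,
	"ValidatingAdmissionPolicy": ga,
	"DynamicResourceAllocation": alpha,
}

func merge(overrides map[string]bool) map[string]bool {
	resolved := map[string]bool{}
	for name, val := range overrides {
		lc, ok := known[name]
		if !ok {
			fmt.Printf("W unrecognized feature gate: %s\n", name) // tolerated, not fatal
			continue
		}
		switch lc {
		case ga:
			fmt.Printf("W Setting GA feature gate %s=%t. It will be removed in a future release.\n", name, val)
		case deprecated:
			fmt.Printf("W Setting deprecated feature gate %s=%t. It will be removed in a future release.\n", name, val)
		}
		resolved[name] = val
	}
	return resolved
}

func main() {
	fmt.Println(merge(map[string]bool{
		"KMSv1":                 true,
		"GatewayAPI":            true, // OpenShift-only name, hence "unrecognized"
		"CloudDualStackNodeIPs": true,
	}))
}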
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614933 4702 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614938 4702 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614944 4702 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614950 4702 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614958 4702 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614964 4702 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.614970 4702 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615454 4702 flags.go:64] FLAG: --address="0.0.0.0" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615474 4702 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615487 4702 flags.go:64] FLAG: --anonymous-auth="true" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615498 4702 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615510 4702 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615517 4702 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615527 4702 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615536 4702 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615543 4702 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615549 4702 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615556 4702 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615565 4702 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615571 4702 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615578 4702 flags.go:64] FLAG: --cgroup-root="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615584 4702 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615592 4702 flags.go:64] FLAG: --client-ca-file="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615599 4702 flags.go:64] FLAG: --cloud-config="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615605 4702 flags.go:64] FLAG: --cloud-provider="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615613 4702 flags.go:64] FLAG: --cluster-dns="[]" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615622 4702 flags.go:64] FLAG: --cluster-domain="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615629 4702 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615635 4702 flags.go:64] 
FLAG: --config-dir="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615642 4702 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615649 4702 flags.go:64] FLAG: --container-log-max-files="5" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615679 4702 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615686 4702 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615692 4702 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615699 4702 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615706 4702 flags.go:64] FLAG: --contention-profiling="false" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615713 4702 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615719 4702 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615726 4702 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615733 4702 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615742 4702 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615775 4702 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615782 4702 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615788 4702 flags.go:64] FLAG: --enable-load-reader="false" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615795 4702 flags.go:64] FLAG: --enable-server="true" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615801 4702 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615809 4702 flags.go:64] FLAG: --event-burst="100" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615833 4702 flags.go:64] FLAG: --event-qps="50" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615840 4702 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615846 4702 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615853 4702 flags.go:64] FLAG: --eviction-hard="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615863 4702 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615869 4702 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615876 4702 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615883 4702 flags.go:64] FLAG: --eviction-soft="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615890 4702 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615896 4702 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615904 4702 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 03 11:03:36 crc 
kubenswrapper[4702]: I1203 11:03:36.615911 4702 flags.go:64] FLAG: --experimental-mounter-path="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615919 4702 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615925 4702 flags.go:64] FLAG: --fail-swap-on="true" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615932 4702 flags.go:64] FLAG: --feature-gates="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615939 4702 flags.go:64] FLAG: --file-check-frequency="20s" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615946 4702 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615953 4702 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615959 4702 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615966 4702 flags.go:64] FLAG: --healthz-port="10248" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615973 4702 flags.go:64] FLAG: --help="false" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615979 4702 flags.go:64] FLAG: --hostname-override="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615985 4702 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615991 4702 flags.go:64] FLAG: --http-check-frequency="20s" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.615998 4702 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616004 4702 flags.go:64] FLAG: --image-credential-provider-config="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616010 4702 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616016 4702 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616022 4702 flags.go:64] FLAG: --image-service-endpoint="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616028 4702 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616034 4702 flags.go:64] FLAG: --kube-api-burst="100" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616040 4702 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616047 4702 flags.go:64] FLAG: --kube-api-qps="50" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616053 4702 flags.go:64] FLAG: --kube-reserved="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616061 4702 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616067 4702 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616074 4702 flags.go:64] FLAG: --kubelet-cgroups="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616080 4702 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616086 4702 flags.go:64] FLAG: --lock-file="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616095 4702 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616102 4702 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616108 4702 flags.go:64] 
FLAG: --log-json-info-buffer-size="0" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616118 4702 flags.go:64] FLAG: --log-json-split-stream="false" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616124 4702 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616131 4702 flags.go:64] FLAG: --log-text-split-stream="false" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616138 4702 flags.go:64] FLAG: --logging-format="text" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616145 4702 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616153 4702 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616159 4702 flags.go:64] FLAG: --manifest-url="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616166 4702 flags.go:64] FLAG: --manifest-url-header="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616175 4702 flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616182 4702 flags.go:64] FLAG: --max-open-files="1000000" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616189 4702 flags.go:64] FLAG: --max-pods="110" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616195 4702 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616202 4702 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616208 4702 flags.go:64] FLAG: --memory-manager-policy="None" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616214 4702 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616222 4702 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616228 4702 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616234 4702 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616249 4702 flags.go:64] FLAG: --node-status-max-images="50" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616255 4702 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616261 4702 flags.go:64] FLAG: --oom-score-adj="-999" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616267 4702 flags.go:64] FLAG: --pod-cidr="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616273 4702 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616284 4702 flags.go:64] FLAG: --pod-manifest-path="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616292 4702 flags.go:64] FLAG: --pod-max-pids="-1" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616298 4702 flags.go:64] FLAG: --pods-per-core="0" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616304 4702 flags.go:64] FLAG: --port="10250" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616310 4702 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 
11:03:36.616317 4702 flags.go:64] FLAG: --provider-id="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616323 4702 flags.go:64] FLAG: --qos-reserved="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616330 4702 flags.go:64] FLAG: --read-only-port="10255" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616337 4702 flags.go:64] FLAG: --register-node="true" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616343 4702 flags.go:64] FLAG: --register-schedulable="true" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616349 4702 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616360 4702 flags.go:64] FLAG: --registry-burst="10" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616366 4702 flags.go:64] FLAG: --registry-qps="5" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616373 4702 flags.go:64] FLAG: --reserved-cpus="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616380 4702 flags.go:64] FLAG: --reserved-memory="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616391 4702 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616398 4702 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616407 4702 flags.go:64] FLAG: --rotate-certificates="false" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616414 4702 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616420 4702 flags.go:64] FLAG: --runonce="false" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616426 4702 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616433 4702 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616439 4702 flags.go:64] FLAG: --seccomp-default="false" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616446 4702 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616452 4702 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616458 4702 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616466 4702 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616472 4702 flags.go:64] FLAG: --storage-driver-password="root" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616478 4702 flags.go:64] FLAG: --storage-driver-secure="false" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616484 4702 flags.go:64] FLAG: --storage-driver-table="stats" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616490 4702 flags.go:64] FLAG: --storage-driver-user="root" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616496 4702 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616503 4702 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616509 4702 flags.go:64] FLAG: --system-cgroups="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616515 4702 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 
11:03:36.616525 4702 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616532 4702 flags.go:64] FLAG: --tls-cert-file="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616538 4702 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616547 4702 flags.go:64] FLAG: --tls-min-version="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616553 4702 flags.go:64] FLAG: --tls-private-key-file="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616559 4702 flags.go:64] FLAG: --topology-manager-policy="none" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616566 4702 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616572 4702 flags.go:64] FLAG: --topology-manager-scope="container" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616579 4702 flags.go:64] FLAG: --v="2" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616588 4702 flags.go:64] FLAG: --version="false" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616597 4702 flags.go:64] FLAG: --vmodule="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616605 4702 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.616612 4702 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616847 4702 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616862 4702 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616869 4702 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616874 4702 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616880 4702 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616885 4702 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616891 4702 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616897 4702 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616903 4702 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616908 4702 feature_gate.go:330] unrecognized feature gate: Example Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616914 4702 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616921 4702 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
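Note: the FLAG: --name="value" inventory that ended just above (flags.go:64, 11:03:36.615454 through .616612) is the kubelet logging every registered flag after parsing, defaults included, which is why containerd paths appear even though --container-runtime-endpoint points at the CRI-O socket. The stdlib flag package supports the same inventory directly:

package main

import (
	"flag"
	"fmt"
)

func main() {
	// Two illustrative flags standing in for the kubelet's full set.
	flag.String("node-ip", "192.168.126.11", "node IP")
	flag.Int("max-pods", 110, "maximum pods per node")
	flag.Parse()

	// VisitAll walks every registered flag, set or not, in lexical order,
	// so the dump records the full effective configuration rather than
	// only the arguments that were actually passed.
	flag.VisitAll(func(f *flag.Flag) {
		fmt.Printf("FLAG: --%s=%q\n", f.Name, f.Value.String())
	})
}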
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616928 4702 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616934 4702 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616941 4702 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616946 4702 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616951 4702 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616958 4702 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616963 4702 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616968 4702 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616973 4702 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616978 4702 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616983 4702 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616990 4702 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.616995 4702 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617000 4702 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617006 4702 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617011 4702 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617017 4702 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617022 4702 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617028 4702 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617034 4702 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617039 4702 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617044 4702 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617050 4702 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617055 4702 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617061 4702 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617066 4702 feature_gate.go:330] unrecognized feature gate: 
MachineAPIMigration Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617071 4702 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617076 4702 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617082 4702 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617087 4702 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617095 4702 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617101 4702 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617107 4702 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617113 4702 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617118 4702 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617123 4702 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617128 4702 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617134 4702 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617141 4702 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617148 4702 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617155 4702 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617161 4702 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617167 4702 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617178 4702 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617184 4702 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617189 4702 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617195 4702 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617200 4702 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617205 4702 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617210 4702 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617216 4702 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617221 4702 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617226 4702 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617231 4702 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617236 4702 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617242 4702 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617248 4702 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617253 4702 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.617258 4702 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.617282 4702 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.629447 4702 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.629515 4702 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.629719 4702 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.629752 4702 feature_gate.go:330] unrecognized feature gate: 
OpenShiftPodSecurityAdmission Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.629793 4702 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.629804 4702 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.629816 4702 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.629830 4702 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.629842 4702 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.629855 4702 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.629866 4702 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.629877 4702 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.629889 4702 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.629902 4702 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.629912 4702 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.629963 4702 feature_gate.go:330] unrecognized feature gate: Example Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.629973 4702 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.629984 4702 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.629995 4702 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630006 4702 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630017 4702 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630027 4702 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630036 4702 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630048 4702 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630065 4702 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630079 4702 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630095 4702 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630108 4702 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630123 4702 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. 
It will be removed in a future release. Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630135 4702 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630146 4702 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630156 4702 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630165 4702 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630174 4702 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630182 4702 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630194 4702 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630202 4702 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630212 4702 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630220 4702 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630228 4702 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630237 4702 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630245 4702 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630254 4702 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630262 4702 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630270 4702 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630279 4702 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630290 4702 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630300 4702 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630308 4702 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630317 4702 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630325 4702 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630334 4702 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630342 4702 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630350 4702 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630358 4702 feature_gate.go:330] unrecognized 
feature gate: VSphereMultiVCenters Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630366 4702 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630374 4702 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630381 4702 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630389 4702 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630397 4702 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630408 4702 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630416 4702 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630423 4702 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630431 4702 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630439 4702 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630447 4702 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630455 4702 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630464 4702 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630472 4702 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630479 4702 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630488 4702 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630495 4702 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630504 4702 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.630521 4702 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630876 4702 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630894 4702 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630905 4702 feature_gate.go:353] Setting GA feature gate 
DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630917 4702 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630927 4702 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630936 4702 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630945 4702 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630954 4702 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630963 4702 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630973 4702 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630983 4702 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.630993 4702 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631004 4702 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631012 4702 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631021 4702 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631029 4702 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631038 4702 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631046 4702 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631054 4702 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631061 4702 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631069 4702 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631077 4702 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631087 4702 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631095 4702 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631103 4702 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631112 4702 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631120 4702 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631128 4702 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 
03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631136 4702 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631143 4702 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631151 4702 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631159 4702 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631167 4702 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631175 4702 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631183 4702 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631213 4702 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631224 4702 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631234 4702 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631242 4702 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631250 4702 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631258 4702 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631265 4702 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631274 4702 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631282 4702 feature_gate.go:330] unrecognized feature gate: Example Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631290 4702 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631298 4702 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631308 4702 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
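Note: each warning pass ends in the same resolved summary, feature_gate.go:386 "feature gates: {map[...]}", printed at 11:03:36.617282 and .630521 above and again at .631527 just below; the same override set is evidently applied more than once during startup, so the warnings and the final map repeat verbatim. The map keys come out alphabetized because Go's fmt prints maps in sorted key order. A sketch of an equivalent stable rendering:

package main

import (
	"fmt"
	"sort"
	"strings"
)

// render mimics the "{map[Name:bool ...]}" form seen in the log.
func render(gates map[string]bool) string {
	names := make([]string, 0, len(gates))
	for n := range gates {
		names = append(names, n)
	}
	sort.Strings(names) // deterministic output regardless of map iteration order
	parts := make([]string, len(names))
	for i, n := range names {
		parts[i] = fmt.Sprintf("%s:%t", n, gates[n])
	}
	return "{map[" + strings.Join(parts, " ") + "]}"
}

func main() {
	fmt.Println(render(map[string]bool{
		"NodeSwap":                  false,
		"KMSv1":                     true,
		"ValidatingAdmissionPolicy": true,
	}))
}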
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631317 4702 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631325 4702 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631335 4702 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631344 4702 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631354 4702 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631363 4702 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631372 4702 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631381 4702 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631389 4702 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631398 4702 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631406 4702 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631415 4702 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631423 4702 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631430 4702 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631438 4702 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631446 4702 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631454 4702 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631462 4702 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631469 4702 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631480 4702 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631488 4702 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631495 4702 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631504 4702 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.631512 4702 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.631527 4702 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.632295 4702 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.637585 4702 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.637814 4702 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.638925 4702 server.go:997] "Starting client certificate rotation"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.638995 4702 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.639299 4702 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-12 19:24:05.711900144 +0000 UTC
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.639430 4702 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.654389 4702 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 03 11:03:36 crc kubenswrapper[4702]: E1203 11:03:36.658102 4702 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.658306 4702 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.675685 4702 log.go:25] "Validated CRI v1 runtime API"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.705071 4702 log.go:25] "Validated CRI v1 image API"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.707831 4702 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.711390 4702 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-03-10-59-19-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.711438 4702 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.732903 4702 manager.go:217] Machine: {Timestamp:2025-12-03 11:03:36.73099403 +0000 UTC m=+0.566922544 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:6a3f38b6-c08e-4968-a85f-e1166e8e8498 BootID:d83e9c9f-fe89-474f-892c-403dd3951eb1 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:76:1f:b5 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:76:1f:b5 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:9e:85:91 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:fa:8b:20 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:9f:b8:51 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:cc:3d:e3 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:6a:8e:c1:de:76:2e Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:de:09:c2:57:b1:78 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.733328 4702 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
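[Editor's note] The run of "unrecognized feature gate" warnings above is expected on OpenShift: the rendered kubelet configuration carries the cluster's operator-level gates (AdminNetworkPolicy, GCPLabelsTags, and so on), while the kubelet only knows the upstream Kubernetes set, so unknown names are warned about and skipped and the known ones land in the effective map logged by feature_gate.go:386. A minimal sketch of that parse-and-warn pattern, assuming gates arrive as "Name=bool" pairs; this is illustrative only, not the real k8s.io/component-base/featuregate implementation:

```go
// featuregates.go - a toy version of kubelet-style feature-gate parsing.
// "known" stands in for the upstream defaults table; parseGates and its
// behavior are assumptions for illustration.
package main

import (
	"fmt"
	"strconv"
	"strings"
)

func parseGates(spec string, known map[string]bool) map[string]bool {
	effective := make(map[string]bool, len(known))
	for k, v := range known {
		effective[k] = v // start from the compiled-in defaults
	}
	for _, pair := range strings.Split(spec, ",") {
		name, val, ok := strings.Cut(strings.TrimSpace(pair), "=")
		if !ok {
			continue
		}
		if _, found := known[name]; !found {
			fmt.Printf("W: unrecognized feature gate: %s\n", name) // warn and skip
			continue
		}
		b, err := strconv.ParseBool(val)
		if err != nil {
			fmt.Printf("W: invalid value for %s: %q\n", name, val)
			continue
		}
		effective[name] = b
	}
	return effective
}

func main() {
	known := map[string]bool{"NodeSwap": false, "KMSv1": true} // stand-in defaults
	fmt.Println(parseGates("NodeSwap=false,AdminNetworkPolicy=true", known))
}
```

Running this prints a warning for AdminNetworkPolicy and an effective map for the rest, which is exactly the shape of the log above: two dozen warnings followed by one "feature gates: {map[...]}" line.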
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.733550 4702 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.734346 4702 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.734614 4702 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.734688 4702 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.735069 4702 topology_manager.go:138] "Creating topology manager with none policy"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.735084 4702 container_manager_linux.go:303] "Creating device plugin manager"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.735379 4702 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.735458 4702 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.735796 4702 state_mem.go:36] "Initialized new in-memory state store"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.735977 4702 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.737749 4702 kubelet.go:418] "Attempting to sync node with API server"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.737799 4702 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
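[Editor's note] The nodeConfig dump above is where this node's eviction policy lives: hard thresholds of 100Mi for memory.available, 10% for nodefs.available, 15% for imagefs.available, and 5% inode headroom on both filesystems. (The swap_util line just before it shows a /proc/swaps dump containing only the header row, consistent with the SwapCapacity:0 in the Machine info.) A toy evaluator of such thresholds follows; the struct shape and numbers mirror the log, but the logic is a sketch under those assumptions, not kubelet's eviction manager, which also handles grace periods, reclaim, and signal observation:

```go
// evictioncheck.go - check "available" signals against hard-eviction
// thresholds like the ones in the nodeConfig above. Observed values in main
// are hypothetical, chosen only to exercise both outcomes.
package main

import "fmt"

type Threshold struct {
	Signal     string
	Quantity   int64   // absolute bytes, 0 if unused
	Percentage float64 // fraction of capacity, 0 if unused
}

// exceeded reports whether available has fallen below the threshold,
// resolving a percentage threshold against the given capacity.
func (t Threshold) exceeded(available, capacity int64) bool {
	limit := t.Quantity
	if t.Percentage > 0 {
		limit = int64(t.Percentage * float64(capacity))
	}
	return available < limit
}

func main() {
	thresholds := []Threshold{
		{Signal: "memory.available", Quantity: 100 * 1024 * 1024}, // 100Mi
		{Signal: "nodefs.available", Percentage: 0.1},
		{Signal: "imagefs.available", Percentage: 0.15},
	}
	// Hypothetical observations: {available, capacity} in bytes; the
	// capacities echo this node's 33654128640 B RAM and 85292941312 B /var.
	obs := map[string][2]int64{
		"memory.available":  {200 * 1024 * 1024, 33654128640},
		"nodefs.available":  {30_000_000_000, 85292941312},
		"imagefs.available": {30_000_000_000, 85292941312},
	}
	for _, t := range thresholds {
		o := obs[t.Signal]
		fmt.Printf("%s exceeded=%v\n", t.Signal, t.exceeded(o[0], o[1]))
	}
}
```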
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.737853 4702 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.737870 4702 kubelet.go:324] "Adding apiserver pod source"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.737905 4702 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.740480 4702 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.741004 4702 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Dec 03 11:03:36 crc kubenswrapper[4702]: E1203 11:03:36.741204 4702 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError"
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.741092 4702 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Dec 03 11:03:36 crc kubenswrapper[4702]: E1203 11:03:36.741482 4702 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.746272 4702 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
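[Editor's note] Every warning and error in this stretch fails the same way: dial tcp 38.102.83.176:6443: connect: connection refused. On a single-node cluster the kubelet comes up before the kube-apiserver static pod it is about to launch, so the client-go reflectors keep listing and failing until the endpoint opens, after which they recover on their own. A standalone probe of the same endpoint with a capped, roughly exponential backoff; the address comes from the log, the backoff constants are illustrative, and this is a diagnostic sketch, not kubelet code:

```go
// apiprobe.go - poll a TCP endpoint until it accepts connections.
// The 200ms starting delay echoes the lease controller's retry interval
// seen later in this log; the 30s cap is an arbitrary choice.
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	addr := "api-int.crc.testing:6443"
	delay := 200 * time.Millisecond
	for {
		conn, err := net.DialTimeout("tcp", addr, 2*time.Second)
		if err == nil {
			conn.Close()
			fmt.Println("apiserver is accepting connections")
			return
		}
		fmt.Printf("still down (%v), retrying in %v\n", err, delay)
		time.Sleep(delay)
		if delay < 30*time.Second { // cap the backoff
			delay *= 2
		}
	}
}
```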
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.747509 4702 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.748530 4702 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.748586 4702 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.748599 4702 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.748611 4702 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.748629 4702 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.748641 4702 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.748663 4702 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.748693 4702 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.748714 4702 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.748726 4702 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.748742 4702 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.748751 4702 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.748813 4702 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.750451 4702 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.751716 4702 server.go:1280] "Started kubelet"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.752025 4702 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.752032 4702 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.753181 4702 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.753624 4702 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.753664 4702 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.753722 4702 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 15:13:51.134426034 +0000 UTC
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.754032 4702 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.754056 4702 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 03 11:03:36 crc kubenswrapper[4702]: E1203 11:03:36.754061 4702 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.754175 4702 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 03 11:03:36 crc systemd[1]: Started Kubernetes Kubelet.
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.755138 4702 server.go:460] "Adding debug handlers to kubelet server"
Dec 03 11:03:36 crc kubenswrapper[4702]: E1203 11:03:36.755219 4702 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="200ms"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.755505 4702 factory.go:55] Registering systemd factory
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.755542 4702 factory.go:221] Registration of the systemd container factory successfully
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.756081 4702 factory.go:153] Registering CRI-O factory
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.756103 4702 factory.go:221] Registration of the crio container factory successfully
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.756193 4702 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.756245 4702 factory.go:103] Registering Raw factory
Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.756142 4702 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Dec 03 11:03:36 crc kubenswrapper[4702]: E1203 11:03:36.756329 4702 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.756277 4702 manager.go:1196] Started watching for new ooms in manager
Dec 03 11:03:36 crc kubenswrapper[4702]: E1203 11:03:36.754324 4702 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.176:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187dafb4075d5e5f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 11:03:36.751636063 +0000 UTC m=+0.587564547,LastTimestamp:2025-12-03 11:03:36.751636063 +0000 UTC m=+0.587564547,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.757386 4702 manager.go:319] Starting recovery of all containers
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.841546 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.841661 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.841684 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.841697 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.841708 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.841720 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.841791 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.841869 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.841931 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.841952 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.841963 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.841975 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.842036 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.842150 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.842203 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.842244 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.842273 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.842311 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.842336 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.842377 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.842411 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.844922 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845244 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845261 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845299 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845314 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845333 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845350 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845361 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845371 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845383 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845395 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845406 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845433 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845452 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845480 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845497 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845509 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845521 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845531 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845545 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845556 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845568 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845580 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845591 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845604 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.845617 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846312 4702 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846345 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846365 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846397 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846418 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846434 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846461 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846477 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846493 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846511 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846529 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846545 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846559 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846582 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846607 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846624 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846637 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846653 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846670 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846685 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846700 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846716 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846731 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846749 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846781 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846806 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846825 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846842 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846859 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846881 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846901 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846920 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846941 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846961 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846982 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.846999 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847020 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847038 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847054 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847072 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847090 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847108 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847125 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847141 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847159 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847174 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847188 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847209 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847225 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847240 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847261 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847280 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847298 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847317 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847339 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847358 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847376 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847394 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847420 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847438 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847453 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847468 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847484 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847501 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847517 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847533 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847547 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847561 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847575 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847587 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847599 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847613 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847627 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847641 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847656 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847672 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847687 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847700 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847714 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847729 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847743 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847776 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847789 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847803 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847817 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847831 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847843 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847857 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847871 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847883 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847899 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847917 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847935 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847953 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847971 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.847994 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848009 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848023 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848037 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848054 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848068 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848087 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848099 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848112 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848125 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848141 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848153 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848167 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848180 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848194 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848206 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848261 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848273 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848287 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848301 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848314 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848329 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848343 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848358 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848374 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848389 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848402 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848415 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848429 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848442 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848454 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848466 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848482 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848496 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848508 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848520 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848532 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848544 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848558 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848574 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848589 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848607 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848623 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848644 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848659 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848676 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848716 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848731 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848745 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848821 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848836 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848849 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848862 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848875 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848889 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848902 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848916 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848939 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848955 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848970 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.848989 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.849006 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.849022 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.849039 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.849061 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.849076 4702 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.849096 4702 reconstruct.go:97] "Volume reconstruction finished" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.849107 4702 reconciler.go:26] "Reconciler: start to sync state" Dec 03 11:03:36 crc 
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.850163 4702 manager.go:324] Recovery completed
Dec 03 11:03:36 crc kubenswrapper[4702]: E1203 11:03:36.854546 4702 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.860495 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.862642 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.862685 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.862749 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.865140 4702 cpu_manager.go:225] "Starting CPU manager" policy="none"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.865191 4702 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.865222 4702 state_mem.go:36] "Initialized new in-memory state store"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.873825 4702 policy_none.go:49] "None policy: Start"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.875152 4702 memory_manager.go:170] "Starting memorymanager" policy="None"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.875182 4702 state_mem.go:35] "Initializing new in-memory state store"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.923735 4702 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.926496 4702 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
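[Editor's note] Each kubenswrapper message carries the standard klog header: a severity letter (I/W/E/F), the MMDD date, the wall-clock time, the PID, and the source file:line, followed by "] " and the structured message. A small sketch that decodes that header is handy for filtering a startup log down to warnings and errors; the sample line is copied from above, and the exact regular expression is an assumption about how strictly the header should be matched.

package main

import (
	"fmt"
	"regexp"
)

// klog header: severity, MMDD, wall time, PID, source file:line, then "]".
var klogHdr = regexp.MustCompile(`\b([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+)\s+([\w.]+:\d+)\]`)

func main() {
	line := `Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.865140 4702 cpu_manager.go:225] "Starting CPU manager" policy="none"`
	if m := klogHdr.FindStringSubmatch(line); m != nil {
		fmt.Printf("severity=%s date=%s time=%s pid=%s at=%s\n", m[1], m[2], m[3], m[4], m[5])
	}
}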
protocol="IPv6" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.926750 4702 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.926833 4702 kubelet.go:2335] "Starting kubelet main sync loop" Dec 03 11:03:36 crc kubenswrapper[4702]: E1203 11:03:36.926922 4702 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 03 11:03:36 crc kubenswrapper[4702]: W1203 11:03:36.927497 4702 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused Dec 03 11:03:36 crc kubenswrapper[4702]: E1203 11:03:36.927575 4702 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Dec 03 11:03:36 crc kubenswrapper[4702]: E1203 11:03:36.955467 4702 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 03 11:03:36 crc kubenswrapper[4702]: E1203 11:03:36.956207 4702 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="400ms" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.961093 4702 manager.go:334] "Starting Device Plugin manager" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.961159 4702 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.961171 4702 server.go:79] "Starting device plugin registration server" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.961733 4702 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.961785 4702 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.962026 4702 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.962139 4702 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 03 11:03:36 crc kubenswrapper[4702]: I1203 11:03:36.962149 4702 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 03 11:03:36 crc kubenswrapper[4702]: E1203 11:03:36.969292 4702 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.027293 4702 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.027513 4702 
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.027513 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.029441 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.029518 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.029533 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.029823 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.030130 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.030200 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.031279 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.031338 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.031357 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.031428 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.031450 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.031459 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.031658 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.031821 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.031864 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.032792 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.032847 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.032860 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.032916 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.032967 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.032982 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.033261 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.033316 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.033346 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.034504 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.034525 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.034536 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.034569 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.034598 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.034609 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.034786 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.035029 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.035085 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.036264 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.036304 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.036319 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.036393 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.036424 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.036437 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.036725 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.036788 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.037843 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.037878 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.037910 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.062079 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.063372 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.063427 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.063439 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.063471 4702 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: E1203 11:03:37.064171 4702 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.176:6443: connect: connection refused" node="crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.152543 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.153031 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.153152 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.153260 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.153364 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.153470 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.153566 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.153664 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.153790 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.153896 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.153991 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.154092 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.154202 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.154294 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.154383 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.255938 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.256005 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.256040 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.256061 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.256089 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.256114 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.256142 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.256164 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.256187 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.256209 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.256230 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.256252 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.256272 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.256294 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.256317 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
11:03:37.256317 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.256861 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.256946 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.256971 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.256971 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.256997 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.257020 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.257042 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.257050 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.257099 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.257121 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.257126 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.257144 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.257169 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.257185 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.257192 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.264341 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.266125 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.266200 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.266234 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.266280 4702 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: E1203 11:03:37.267029 4702 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.176:6443: connect: connection refused" node="crc"
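
[Editor's note] The registration failure above reports "dial tcp 38.102.83.176:6443: connect: connection refused": the kubelet cannot reach api-int.crc.testing:6443 because the apiserver static pod is still starting. A minimal, hypothetical Go triage sketch in that spirit (the address is taken from the log; the probe itself is an illustration, not kubelet code):

package main

import (
	"fmt"
	"net"
	"time"
)

// Dial the kube-apiserver endpoint seen in the log and report whether a
// plain TCP connect succeeds. While the apiserver container is still
// coming up, this fails with exactly the "connection refused" above.
func main() {
	addr := "api-int.crc.testing:6443" // endpoint from the log
	for i := 0; i < 5; i++ {
		conn, err := net.DialTimeout("tcp", addr, 2*time.Second)
		if err != nil {
			fmt.Printf("attempt %d: %v\n", i+1, err)
			time.Sleep(time.Second)
			continue
		}
		conn.Close()
		fmt.Printf("attempt %d: TCP connect to %s succeeded\n", i+1, addr)
		return
	}
}
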
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.355401 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: E1203 11:03:37.357663 4702 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="800ms"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.363835 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: W1203 11:03:37.383096 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-53822f52e22530504075b35d41bdc345a153f00609e465d191619c00a9712547 WatchSource:0}: Error finding container 53822f52e22530504075b35d41bdc345a153f00609e465d191619c00a9712547: Status 404 returned error can't find the container with id 53822f52e22530504075b35d41bdc345a153f00609e465d191619c00a9712547
Dec 03 11:03:37 crc kubenswrapper[4702]: W1203 11:03:37.385681 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-3d10bb65973c67c7acc95c77da97da5eb59b3680a7a73fe7f503ea52d2ace19b WatchSource:0}: Error finding container 3d10bb65973c67c7acc95c77da97da5eb59b3680a7a73fe7f503ea52d2ace19b: Status 404 returned error can't find the container with id 3d10bb65973c67c7acc95c77da97da5eb59b3680a7a73fe7f503ea52d2ace19b
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.390380 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.409148 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.412740 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: W1203 11:03:37.414640 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-83a8c38c684ebd8e850fa47e96671e8f0afb083eef796db369522800e019ffdc WatchSource:0}: Error finding container 83a8c38c684ebd8e850fa47e96671e8f0afb083eef796db369522800e019ffdc: Status 404 returned error can't find the container with id 83a8c38c684ebd8e850fa47e96671e8f0afb083eef796db369522800e019ffdc
Dec 03 11:03:37 crc kubenswrapper[4702]: W1203 11:03:37.443281 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-f4a3d1786b8600b800121eedd1512e1dd44d28dae0027c4a6810c7960785f644 WatchSource:0}: Error finding container f4a3d1786b8600b800121eedd1512e1dd44d28dae0027c4a6810c7960785f644: Status 404 returned error can't find the container with id f4a3d1786b8600b800121eedd1512e1dd44d28dae0027c4a6810c7960785f644
Dec 03 11:03:37 crc kubenswrapper[4702]: W1203 11:03:37.444372 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-38ada1396951b325a1bc93752b1e36a9db622dd10945b76a20d25a82972b2f32 WatchSource:0}: Error finding container 38ada1396951b325a1bc93752b1e36a9db622dd10945b76a20d25a82972b2f32: Status 404 returned error can't find the container with id 38ada1396951b325a1bc93752b1e36a9db622dd10945b76a20d25a82972b2f32
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.667791 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.669175 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.669206 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.669218 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.669245 4702 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: E1203 11:03:37.669565 4702 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.176:6443: connect: connection refused" node="crc"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.752128 4702 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.754565 4702 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 05:58:25.223273007 +0000 UTC
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.754616 4702 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 498h54m47.468660392s for next certificate rotation
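
[Editor's note] The certificate_manager lines report the serving certificate's expiry, a rotation deadline well before it, and the computed wait. An illustrative Go sketch of that arithmetic; the 70-90% jitter window and the issue time are assumptions for illustration, the expiry is the one in the log:

package main

import (
	"fmt"
	"math/rand"
	"time"
)

// Pick a rotation deadline at a random point late in the certificate's
// validity window, then report how long to wait. The exact jitter policy
// lives in client-go's certificate manager; [0.7, 0.9) is an assumption.
func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	lifetime := notAfter.Sub(notBefore)
	k := 0.7 + 0.2*rand.Float64()
	return notBefore.Add(time.Duration(float64(lifetime) * k))
}

func main() {
	notBefore := time.Date(2025, 2, 24, 5, 53, 3, 0, time.UTC) // assumed issue time
	notAfter := time.Date(2026, 2, 24, 5, 53, 3, 0, time.UTC)  // expiry from the log
	d := rotationDeadline(notBefore, notAfter)
	fmt.Printf("rotation deadline %s, waiting %s\n", d, time.Until(d))
}
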
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.937435 4702 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="67a56fca76d1fc1f95f670e902fbbc9dc3aecc868336e065061e1955b3b364b8" exitCode=0
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.937533 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"67a56fca76d1fc1f95f670e902fbbc9dc3aecc868336e065061e1955b3b364b8"}
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.937672 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"53822f52e22530504075b35d41bdc345a153f00609e465d191619c00a9712547"}
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.937788 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.938935 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.938964 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.938972 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.939477 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132"}
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.939579 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"38ada1396951b325a1bc93752b1e36a9db622dd10945b76a20d25a82972b2f32"}
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.941742 4702 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="814086c27751498e48edfdacb417f2b8364bc6918d6b9f1f408dccb031246fbc" exitCode=0
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.941797 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"814086c27751498e48edfdacb417f2b8364bc6918d6b9f1f408dccb031246fbc"}
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.941856 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f4a3d1786b8600b800121eedd1512e1dd44d28dae0027c4a6810c7960785f644"}
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.941974 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.943107 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.943143 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.943157 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.943906 4702 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0" exitCode=0
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.943970 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0"}
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.943991 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"83a8c38c684ebd8e850fa47e96671e8f0afb083eef796db369522800e019ffdc"}
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.944067 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.945039 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.945098 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.945109 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.946427 4702 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b" exitCode=0
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.946471 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b"}
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.946496 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3d10bb65973c67c7acc95c77da97da5eb59b3680a7a73fe7f503ea52d2ace19b"}
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.946596 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.947330 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.947353 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.947364 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.949101 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.949710 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.949735 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:37 crc kubenswrapper[4702]: I1203 11:03:37.949744 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:37 crc kubenswrapper[4702]: W1203 11:03:37.956114 4702 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Dec 03 11:03:37 crc kubenswrapper[4702]: E1203 11:03:37.956269 4702 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError"
Dec 03 11:03:38 crc kubenswrapper[4702]: E1203 11:03:38.158595 4702 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="1.6s"
Dec 03 11:03:38 crc kubenswrapper[4702]: W1203 11:03:38.206351 4702 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Dec 03 11:03:38 crc kubenswrapper[4702]: E1203 11:03:38.206533 4702 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError"
Dec 03 11:03:38 crc kubenswrapper[4702]: W1203 11:03:38.351326 4702 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Dec 03 11:03:38 crc kubenswrapper[4702]: E1203 11:03:38.351416 4702 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError"
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.470304 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.472365 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.472399 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
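
[Editor's note] The reflector.go warnings above come from the kubelet's shared informers repeatedly trying to list CSIDriver, Service, and Node objects while the apiserver is down; once it answers, the later "Caches populated" lines appear. A minimal client-go sketch of that list/watch machinery, assuming a reachable kubeconfig (the path here is an assumption for illustration):

package main

import (
	"flag"
	"fmt"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

// Start a shared informer for storage.k8s.io/v1 CSIDriver objects. While
// the apiserver is unreachable, the reflector behind it logs the same
// "failed to list *v1.CSIDriver" errors seen in the journal and retries
// with backoff until the cache syncs.
func main() {
	kubeconfig := flag.String("kubeconfig", "/var/lib/kubelet/kubeconfig", "path to kubeconfig (assumed)")
	flag.Parse()

	cfg, err := clientcmd.BuildConfigFromFlags("", *kubeconfig)
	if err != nil {
		panic(err)
	}
	clientset, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	factory := informers.NewSharedInformerFactory(clientset, 0)
	inf := factory.Storage().V1().CSIDrivers().Informer()

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop)
	if !cache.WaitForCacheSync(stop, inf.HasSynced) {
		fmt.Println("cache never synced (apiserver still unreachable?)")
		return
	}
	fmt.Println("caches populated for *v1.CSIDriver")
}
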
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.472411 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.472457 4702 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 03 11:03:38 crc kubenswrapper[4702]: E1203 11:03:38.473156 4702 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.176:6443: connect: connection refused" node="crc"
Dec 03 11:03:38 crc kubenswrapper[4702]: W1203 11:03:38.517350 4702 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Dec 03 11:03:38 crc kubenswrapper[4702]: E1203 11:03:38.517602 4702 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError"
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.752101 4702 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.759179 4702 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Dec 03 11:03:38 crc kubenswrapper[4702]: E1203 11:03:38.761186 4702 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError"
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.951338 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f"}
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.951427 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8"}
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.951450 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985"}
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.951430 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.952940 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.952973 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.952983 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.954724 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1109d8de27f2e277de0c5839d1ae7ff6bac66b7ab32cf2829ec620298d64782a"}
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.954776 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2491115c5325a34ad633eaad2ed3fb728c89279034c4955853558c6c249a6708"}
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.954792 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"269996cd0ce3b156195dd9d171328df7e7c02406316482eb33a4ba3fc9d79c15"}
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.954924 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.957637 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.957688 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.957701 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.960332 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4"}
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.960512 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624"}
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.960542 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d"}
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.964149 4702 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004" exitCode=0
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.964216 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004"}
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.964348 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.965848 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.965901 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.965912 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.969273 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"887a43dbd5c582374f567dc236512d6aa06d661da270a15750867caaef3f2636"}
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.969387 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.970485 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.970512 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:38 crc kubenswrapper[4702]: I1203 11:03:38.970520 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:39 crc kubenswrapper[4702]: E1203 11:03:39.585946 4702 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.176:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187dafb4075d5e5f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 11:03:36.751636063 +0000 UTC m=+0.587564547,LastTimestamp:2025-12-03 11:03:36.751636063 +0000 UTC m=+0.587564547,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.752001 4702 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
Dec 03 11:03:39 crc kubenswrapper[4702]: E1203 11:03:39.760167 4702 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="3.2s"
Dec 03 11:03:39 crc kubenswrapper[4702]: W1203 11:03:39.834136 4702 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.176:6443: connect: connection refused
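
[Editor's note] The "Failed to ensure lease exists, will retry" entries show the retry interval doubling: 800ms, then 1.6s, then 3.2s above (and 6.4s further down). A capped doubling backoff reproduces that sequence; the starting value is read off the log, and the cap here is an assumption chosen just above the last observed value:

package main

import (
	"fmt"
	"time"
)

// Print the retry schedule implied by the lease-controller log lines:
// each failure doubles the interval until a cap is reached.
func main() {
	interval := 800 * time.Millisecond // first interval seen in the log
	const maxInterval = 7 * time.Second // assumed cap for illustration
	for i := 0; i < 5; i++ {
		fmt.Printf("retry %d after %s\n", i+1, interval) // 800ms, 1.6s, 3.2s, 6.4s, ...
		if next := interval * 2; next <= maxInterval {
			interval = next
		}
	}
}
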
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.176:6443: connect: connection refused" logger="UnhandledError" Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.850523 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.973916 4702 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18" exitCode=0 Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.973986 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18"} Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.974107 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.975169 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.975205 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.975217 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.979963 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e"} Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.980015 4702 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.980047 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.980047 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.980130 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.980179 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.980018 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e"} Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.981592 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.981615 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 
11:03:39.981654 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.981670 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.981693 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.981711 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.981720 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.981627 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.982495 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.984321 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.984409 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:03:39 crc kubenswrapper[4702]: I1203 11:03:39.984438 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:03:40 crc kubenswrapper[4702]: I1203 11:03:40.074191 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:03:40 crc kubenswrapper[4702]: I1203 11:03:40.075448 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:03:40 crc kubenswrapper[4702]: I1203 11:03:40.075486 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:03:40 crc kubenswrapper[4702]: I1203 11:03:40.075496 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:03:40 crc kubenswrapper[4702]: I1203 11:03:40.075521 4702 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 11:03:40 crc kubenswrapper[4702]: E1203 11:03:40.075902 4702 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.176:6443: connect: connection refused" node="crc" Dec 03 11:03:40 crc kubenswrapper[4702]: I1203 11:03:40.480451 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 11:03:40 crc kubenswrapper[4702]: I1203 11:03:40.988706 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9"} Dec 03 11:03:40 crc kubenswrapper[4702]: I1203 11:03:40.988745 4702 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 11:03:40 crc kubenswrapper[4702]: I1203 11:03:40.988795 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b"} Dec 03 11:03:40 crc kubenswrapper[4702]: I1203 11:03:40.988815 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:03:40 crc kubenswrapper[4702]: I1203 11:03:40.988817 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978"} Dec 03 11:03:40 crc kubenswrapper[4702]: I1203 11:03:40.988833 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25"} Dec 03 11:03:40 crc kubenswrapper[4702]: I1203 11:03:40.988901 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:03:40 crc kubenswrapper[4702]: I1203 11:03:40.990020 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:03:40 crc kubenswrapper[4702]: I1203 11:03:40.990065 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:03:40 crc kubenswrapper[4702]: I1203 11:03:40.990083 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:03:40 crc kubenswrapper[4702]: I1203 11:03:40.990202 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:03:40 crc kubenswrapper[4702]: I1203 11:03:40.990233 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:03:40 crc kubenswrapper[4702]: I1203 11:03:40.990246 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:03:41 crc kubenswrapper[4702]: I1203 11:03:41.998190 4702 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 11:03:41 crc kubenswrapper[4702]: I1203 11:03:41.998271 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:03:41 crc kubenswrapper[4702]: I1203 11:03:41.998293 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:03:41 crc kubenswrapper[4702]: I1203 11:03:41.998187 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4"} Dec 03 11:03:41 crc kubenswrapper[4702]: I1203 11:03:41.999648 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:03:41 crc kubenswrapper[4702]: I1203 11:03:41.999695 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:03:41 crc kubenswrapper[4702]: I1203 11:03:41.999711 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:03:41 crc kubenswrapper[4702]: I1203 11:03:41.999741 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 11:03:41 crc kubenswrapper[4702]: I1203 11:03:41.999829 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:03:41 crc kubenswrapper[4702]: I1203 11:03:41.999854 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:03:42 crc kubenswrapper[4702]: I1203 11:03:42.049784 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 11:03:42 crc kubenswrapper[4702]: I1203 11:03:42.049986 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:03:42 crc kubenswrapper[4702]: I1203 11:03:42.051423 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:03:42 crc kubenswrapper[4702]: I1203 11:03:42.051499 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:03:42 crc kubenswrapper[4702]: I1203 11:03:42.051530 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:03:42 crc kubenswrapper[4702]: I1203 11:03:42.057787 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 11:03:42 crc kubenswrapper[4702]: I1203 11:03:42.787106 4702 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Dec 03 11:03:43 crc kubenswrapper[4702]: I1203 11:03:43.000856 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:03:43 crc kubenswrapper[4702]: I1203 11:03:43.000936 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:03:43 crc kubenswrapper[4702]: I1203 11:03:43.002386 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:03:43 crc kubenswrapper[4702]: I1203 11:03:43.002419 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:03:43 crc kubenswrapper[4702]: I1203 11:03:43.002458 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:03:43 crc kubenswrapper[4702]: I1203 11:03:43.002485 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:03:43 crc kubenswrapper[4702]: I1203 11:03:43.002460 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:03:43 crc kubenswrapper[4702]: I1203 11:03:43.002677 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:03:43 crc kubenswrapper[4702]: I1203 11:03:43.277045 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:03:43 crc kubenswrapper[4702]: I1203 11:03:43.278596 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:03:43 crc kubenswrapper[4702]: I1203 11:03:43.278668 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:03:43 crc kubenswrapper[4702]: I1203 
11:03:43.278685 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:03:43 crc kubenswrapper[4702]: I1203 11:03:43.278720 4702 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 11:03:43 crc kubenswrapper[4702]: I1203 11:03:43.633675 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 03 11:03:43 crc kubenswrapper[4702]: I1203 11:03:43.837045 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 11:03:43 crc kubenswrapper[4702]: I1203 11:03:43.837287 4702 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 11:03:43 crc kubenswrapper[4702]: I1203 11:03:43.837349 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:03:43 crc kubenswrapper[4702]: I1203 11:03:43.839382 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:03:43 crc kubenswrapper[4702]: I1203 11:03:43.839438 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:03:43 crc kubenswrapper[4702]: I1203 11:03:43.839456 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:03:44 crc kubenswrapper[4702]: I1203 11:03:44.003533 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:03:44 crc kubenswrapper[4702]: I1203 11:03:44.005322 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:03:44 crc kubenswrapper[4702]: I1203 11:03:44.005388 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:03:44 crc kubenswrapper[4702]: I1203 11:03:44.005408 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:03:44 crc kubenswrapper[4702]: I1203 11:03:44.792652 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 11:03:44 crc kubenswrapper[4702]: I1203 11:03:44.792918 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:03:44 crc kubenswrapper[4702]: I1203 11:03:44.794395 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:03:44 crc kubenswrapper[4702]: I1203 11:03:44.794450 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:03:44 crc kubenswrapper[4702]: I1203 11:03:44.794469 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:03:45 crc kubenswrapper[4702]: I1203 11:03:45.647725 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 11:03:45 crc kubenswrapper[4702]: I1203 11:03:45.648060 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:03:45 crc kubenswrapper[4702]: I1203 11:03:45.650036 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:03:45 crc 
kubenswrapper[4702]: I1203 11:03:45.650127 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:03:45 crc kubenswrapper[4702]: I1203 11:03:45.650146 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:03:45 crc kubenswrapper[4702]: I1203 11:03:45.856620 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 11:03:45 crc kubenswrapper[4702]: I1203 11:03:45.857017 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:03:45 crc kubenswrapper[4702]: I1203 11:03:45.858941 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:03:45 crc kubenswrapper[4702]: I1203 11:03:45.859007 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:03:45 crc kubenswrapper[4702]: I1203 11:03:45.859022 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:03:46 crc kubenswrapper[4702]: I1203 11:03:46.088271 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 03 11:03:46 crc kubenswrapper[4702]: I1203 11:03:46.088569 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:03:46 crc kubenswrapper[4702]: I1203 11:03:46.090483 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:03:46 crc kubenswrapper[4702]: I1203 11:03:46.090546 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:03:46 crc kubenswrapper[4702]: I1203 11:03:46.090562 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:03:46 crc kubenswrapper[4702]: E1203 11:03:46.969558 4702 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 03 11:03:47 crc kubenswrapper[4702]: I1203 11:03:47.935335 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 11:03:47 crc kubenswrapper[4702]: I1203 11:03:47.935673 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:03:47 crc kubenswrapper[4702]: I1203 11:03:47.937315 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:03:47 crc kubenswrapper[4702]: I1203 11:03:47.937369 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:03:47 crc kubenswrapper[4702]: I1203 11:03:47.937387 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:03:47 crc kubenswrapper[4702]: I1203 11:03:47.939588 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 11:03:48 crc kubenswrapper[4702]: I1203 11:03:48.014154 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:03:48 crc kubenswrapper[4702]: I1203 11:03:48.015382 
4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:03:48 crc kubenswrapper[4702]: I1203 11:03:48.015418 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:03:48 crc kubenswrapper[4702]: I1203 11:03:48.015438 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:03:50 crc kubenswrapper[4702]: W1203 11:03:50.168916 4702 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 03 11:03:50 crc kubenswrapper[4702]: I1203 11:03:50.169284 4702 trace.go:236] Trace[299384544]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 11:03:40.167) (total time: 10001ms): Dec 03 11:03:50 crc kubenswrapper[4702]: Trace[299384544]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:03:50.168) Dec 03 11:03:50 crc kubenswrapper[4702]: Trace[299384544]: [10.001680787s] [10.001680787s] END Dec 03 11:03:50 crc kubenswrapper[4702]: E1203 11:03:50.169334 4702 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 03 11:03:50 crc kubenswrapper[4702]: I1203 11:03:50.480214 4702 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 11:03:50 crc kubenswrapper[4702]: I1203 11:03:50.480324 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 11:03:50 crc kubenswrapper[4702]: I1203 11:03:50.752472 4702 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 03 11:03:50 crc kubenswrapper[4702]: I1203 11:03:50.936268 4702 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 11:03:50 crc kubenswrapper[4702]: I1203 11:03:50.936375 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: 
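
[Editor's note] The startup probes above progress from timeouts, to an HTTP 403 ("system:anonymous cannot get path /livez", further down), to success as kube-apiserver finishes starting. A hypothetical Go probe in that spirit: short timeout, TLS verification skipped (no CA bundle is assumed here), status code reported. This is a triage illustration, not kubelet's prober:

package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

// Probe the /livez endpoint seen in the log and report either the
// transport error (refused / timeout) or the HTTP status code (403
// until anonymous access is permitted, then 200).
func main() {
	client := &http.Client{
		Timeout: 5 * time.Second,
		Transport: &http.Transport{
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true}, // no CA bundle in this sketch
		},
	}
	resp, err := client.Get("https://192.168.126.11:6443/livez") // URL from the log
	if err != nil {
		fmt.Println("probe error:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("probe status:", resp.StatusCode)
}
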
Dec 03 11:03:50 crc kubenswrapper[4702]: I1203 11:03:50.936375 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 11:03:51 crc kubenswrapper[4702]: W1203 11:03:51.166323 4702 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Dec 03 11:03:51 crc kubenswrapper[4702]: I1203 11:03:51.166490 4702 trace.go:236] Trace[405603169]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 11:03:41.164) (total time: 10001ms):
Dec 03 11:03:51 crc kubenswrapper[4702]: Trace[405603169]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:03:51.166)
Dec 03 11:03:51 crc kubenswrapper[4702]: Trace[405603169]: [10.001977183s] [10.001977183s] END
Dec 03 11:03:51 crc kubenswrapper[4702]: E1203 11:03:51.166537 4702 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Dec 03 11:03:51 crc kubenswrapper[4702]: W1203 11:03:51.286746 4702 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Dec 03 11:03:51 crc kubenswrapper[4702]: I1203 11:03:51.286870 4702 trace.go:236] Trace[872929911]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 11:03:41.285) (total time: 10001ms):
Dec 03 11:03:51 crc kubenswrapper[4702]: Trace[872929911]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:03:51.286)
Dec 03 11:03:51 crc kubenswrapper[4702]: Trace[872929911]: [10.001473471s] [10.001473471s] END
Dec 03 11:03:51 crc kubenswrapper[4702]: E1203 11:03:51.286896 4702 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Dec 03 11:03:51 crc kubenswrapper[4702]: I1203 11:03:51.349889 4702 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Dec 03 11:03:51 crc kubenswrapper[4702]: I1203 11:03:51.349996 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Dec 03 11:03:54 crc kubenswrapper[4702]: I1203 11:03:54.453484 4702 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 03 11:03:55 crc kubenswrapper[4702]: I1203 11:03:55.490235 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:03:55 crc kubenswrapper[4702]: I1203 11:03:55.490503 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:55 crc kubenswrapper[4702]: I1203 11:03:55.492396 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:55 crc kubenswrapper[4702]: I1203 11:03:55.492475 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:55 crc kubenswrapper[4702]: I1203 11:03:55.492491 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:55 crc kubenswrapper[4702]: I1203 11:03:55.497457 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.039720 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.041267 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.041353 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.041375 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.112873 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.113082 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.114584 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.114650 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.114669 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.127046 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.310949 4702 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Dec 03 11:03:56 crc kubenswrapper[4702]: E1203 11:03:56.346261 4702 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.348964 4702 trace.go:236] Trace[191454355]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 11:03:43.764) (total time: 12584ms):
Dec 03 11:03:56 crc kubenswrapper[4702]: Trace[191454355]: ---"Objects listed" error: 12584ms (11:03:56.348)
Dec 03 11:03:56 crc kubenswrapper[4702]: Trace[191454355]: [12.584175533s] [12.584175533s] END
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.349020 4702 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.351392 4702 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Dec 03 11:03:56 crc kubenswrapper[4702]: E1203 11:03:56.352817 4702 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.364748 4702 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.390271 4702 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:59302->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.390316 4702 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:59308->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.390420 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:59308->192.168.126.11:17697: read: connection reset by peer"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.390336 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:59302->192.168.126.11:17697: read: connection reset by peer"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.390854 4702 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.390880 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.391006 4702 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 
192.168.126.11:17697: connect: connection refused" start-of-body= Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.391021 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.584816 4702 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.749781 4702 apiserver.go:52] "Watching apiserver" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.752344 4702 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.752658 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.753721 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.753808 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.753737 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:03:56 crc kubenswrapper[4702]: E1203 11:03:56.753882 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.753936 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 11:03:56 crc kubenswrapper[4702]: E1203 11:03:56.754001 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.754053 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.754267 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:03:56 crc kubenswrapper[4702]: E1203 11:03:56.754321 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.754981 4702 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.755770 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.755842 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.755864 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.756269 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.756274 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.756838 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.756860 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.758654 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.758880 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.785301 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.798526 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.808827 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.818884 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.828427 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.837873 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.848375 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853353 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853417 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853448 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853477 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853506 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853527 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853549 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853564 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853580 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853600 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853618 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853635 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853694 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853746 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853783 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853802 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853818 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853839 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853854 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853889 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853921 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853938 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853956 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853975 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.853990 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.854006 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.854022 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.854042 4702 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.854028 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.854086 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.854058 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.854116 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.854167 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.854241 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.854310 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.854339 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.854356 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.854399 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.854436 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.854521 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.854475 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.854570 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.854593 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.854721 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.854894 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.854953 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.855189 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.855547 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.855770 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.855888 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.855906 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.855986 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.856019 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.856044 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.856148 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.856191 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.856219 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.856399 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.856480 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.856533 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.856890 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.856948 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.857026 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.857096 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.857214 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.857229 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.857349 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.857517 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.857620 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.857633 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.857741 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.857798 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.857591 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.857916 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.857974 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 11:03:56 crc kubenswrapper[4702]: E1203 11:03:56.858094 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:03:57.358048101 +0000 UTC m=+21.193976725 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.858184 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.858271 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.858774 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.858802 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.858853 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.859288 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.859467 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.859479 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.859536 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.859808 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.859830 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.859876 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.859911 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.860010 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.860034 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.860295 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.860392 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.860401 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.860467 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.860595 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.860627 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.860669 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.860468 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.860489 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.860582 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.860676 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.860704 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.860960 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861010 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861050 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861087 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861124 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861163 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861195 4702 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861236 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861278 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861314 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861353 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861384 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861417 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861454 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861488 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861520 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861555 4702 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861594 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861638 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861676 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861707 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861741 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861801 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861838 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861872 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861927 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.862478 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.862547 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.862749 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.862815 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.862858 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.862889 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.862927 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.862961 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.862989 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.860965 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: 
"etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861415 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.863091 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861463 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861879 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861604 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.861873 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.862369 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.862975 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.862987 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.863029 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.863361 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.863422 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.863465 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.863495 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.863530 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.863537 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.863698 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.863746 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.863808 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.863850 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.863891 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.863930 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.863974 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864019 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864064 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864108 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864169 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864277 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864338 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864366 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864393 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864424 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864460 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864503 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864529 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864557 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864581 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864610 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864638 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864662 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864686 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864707 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864732 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864779 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 
11:03:56.864814 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864841 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864871 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864894 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864914 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864942 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864969 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864992 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865016 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865046 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865067 4702 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865096 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865123 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865154 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865175 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865203 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865228 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865248 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865282 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865308 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 03 11:03:56 crc 
kubenswrapper[4702]: I1203 11:03:56.865327 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865349 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865374 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865399 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865424 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865460 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865503 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865529 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865558 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865584 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865607 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865629 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865651 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865674 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865694 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865721 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865746 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865781 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865804 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865831 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865855 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865878 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865907 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865932 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865953 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865980 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866010 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866037 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866057 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866082 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866109 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866130 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866155 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866179 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866202 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866228 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866253 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866279 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866303 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866329 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866353 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866377 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866400 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866429 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866457 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866489 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866520 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866546 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866588 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866626 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866735 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866799 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866840 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866873 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866901 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866927 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866960 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866990 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867019 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867055 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867090 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867131 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867173 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867207 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867412 4702 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867463 4702 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867482 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867499 4702 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867513 4702 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867536 4702 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867550 4702 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867565 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867584 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867605 4702 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867626 4702 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867640 4702 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867653 4702 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867671 4702 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867688 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867704 4702 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867724 4702 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867738 4702 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867808 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867826 4702 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867847 4702 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867860 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867874 4702 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867888 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868302 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868382 4702 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868422 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868455 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868489 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868481 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868515 4702 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868553 4702 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868585 4702 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868615 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868632 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868649 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868668 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868682 4702 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868699 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868714 4702 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868736 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868774 4702 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868793 4702 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868808 4702 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868825 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868836 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868847 4702 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868860 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868875 4702 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868886 4702 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868898 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868917 4702 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868931 4702 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868942 4702 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868955 4702 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868972 4702 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868985 4702 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868999 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.863620 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.871046 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.871342 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.871633 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.871935 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.872015 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.872031 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.872165 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.863643 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.863847 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.872357 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864013 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864263 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864537 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864940 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864942 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864992 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865009 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865098 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865428 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865449 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.865733 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866058 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866190 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866456 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866535 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866534 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866811 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866746 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866959 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.866964 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.867100 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868650 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868673 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.868697 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.869105 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.869184 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.869327 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.869359 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.869354 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.869527 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.869691 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.869932 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.870315 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: E1203 11:03:56.870402 4702 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.872745 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: E1203 11:03:56.872872 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 11:03:57.372843389 +0000 UTC m=+21.208771853 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.873017 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.873534 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.873546 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.864007 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.870785 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: E1203 11:03:56.872355 4702 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.874042 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.874181 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: E1203 11:03:56.874329 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 11:03:57.374315872 +0000 UTC m=+21.210244336 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.870647 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.874592 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.874925 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.875072 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.875199 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.875266 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.875553 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.875578 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.875842 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.875942 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.876235 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.876329 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds".
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.876533 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.863616 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.874820 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.876642 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.876902 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.876959 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.877299 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.877423 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.877714 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.879098 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.879304 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.879448 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.879620 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.880030 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.880040 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.880363 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.880565 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.880961 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.881088 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.880982 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.881240 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.881916 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.882425 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.882827 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.872246 4702 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.869528 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.883354 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.878804 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.884024 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.884204 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.884395 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.884573 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.884843 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.885054 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.885166 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.874007 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.885358 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.885434 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.885747 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). 
InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.885907 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.891462 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 11:03:56 crc kubenswrapper[4702]: E1203 11:03:56.892166 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 11:03:56 crc kubenswrapper[4702]: E1203 11:03:56.892197 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 11:03:56 crc kubenswrapper[4702]: E1203 11:03:56.892217 4702 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:03:56 crc kubenswrapper[4702]: E1203 11:03:56.892304 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 11:03:57.392279911 +0000 UTC m=+21.228208525 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.892526 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.892677 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.893191 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.893694 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.894833 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.895418 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.896481 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 11:03:56 crc kubenswrapper[4702]: E1203 11:03:56.898812 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 11:03:56 crc kubenswrapper[4702]: E1203 11:03:56.898838 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 11:03:56 crc kubenswrapper[4702]: E1203 11:03:56.898851 4702 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:03:56 crc kubenswrapper[4702]: E1203 11:03:56.898918 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 11:03:57.398903347 +0000 UTC m=+21.234832031 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.899588 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.899619 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.899882 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.900101 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.900342 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.900454 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.900880 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.901427 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.901466 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.901475 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.901889 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.901986 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.901943 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.902078 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.902487 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.902848 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.902964 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.902932 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.903072 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.903128 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.903204 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.903305 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.903353 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.903440 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.903637 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.903973 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.904161 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.904228 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.904380 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.904776 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.905948 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.920361 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.931398 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.933425 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.934206 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.936044 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.937061 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.937953 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.938123 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.938222 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.939244 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.940118 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.941268 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 03 11:03:56 crc 
kubenswrapper[4702]: I1203 11:03:56.941888 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.942785 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.943333 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.944420 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.944936 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.946026 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.946348 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.946659 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.947277 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.948291 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.948783 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.949816 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.950377 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.950831 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.951801 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.952243 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.953271 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.953774 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.954646 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.954942 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.955641 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.956186 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.957207 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.957735 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" 
path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.958776 4702 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.958883 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.960566 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.961574 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.962236 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.963609 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.963915 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.964585 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.965901 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.966916 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.968270 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.968935 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 03 11:03:56 crc 
kubenswrapper[4702]: I1203 11:03:56.969237 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969286 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969347 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969364 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969394 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969412 4702 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969424 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969437 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969449 4702 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969461 4702 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969473 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969486 4702 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969498 4702 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969423 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969514 4702 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969547 4702 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969563 4702 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969575 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969588 4702 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969599 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969612 4702 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969624 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969638 4702 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969650 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969662 4702 
reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969675 4702 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969687 4702 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969700 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969715 4702 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969730 4702 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969742 4702 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969752 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969789 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969803 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969814 4702 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969828 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969840 4702 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969851 4702 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969863 4702 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969875 4702 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969888 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969899 4702 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969935 4702 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969949 4702 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969962 4702 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969974 4702 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.969988 4702 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970001 4702 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970011 4702 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970023 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 03 
11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970034 4702 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970046 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970060 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970071 4702 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970082 4702 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970094 4702 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970107 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970119 4702 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970129 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970138 4702 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970151 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970162 4702 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970173 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 
11:03:56.970187 4702 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970197 4702 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970207 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970219 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970220 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970229 4702 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970337 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970349 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970360 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970371 4702 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970382 4702 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970397 4702 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970408 4702 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970419 4702 
reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970431 4702 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970443 4702 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970454 4702 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970466 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970480 4702 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970492 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970505 4702 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970525 4702 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970537 4702 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970549 4702 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970560 4702 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970570 4702 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc 
kubenswrapper[4702]: I1203 11:03:56.970586 4702 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970597 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970609 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970620 4702 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970630 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970645 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970657 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970668 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970680 4702 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970691 4702 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970709 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970722 4702 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970734 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" 
DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970746 4702 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970773 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970787 4702 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970798 4702 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970812 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970968 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970984 4702 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.970997 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971047 4702 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971059 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971070 4702 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971082 4702 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971095 4702 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" 
DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971107 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971118 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971129 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971140 4702 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971152 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971163 4702 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971177 4702 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971189 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971200 4702 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971210 4702 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971221 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971233 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971244 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath 
\"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971255 4702 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971266 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971277 4702 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971295 4702 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971307 4702 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971318 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971329 4702 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971342 4702 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971354 4702 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971367 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971380 4702 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971392 4702 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971404 4702 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: 
I1203 11:03:56.971418 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.971566 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.972496 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.973534 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.974209 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.975574 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.976404 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.976567 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.976928 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.978080 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.978678 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.980198 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.981012 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.981627 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 03 11:03:56 crc kubenswrapper[4702]: I1203 11:03:56.986681 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.044695 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.046746 4702 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e" exitCode=255 Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.046850 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e"} Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.058545 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.060929 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.061431 4702 scope.go:117] "RemoveContainer" containerID="47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e" Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.066249 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.067776 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.074105 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.076105 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.083478 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.084384 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 11:03:57 crc kubenswrapper[4702]: W1203 11:03:57.088045 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-0ae7c740ac57bc8f6387f65281c3e965e8b7d265469798ce245cf7a0c028c21a WatchSource:0}: Error finding container 0ae7c740ac57bc8f6387f65281c3e965e8b7d265469798ce245cf7a0c028c21a: Status 404 returned error can't find the container with id 0ae7c740ac57bc8f6387f65281c3e965e8b7d265469798ce245cf7a0c028c21a Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.096750 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 11:03:57 crc kubenswrapper[4702]: W1203 11:03:57.101902 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-69ca64669d764f3438b6a2ad93aa510fe07d2f82b8abfe80e430d4aa872b53ca WatchSource:0}: Error finding container 69ca64669d764f3438b6a2ad93aa510fe07d2f82b8abfe80e430d4aa872b53ca: Status 404 returned error can't find the container with id 69ca64669d764f3438b6a2ad93aa510fe07d2f82b8abfe80e430d4aa872b53ca Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.107911 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.126003 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.404538 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.404623 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.404650 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.404679 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.404697 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:03:57 crc kubenswrapper[4702]: E1203 11:03:57.404863 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:03:58.40481533 +0000 UTC m=+22.240743784 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:03:57 crc kubenswrapper[4702]: E1203 11:03:57.404918 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 11:03:57 crc kubenswrapper[4702]: E1203 11:03:57.404940 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 11:03:57 crc kubenswrapper[4702]: E1203 11:03:57.404953 4702 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:03:57 crc kubenswrapper[4702]: E1203 11:03:57.404974 4702 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 11:03:57 crc kubenswrapper[4702]: E1203 11:03:57.405017 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 11:03:58.404997924 +0000 UTC m=+22.240926388 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:03:57 crc kubenswrapper[4702]: E1203 11:03:57.405037 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 11:03:58.405029125 +0000 UTC m=+22.240957589 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 11:03:57 crc kubenswrapper[4702]: E1203 11:03:57.405081 4702 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 11:03:57 crc kubenswrapper[4702]: E1203 11:03:57.405114 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-03 11:03:58.405105197 +0000 UTC m=+22.241033661 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 11:03:57 crc kubenswrapper[4702]: E1203 11:03:57.405086 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 11:03:57 crc kubenswrapper[4702]: E1203 11:03:57.405146 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 11:03:57 crc kubenswrapper[4702]: E1203 11:03:57.405159 4702 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:03:57 crc kubenswrapper[4702]: E1203 11:03:57.405189 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 11:03:58.405183328 +0000 UTC m=+22.241111792 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.927665 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:03:57 crc kubenswrapper[4702]: E1203 11:03:57.927815 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.965006 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.970583 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 11:03:57 crc kubenswrapper[4702]: I1203 11:03:57.979462 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.045480 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.136073 4702 csr.go:261] certificate signing request csr-lvncm is approved, waiting to be issued Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.216005 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03
T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.300991 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d24b84ba63a1a7f9baa4b35f5fae7204b7491590d2597dae045394f4a7ab66a0"} Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.302596 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c"} Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.302624 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea"} Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.302636 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"69ca64669d764f3438b6a2ad93aa510fe07d2f82b8abfe80e430d4aa872b53ca"} Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.306607 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075"} Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.306641 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0ae7c740ac57bc8f6387f65281c3e965e8b7d265469798ce245cf7a0c028c21a"} Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.308241 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.309630 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8"} Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.369283 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:58Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.398267 4702 csr.go:257] certificate signing request csr-lvncm is issued Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.435789 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:58Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.451791 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.451870 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.451916 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.451951 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.451988 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:03:58 crc kubenswrapper[4702]: E1203 11:03:58.452083 4702 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 11:03:58 crc kubenswrapper[4702]: E1203 11:03:58.452134 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 11:04:00.452119761 +0000 UTC m=+24.288048215 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 11:03:58 crc kubenswrapper[4702]: E1203 11:03:58.452472 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 11:03:58 crc kubenswrapper[4702]: E1203 11:03:58.452489 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 11:03:58 crc kubenswrapper[4702]: E1203 11:03:58.452500 4702 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:03:58 crc kubenswrapper[4702]: E1203 11:03:58.452525 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 11:04:00.45251761 +0000 UTC m=+24.288446074 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:03:58 crc kubenswrapper[4702]: E1203 11:03:58.452872 4702 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 11:03:58 crc kubenswrapper[4702]: E1203 11:03:58.452933 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 11:03:58 crc kubenswrapper[4702]: E1203 11:03:58.452953 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 11:03:58 crc kubenswrapper[4702]: E1203 11:03:58.452966 4702 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:03:58 crc kubenswrapper[4702]: E1203 11:03:58.452965 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 11:04:00.45294176 +0000 UTC m=+24.288870294 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 11:03:58 crc kubenswrapper[4702]: E1203 11:03:58.453014 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 11:04:00.453002451 +0000 UTC m=+24.288930985 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:03:58 crc kubenswrapper[4702]: E1203 11:03:58.453102 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:04:00.453088613 +0000 UTC m=+24.289017147 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.567095 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:58Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.594902 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:58Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.615890 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:58Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.631717 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:58Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.646927 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:58Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.684169 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:58Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.714242 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:58Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.733419 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:58Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.753063 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:58Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.775221 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:58Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.793905 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:58Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.806597 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:58Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.824124 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:58Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.928696 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:03:58 crc kubenswrapper[4702]: E1203 11:03:58.928848 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:03:58 crc kubenswrapper[4702]: I1203 11:03:58.928910 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:03:58 crc kubenswrapper[4702]: E1203 11:03:58.928961 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.140421 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-b7lmv"] Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.140821 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-b7lmv" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.146916 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.147089 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.153842 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.184317 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.212730 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.226316 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.229703 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/418b87ff-9c02-4b43-9bc3-3ce38c1df3a1-hosts-file\") pod \"node-resolver-b7lmv\" (UID: \"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\") " pod="openshift-dns/node-resolver-b7lmv" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.229800 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sbgn\" (UniqueName: \"kubernetes.io/projected/418b87ff-9c02-4b43-9bc3-3ce38c1df3a1-kube-api-access-9sbgn\") pod \"node-resolver-b7lmv\" (UID: \"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\") " pod="openshift-dns/node-resolver-b7lmv" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.242619 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.257488 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.276876 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.289649 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.306069 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.311933 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.320954 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.330695 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/418b87ff-9c02-4b43-9bc3-3ce38c1df3a1-hosts-file\") pod \"node-resolver-b7lmv\" (UID: \"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\") " pod="openshift-dns/node-resolver-b7lmv" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.330793 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sbgn\" (UniqueName: \"kubernetes.io/projected/418b87ff-9c02-4b43-9bc3-3ce38c1df3a1-kube-api-access-9sbgn\") pod \"node-resolver-b7lmv\" (UID: \"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\") " pod="openshift-dns/node-resolver-b7lmv" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.330855 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/418b87ff-9c02-4b43-9bc3-3ce38c1df3a1-hosts-file\") pod \"node-resolver-b7lmv\" (UID: \"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\") " pod="openshift-dns/node-resolver-b7lmv" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.341574 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaa
a7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.355002 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sbgn\" (UniqueName: \"kubernetes.io/projected/418b87ff-9c02-4b43-9bc3-3ce38c1df3a1-kube-api-access-9sbgn\") pod \"node-resolver-b7lmv\" (UID: \"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\") " pod="openshift-dns/node-resolver-b7lmv" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.399568 4702 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-12-03 10:58:58 +0000 UTC, rotation deadline is 2026-09-11 03:45:39.244474553 +0000 UTC Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.399621 4702 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6760h41m39.844856833s for next certificate rotation Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.454887 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-b7lmv" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.574946 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-pqn7q"] Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.575375 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.575444 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-z8lld"] Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.576451 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.579175 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-qf5sd"] Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.579788 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:03:59 crc kubenswrapper[4702]: W1203 11:03:59.581734 4702 reflector.go:561] object-"openshift-multus"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Dec 03 11:03:59 crc kubenswrapper[4702]: E1203 11:03:59.581813 4702 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 11:03:59 crc kubenswrapper[4702]: W1203 11:03:59.581738 4702 reflector.go:561] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": failed to list *v1.Secret: secrets "multus-ancillary-tools-dockercfg-vnmsz" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Dec 03 11:03:59 crc kubenswrapper[4702]: E1203 11:03:59.581854 4702 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vnmsz\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"multus-ancillary-tools-dockercfg-vnmsz\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 11:03:59 crc kubenswrapper[4702]: W1203 11:03:59.582612 4702 reflector.go:561] object-"openshift-multus"/"cni-copy-resources": failed to list *v1.ConfigMap: configmaps "cni-copy-resources" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.582731 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.582736 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 11:03:59 crc kubenswrapper[4702]: E1203 11:03:59.582956 4702 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"cni-copy-resources\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"cni-copy-resources\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.583159 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.583593 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.587203 4702 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.589006 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.592994 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.593002 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.593581 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.633353 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0bdf4071-59bc-4d40-80ee-20027ce42805-system-cni-dir\") pod \"multus-additional-cni-plugins-z8lld\" (UID: \"0bdf4071-59bc-4d40-80ee-20027ce42805\") " pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.633413 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d2e03cb6-21dc-460c-a68e-17aafd79e258-proxy-tls\") pod \"machine-config-daemon-qf5sd\" (UID: \"d2e03cb6-21dc-460c-a68e-17aafd79e258\") " pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.633443 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-multus-socket-dir-parent\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.633465 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/72be0494-b56e-4d46-8300-decd11c66d66-multus-daemon-config\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.633487 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp2zm\" (UniqueName: \"kubernetes.io/projected/d2e03cb6-21dc-460c-a68e-17aafd79e258-kube-api-access-jp2zm\") pod \"machine-config-daemon-qf5sd\" (UID: \"d2e03cb6-21dc-460c-a68e-17aafd79e258\") " pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.633512 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/72be0494-b56e-4d46-8300-decd11c66d66-cni-binary-copy\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.633533 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-host-var-lib-cni-multus\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.633559 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0bdf4071-59bc-4d40-80ee-20027ce42805-os-release\") pod \"multus-additional-cni-plugins-z8lld\" (UID: \"0bdf4071-59bc-4d40-80ee-20027ce42805\") " pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.633583 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-hostroot\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.633601 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vq2s\" (UniqueName: \"kubernetes.io/projected/72be0494-b56e-4d46-8300-decd11c66d66-kube-api-access-6vq2s\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.633622 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-multus-cni-dir\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.633642 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-host-run-netns\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.633690 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-cnibin\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.633711 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d2e03cb6-21dc-460c-a68e-17aafd79e258-rootfs\") pod \"machine-config-daemon-qf5sd\" (UID: \"d2e03cb6-21dc-460c-a68e-17aafd79e258\") " pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.633727 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf6j8\" (UniqueName: \"kubernetes.io/projected/0bdf4071-59bc-4d40-80ee-20027ce42805-kube-api-access-cf6j8\") pod \"multus-additional-cni-plugins-z8lld\" (UID: \"0bdf4071-59bc-4d40-80ee-20027ce42805\") " pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.633745 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-os-release\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.633800 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-host-run-k8s-cni-cncf-io\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.633824 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-system-cni-dir\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.633878 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0bdf4071-59bc-4d40-80ee-20027ce42805-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z8lld\" (UID: \"0bdf4071-59bc-4d40-80ee-20027ce42805\") " pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.634047 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d2e03cb6-21dc-460c-a68e-17aafd79e258-mcd-auth-proxy-config\") pod \"machine-config-daemon-qf5sd\" (UID: \"d2e03cb6-21dc-460c-a68e-17aafd79e258\") " pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.634122 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-host-var-lib-kubelet\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.634186 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-host-run-multus-certs\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.634250 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-multus-conf-dir\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.634313 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-etc-kubernetes\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.634356 4702 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0bdf4071-59bc-4d40-80ee-20027ce42805-cnibin\") pod \"multus-additional-cni-plugins-z8lld\" (UID: \"0bdf4071-59bc-4d40-80ee-20027ce42805\") " pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.634380 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0bdf4071-59bc-4d40-80ee-20027ce42805-cni-binary-copy\") pod \"multus-additional-cni-plugins-z8lld\" (UID: \"0bdf4071-59bc-4d40-80ee-20027ce42805\") " pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.634424 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0bdf4071-59bc-4d40-80ee-20027ce42805-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z8lld\" (UID: \"0bdf4071-59bc-4d40-80ee-20027ce42805\") " pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.634463 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-host-var-lib-cni-bin\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.687574 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.735023 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0bdf4071-59bc-4d40-80ee-20027ce42805-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z8lld\" (UID: \"0bdf4071-59bc-4d40-80ee-20027ce42805\") " pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.735546 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-system-cni-dir\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.735582 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-host-var-lib-kubelet\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.735601 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d2e03cb6-21dc-460c-a68e-17aafd79e258-mcd-auth-proxy-config\") pod \"machine-config-daemon-qf5sd\" (UID: \"d2e03cb6-21dc-460c-a68e-17aafd79e258\") " pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.735628 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-host-run-multus-certs\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.735649 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-multus-conf-dir\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.735664 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-etc-kubernetes\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.735680 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0bdf4071-59bc-4d40-80ee-20027ce42805-cnibin\") pod \"multus-additional-cni-plugins-z8lld\" (UID: \"0bdf4071-59bc-4d40-80ee-20027ce42805\") " pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.735698 4702 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0bdf4071-59bc-4d40-80ee-20027ce42805-cni-binary-copy\") pod \"multus-additional-cni-plugins-z8lld\" (UID: \"0bdf4071-59bc-4d40-80ee-20027ce42805\") " pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.735775 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0bdf4071-59bc-4d40-80ee-20027ce42805-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z8lld\" (UID: \"0bdf4071-59bc-4d40-80ee-20027ce42805\") " pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.735842 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-host-run-multus-certs\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.735890 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-multus-conf-dir\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.735774 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-system-cni-dir\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.735786 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0bdf4071-59bc-4d40-80ee-20027ce42805-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z8lld\" (UID: \"0bdf4071-59bc-4d40-80ee-20027ce42805\") " pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.735938 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-host-var-lib-kubelet\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.735971 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-host-var-lib-cni-bin\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736004 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d2e03cb6-21dc-460c-a68e-17aafd79e258-proxy-tls\") pod \"machine-config-daemon-qf5sd\" (UID: \"d2e03cb6-21dc-460c-a68e-17aafd79e258\") " pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736031 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0bdf4071-59bc-4d40-80ee-20027ce42805-system-cni-dir\") pod \"multus-additional-cni-plugins-z8lld\" (UID: \"0bdf4071-59bc-4d40-80ee-20027ce42805\") " pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736038 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0bdf4071-59bc-4d40-80ee-20027ce42805-cnibin\") pod \"multus-additional-cni-plugins-z8lld\" (UID: \"0bdf4071-59bc-4d40-80ee-20027ce42805\") " pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736060 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-multus-socket-dir-parent\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736107 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-multus-socket-dir-parent\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736145 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/72be0494-b56e-4d46-8300-decd11c66d66-multus-daemon-config\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736175 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp2zm\" (UniqueName: \"kubernetes.io/projected/d2e03cb6-21dc-460c-a68e-17aafd79e258-kube-api-access-jp2zm\") pod \"machine-config-daemon-qf5sd\" (UID: \"d2e03cb6-21dc-460c-a68e-17aafd79e258\") " pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736207 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/72be0494-b56e-4d46-8300-decd11c66d66-cni-binary-copy\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736212 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-host-var-lib-cni-bin\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736232 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-host-var-lib-cni-multus\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736256 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/0bdf4071-59bc-4d40-80ee-20027ce42805-os-release\") pod \"multus-additional-cni-plugins-z8lld\" (UID: \"0bdf4071-59bc-4d40-80ee-20027ce42805\") " pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736285 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vq2s\" (UniqueName: \"kubernetes.io/projected/72be0494-b56e-4d46-8300-decd11c66d66-kube-api-access-6vq2s\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736305 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-multus-cni-dir\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736335 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-host-run-netns\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736357 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-hostroot\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736374 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0bdf4071-59bc-4d40-80ee-20027ce42805-system-cni-dir\") pod \"multus-additional-cni-plugins-z8lld\" (UID: \"0bdf4071-59bc-4d40-80ee-20027ce42805\") " pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736453 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-cnibin\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736481 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d2e03cb6-21dc-460c-a68e-17aafd79e258-rootfs\") pod \"machine-config-daemon-qf5sd\" (UID: \"d2e03cb6-21dc-460c-a68e-17aafd79e258\") " pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736529 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-os-release\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736554 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf6j8\" (UniqueName: \"kubernetes.io/projected/0bdf4071-59bc-4d40-80ee-20027ce42805-kube-api-access-cf6j8\") pod \"multus-additional-cni-plugins-z8lld\" (UID: \"0bdf4071-59bc-4d40-80ee-20027ce42805\") " 
pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736558 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-hostroot\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736449 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-host-var-lib-cni-multus\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736607 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-etc-kubernetes\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736636 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-multus-cni-dir\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736614 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0bdf4071-59bc-4d40-80ee-20027ce42805-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z8lld\" (UID: \"0bdf4071-59bc-4d40-80ee-20027ce42805\") " pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736639 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-host-run-k8s-cni-cncf-io\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736594 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-host-run-k8s-cni-cncf-io\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736667 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-host-run-netns\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736749 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d2e03cb6-21dc-460c-a68e-17aafd79e258-rootfs\") pod \"machine-config-daemon-qf5sd\" (UID: \"d2e03cb6-21dc-460c-a68e-17aafd79e258\") " pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736785 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/0bdf4071-59bc-4d40-80ee-20027ce42805-os-release\") pod \"multus-additional-cni-plugins-z8lld\" (UID: \"0bdf4071-59bc-4d40-80ee-20027ce42805\") " pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736801 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-cnibin\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.736805 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/72be0494-b56e-4d46-8300-decd11c66d66-os-release\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.738632 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/72be0494-b56e-4d46-8300-decd11c66d66-multus-daemon-config\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.739014 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d2e03cb6-21dc-460c-a68e-17aafd79e258-mcd-auth-proxy-config\") pod \"machine-config-daemon-qf5sd\" (UID: \"d2e03cb6-21dc-460c-a68e-17aafd79e258\") " pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.751039 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d2e03cb6-21dc-460c-a68e-17aafd79e258-proxy-tls\") pod \"machine-config-daemon-qf5sd\" (UID: \"d2e03cb6-21dc-460c-a68e-17aafd79e258\") " pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.754028 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.768560 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp2zm\" (UniqueName: \"kubernetes.io/projected/d2e03cb6-21dc-460c-a68e-17aafd79e258-kube-api-access-jp2zm\") pod \"machine-config-daemon-qf5sd\" (UID: \"d2e03cb6-21dc-460c-a68e-17aafd79e258\") " pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.775045 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.800666 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.817240 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.836384 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.858176 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.872495 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.888181 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.922578 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.927191 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:03:59 crc kubenswrapper[4702]: E1203 11:03:59.927317 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.932382 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:03:59 crc kubenswrapper[4702]: W1203 11:03:59.943802 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2e03cb6_21dc_460c_a68e_17aafd79e258.slice/crio-71e2aa34ca991d9812ae4ef54b9eb85955934adf72518d371b1782c17f7829c5 WatchSource:0}: Error finding container 71e2aa34ca991d9812ae4ef54b9eb85955934adf72518d371b1782c17f7829c5: Status 404 returned error can't find the container with id 71e2aa34ca991d9812ae4ef54b9eb85955934adf72518d371b1782c17f7829c5 Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.947311 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd
51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.960003 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.973402 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:03:59 crc kubenswrapper[4702]: I1203 11:03:59.993193 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:03:59Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.009112 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.026581 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.038979 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.052635 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.064203 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mt92m"] Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.065322 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.067717 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.067957 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.067967 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.068560 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.068580 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.068589 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.068567 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.075149 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.133377 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.139991 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-run-systemd\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 
11:04:00.140046 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-ovnkube-script-lib\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.140097 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88lx4\" (UniqueName: \"kubernetes.io/projected/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-kube-api-access-88lx4\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.140132 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-etc-openvswitch\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.140203 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-kubelet\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.140249 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-run-ovn\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.140285 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-run-netns\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.140315 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-run-openvswitch\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.140334 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-cni-bin\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.140352 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-slash\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.140395 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-ovnkube-config\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.140421 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-var-lib-openvswitch\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.140438 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-log-socket\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.140467 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-env-overrides\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.140482 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-ovn-node-metrics-cert\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.140512 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-run-ovn-kubernetes\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.140554 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-systemd-units\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.140585 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.140644 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-node-log\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.140663 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-cni-netd\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.159150 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.184683 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaa
a7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.204165 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.217731 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.234311 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241282 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-node-log\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241336 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-cni-netd\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241362 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-ovnkube-script-lib\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241379 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88lx4\" (UniqueName: \"kubernetes.io/projected/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-kube-api-access-88lx4\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241416 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-run-systemd\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241432 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-etc-openvswitch\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241447 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-kubelet\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241464 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-run-ovn\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241485 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-run-netns\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241504 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-run-openvswitch\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241518 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-cni-bin\") pod 
\"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241538 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-ovnkube-config\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241556 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-slash\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241571 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-var-lib-openvswitch\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241584 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-log-socket\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241604 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-env-overrides\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241620 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-ovn-node-metrics-cert\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241642 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-run-ovn-kubernetes\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241659 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241676 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-systemd-units\") pod \"ovnkube-node-mt92m\" 
(UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241769 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-systemd-units\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241811 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-node-log\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.241832 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-cni-netd\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.242446 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-cni-bin\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.242514 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-log-socket\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.242560 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-slash\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.242637 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-ovnkube-script-lib\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.242713 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-run-systemd\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.242744 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-etc-openvswitch\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.242788 4702 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-kubelet\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.242814 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-run-ovn\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.242835 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-run-openvswitch\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.242861 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-run-ovn-kubernetes\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.242876 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-run-netns\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.242914 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.243174 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-env-overrides\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.243741 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-ovnkube-config\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.242612 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-var-lib-openvswitch\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.246035 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-ovn-node-metrics-cert\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.258956 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88lx4\" (UniqueName: \"kubernetes.io/projected/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-kube-api-access-88lx4\") pod \"ovnkube-node-mt92m\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") " pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.266130 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214
824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.294907 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.307926 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.316091 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-b7lmv" event={"ID":"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1","Type":"ContainerStarted","Data":"ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e"} Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.316173 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-b7lmv" event={"ID":"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1","Type":"ContainerStarted","Data":"a69b80b61469bc59b07c9bbcaeb7d1853325c11b2e49039cd0ff6e7d9e24196b"} Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.317734 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9"} Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.317803 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4"} Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.317825 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"71e2aa34ca991d9812ae4ef54b9eb85955934adf72518d371b1782c17f7829c5"} Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.323553 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.379278 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.394969 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 
03 11:04:00 crc kubenswrapper[4702]: W1203 11:04:00.397953 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffa7620d_1ec2_4a53_ad2e_df64bb9aeac3.slice/crio-00be1d4251a12dcc580afeaeedac98ebbb9c6320647fe866bcb421f86823bccd WatchSource:0}: Error finding container 00be1d4251a12dcc580afeaeedac98ebbb9c6320647fe866bcb421f86823bccd: Status 404 returned error can't find the container with id 00be1d4251a12dcc580afeaeedac98ebbb9c6320647fe866bcb421f86823bccd Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.415614 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.417619 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0bdf4071-59bc-4d40-80ee-20027ce42805-cni-binary-copy\") pod \"multus-additional-cni-plugins-z8lld\" (UID: \"0bdf4071-59bc-4d40-80ee-20027ce42805\") " pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.418486 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/72be0494-b56e-4d46-8300-decd11c66d66-cni-binary-copy\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.567169 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.567274 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.567322 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:00 crc kubenswrapper[4702]: E1203 11:04:00.567355 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:04:04.567328965 +0000 UTC m=+28.403257429 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.567404 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:00 crc kubenswrapper[4702]: E1203 11:04:00.567443 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 11:04:00 crc kubenswrapper[4702]: E1203 11:04:00.567464 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.567465 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:00 crc kubenswrapper[4702]: E1203 11:04:00.567476 4702 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:04:00 crc kubenswrapper[4702]: E1203 11:04:00.567528 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 11:04:04.56751204 +0000 UTC m=+28.403440504 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:04:00 crc kubenswrapper[4702]: E1203 11:04:00.567546 4702 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 11:04:00 crc kubenswrapper[4702]: E1203 11:04:00.567619 4702 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 11:04:00 crc kubenswrapper[4702]: E1203 11:04:00.567673 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 11:04:04.567644434 +0000 UTC m=+28.403572898 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 11:04:00 crc kubenswrapper[4702]: E1203 11:04:00.567711 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 11:04:00 crc kubenswrapper[4702]: E1203 11:04:00.567788 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 11:04:00 crc kubenswrapper[4702]: E1203 11:04:00.567800 4702 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:04:00 crc kubenswrapper[4702]: E1203 11:04:00.567706 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 11:04:04.567695095 +0000 UTC m=+28.403623559 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 11:04:00 crc kubenswrapper[4702]: E1203 11:04:00.567844 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-12-03 11:04:04.567836449 +0000 UTC m=+28.403764913 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.624944 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.641831 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vq2s\" (UniqueName: \"kubernetes.io/projected/72be0494-b56e-4d46-8300-decd11c66d66-kube-api-access-6vq2s\") pod \"multus-pqn7q\" (UID: \"72be0494-b56e-4d46-8300-decd11c66d66\") " pod="openshift-multus/multus-pqn7q" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.679459 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf6j8\" (UniqueName: \"kubernetes.io/projected/0bdf4071-59bc-4d40-80ee-20027ce42805-kube-api-access-cf6j8\") pod \"multus-additional-cni-plugins-z8lld\" (UID: \"0bdf4071-59bc-4d40-80ee-20027ce42805\") " pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.691138 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.741097 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.759644 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.784683 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.845276 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.860524 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.876032 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.878010 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z8lld" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.878327 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pqn7q" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.889967 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.912091 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.929339 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:00 crc kubenswrapper[4702]: E1203 11:04:00.930103 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.930776 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:00 crc kubenswrapper[4702]: E1203 11:04:00.930923 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.937066 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f671
3d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.968144 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.981478 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:00 crc kubenswrapper[4702]: I1203 11:04:00.996855 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:00Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.011592 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:01Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.059431 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaa
a7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:01Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.082950 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:01Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.096859 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:01Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.109904 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:01Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.121124 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:01Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.146008 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:01Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.170746 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:01Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.191687 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:01Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.207402 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:01Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.326366 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pqn7q" event={"ID":"72be0494-b56e-4d46-8300-decd11c66d66","Type":"ContainerStarted","Data":"4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e"} Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.326432 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pqn7q" event={"ID":"72be0494-b56e-4d46-8300-decd11c66d66","Type":"ContainerStarted","Data":"193d804db6a81162c0bb0fde53d06747a943dbea3c2c09a3406c8b1055fa9dc1"} Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.329076 4702 generic.go:334] "Generic (PLEG): container finished" podID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerID="3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b" exitCode=0 Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.329542 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerDied","Data":"3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b"} Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.329580 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerStarted","Data":"00be1d4251a12dcc580afeaeedac98ebbb9c6320647fe866bcb421f86823bccd"} Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.346138 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:01Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.361018 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:01Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.377360 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:01Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.388662 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" event={"ID":"0bdf4071-59bc-4d40-80ee-20027ce42805","Type":"ContainerStarted","Data":"b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039"} Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.388704 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" event={"ID":"0bdf4071-59bc-4d40-80ee-20027ce42805","Type":"ContainerStarted","Data":"7b75fe5607795a1372b86c777d7697043afd0ddcdd5f9fd2cfb07b070912b69a"} Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.404939 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:01Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.469788 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:01Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.484060 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:01Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.506691 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:01Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.563428 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy 
cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"i
mageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:01Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.576612 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:01Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.712343 
4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:01Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.803160 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:01Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.889998 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:01Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:01 crc kubenswrapper[4702]: I1203 11:04:01.960430 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:01 crc kubenswrapper[4702]: E1203 11:04:01.960650 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.058231 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.073025 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.075478 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-lcdkx"] Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.076093 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-lcdkx" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.077839 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.078622 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.078640 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.078986 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.088775 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.103895 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.121880 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.138042 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"}
,{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.182071 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.192474 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/14795f18-bfe4-4ea9-b2a7-329e83234c68-serviceca\") pod \"node-ca-lcdkx\" (UID: \"14795f18-bfe4-4ea9-b2a7-329e83234c68\") " pod="openshift-image-registry/node-ca-lcdkx" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.192519 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14795f18-bfe4-4ea9-b2a7-329e83234c68-host\") pod \"node-ca-lcdkx\" (UID: \"14795f18-bfe4-4ea9-b2a7-329e83234c68\") " pod="openshift-image-registry/node-ca-lcdkx" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.192562 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4mwm\" (UniqueName: \"kubernetes.io/projected/14795f18-bfe4-4ea9-b2a7-329e83234c68-kube-api-access-k4mwm\") pod \"node-ca-lcdkx\" (UID: \"14795f18-bfe4-4ea9-b2a7-329e83234c68\") " pod="openshift-image-registry/node-ca-lcdkx" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.203200 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.216723 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.226810 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.245716 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.261049 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.273903 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.286016 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.293478 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/14795f18-bfe4-4ea9-b2a7-329e83234c68-serviceca\") pod \"node-ca-lcdkx\" (UID: \"14795f18-bfe4-4ea9-b2a7-329e83234c68\") " pod="openshift-image-registry/node-ca-lcdkx" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.293522 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14795f18-bfe4-4ea9-b2a7-329e83234c68-host\") pod \"node-ca-lcdkx\" (UID: \"14795f18-bfe4-4ea9-b2a7-329e83234c68\") " pod="openshift-image-registry/node-ca-lcdkx" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.293557 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4mwm\" (UniqueName: \"kubernetes.io/projected/14795f18-bfe4-4ea9-b2a7-329e83234c68-kube-api-access-k4mwm\") pod \"node-ca-lcdkx\" (UID: \"14795f18-bfe4-4ea9-b2a7-329e83234c68\") " pod="openshift-image-registry/node-ca-lcdkx" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.293671 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/14795f18-bfe4-4ea9-b2a7-329e83234c68-host\") pod \"node-ca-lcdkx\" (UID: \"14795f18-bfe4-4ea9-b2a7-329e83234c68\") " pod="openshift-image-registry/node-ca-lcdkx" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.294586 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/14795f18-bfe4-4ea9-b2a7-329e83234c68-serviceca\") pod \"node-ca-lcdkx\" (UID: \"14795f18-bfe4-4ea9-b2a7-329e83234c68\") " pod="openshift-image-registry/node-ca-lcdkx" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.297432 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.310999 4702 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-k4mwm\" (UniqueName: \"kubernetes.io/projected/14795f18-bfe4-4ea9-b2a7-329e83234c68-kube-api-access-k4mwm\") pod \"node-ca-lcdkx\" (UID: \"14795f18-bfe4-4ea9-b2a7-329e83234c68\") " pod="openshift-image-registry/node-ca-lcdkx" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.316120 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/
var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.329230 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.345005 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"}
,{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.357442 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.379741 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.398623 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.399342 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerStarted","Data":"d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647"} Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.400706 4702 generic.go:334] "Generic (PLEG): container finished" podID="0bdf4071-59bc-4d40-80ee-20027ce42805" containerID="b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039" exitCode=0 Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.400732 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" event={"ID":"0bdf4071-59bc-4d40-80ee-20027ce42805","Type":"ContainerDied","Data":"b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039"} Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.429042 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49
117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"s
etup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.447183 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.459921 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.473683 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.489598 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.507110 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.528829 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z 
is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.541813 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.549642 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-lcdkx" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.569587 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\
"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.585221 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.615499 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaa
a7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.634209 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.645499 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.660808 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.674071 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.691296 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.704710 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.715588 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.735228 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z 
is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.750494 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.753422 4702 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.756307 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.756338 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.756346 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.756648 4702 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 
11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.766092 4702 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.766465 4702 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.769446 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.769477 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.769489 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.769506 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.769524 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:02Z","lastTransitionTime":"2025-12-03T11:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.773636 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: E1203 11:04:02.791783 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.794407 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.796096 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.796131 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.796143 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.796160 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.796170 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:02Z","lastTransitionTime":"2025-12-03T11:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.809519 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: E1203 11:04:02.813280 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.819538 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.819597 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.819611 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.819633 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.819646 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:02Z","lastTransitionTime":"2025-12-03T11:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.834612 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: E1203 11:04:02.837239 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.846499 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.846592 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.846604 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.846622 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.846638 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:02Z","lastTransitionTime":"2025-12-03T11:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.856290 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubel
et\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: E1203 11:04:02.867234 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.873001 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.873058 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.873073 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.873096 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.873109 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:02Z","lastTransitionTime":"2025-12-03T11:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:02 crc kubenswrapper[4702]: E1203 11:04:02.884911 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:02Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:02 crc kubenswrapper[4702]: E1203 11:04:02.885039 4702 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.886862 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.886900 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.886911 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.886929 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.886939 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:02Z","lastTransitionTime":"2025-12-03T11:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.927344 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.927410 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:02 crc kubenswrapper[4702]: E1203 11:04:02.927518 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:02 crc kubenswrapper[4702]: E1203 11:04:02.927682 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.990003 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.990072 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.990086 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.990118 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:02 crc kubenswrapper[4702]: I1203 11:04:02.990133 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:02Z","lastTransitionTime":"2025-12-03T11:04:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.101017 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.101367 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.101377 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.101392 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.101401 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:03Z","lastTransitionTime":"2025-12-03T11:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.211917 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.212001 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.212011 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.212035 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.212049 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:03Z","lastTransitionTime":"2025-12-03T11:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.331446 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.331474 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.331482 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.331496 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.331504 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:03Z","lastTransitionTime":"2025-12-03T11:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.409147 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerStarted","Data":"eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56"} Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.409198 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerStarted","Data":"6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed"} Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.409208 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerStarted","Data":"c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7"} Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.409218 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerStarted","Data":"8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06"} Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.409226 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerStarted","Data":"9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76"} Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.410741 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lcdkx" event={"ID":"14795f18-bfe4-4ea9-b2a7-329e83234c68","Type":"ContainerStarted","Data":"38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73"} Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.410820 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-lcdkx" event={"ID":"14795f18-bfe4-4ea9-b2a7-329e83234c68","Type":"ContainerStarted","Data":"3f2120e076a357e908e512ee8999e099d71d1ea09e9e268784f62d6ca5f92032"} Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.413261 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" event={"ID":"0bdf4071-59bc-4d40-80ee-20027ce42805","Type":"ContainerStarted","Data":"6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912"} Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.423483 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.433833 4702 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.433868 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.433878 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.433893 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.433903 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:03Z","lastTransitionTime":"2025-12-03T11:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.445996 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z 
is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.462199 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.475367 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.493064 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.520201 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.537540 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.537579 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.537591 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.537608 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.537621 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:03Z","lastTransitionTime":"2025-12-03T11:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.538314 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.555940 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.571034 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.590275 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.608313 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc 
kubenswrapper[4702]: I1203 11:04:03.623153 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.640151 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.640183 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.640192 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.640208 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.640218 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:03Z","lastTransitionTime":"2025-12-03T11:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.647975 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaa
a7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.662830 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.673557 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.705269 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaa
a7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.723263 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.735610 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.742380 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.742438 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.742450 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.742471 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.742487 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:03Z","lastTransitionTime":"2025-12-03T11:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.751051 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.768059 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.783279 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.798717 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.811145 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.831367 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z 
is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.844651 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.844895 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.844932 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.844944 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.844968 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.844981 4702 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:03Z","lastTransitionTime":"2025-12-03T11:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.860453 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.874121 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.887588 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.903206 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.915827 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:03Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.928098 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:03 crc kubenswrapper[4702]: E1203 11:04:03.928265 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.948054 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.948106 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.948119 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.948137 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:03 crc kubenswrapper[4702]: I1203 11:04:03.948150 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:03Z","lastTransitionTime":"2025-12-03T11:04:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.051039 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.051095 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.051105 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.051127 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.051137 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:04Z","lastTransitionTime":"2025-12-03T11:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.153394 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.153437 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.153445 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.153459 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.153468 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:04Z","lastTransitionTime":"2025-12-03T11:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.256497 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.256558 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.256571 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.256590 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.256603 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:04Z","lastTransitionTime":"2025-12-03T11:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.359071 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.359149 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.359171 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.359199 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.359221 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:04Z","lastTransitionTime":"2025-12-03T11:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.419433 4702 generic.go:334] "Generic (PLEG): container finished" podID="0bdf4071-59bc-4d40-80ee-20027ce42805" containerID="6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912" exitCode=0 Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.419528 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" event={"ID":"0bdf4071-59bc-4d40-80ee-20027ce42805","Type":"ContainerDied","Data":"6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912"} Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.421226 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca"} Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.436681 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.453238 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.462182 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.462241 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.462252 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.462284 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.462316 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:04Z","lastTransitionTime":"2025-12-03T11:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.470640 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.484664 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.502154 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.521357 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z 
is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.531829 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.544869 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.558295 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.565314 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.565349 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.565361 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.565378 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.565389 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:04Z","lastTransitionTime":"2025-12-03T11:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.573040 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.587590 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":
{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.607351 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\
\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.628509 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.643535 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.643724 4702 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.643769 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.643792 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.643812 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:04 crc kubenswrapper[4702]: E1203 11:04:04.643895 4702 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 11:04:04 crc kubenswrapper[4702]: E1203 11:04:04.643952 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 11:04:12.643935756 +0000 UTC m=+36.479864220 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 11:04:04 crc kubenswrapper[4702]: E1203 11:04:04.644026 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:04:12.644018578 +0000 UTC m=+36.479947042 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:04:04 crc kubenswrapper[4702]: E1203 11:04:04.644111 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 11:04:04 crc kubenswrapper[4702]: E1203 11:04:04.644122 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 11:04:04 crc kubenswrapper[4702]: E1203 11:04:04.644134 4702 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:04:04 crc kubenswrapper[4702]: E1203 11:04:04.644156 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 11:04:12.644150302 +0000 UTC m=+36.480078766 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:04:04 crc kubenswrapper[4702]: E1203 11:04:04.644196 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 11:04:04 crc kubenswrapper[4702]: E1203 11:04:04.644206 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 11:04:04 crc kubenswrapper[4702]: E1203 11:04:04.644212 4702 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:04:04 crc kubenswrapper[4702]: E1203 11:04:04.644230 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 11:04:12.644225034 +0000 UTC m=+36.480153498 (durationBeforeRetry 8s). 
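The nestedpendingoperations.go:348 entries above show the kubelet's standard volume-retry pattern: MountVolume.SetUp or UnmountVolume.TearDown fails because the referenced object is not yet in the kubelet's cache ("not registered") or the CSI driver has not re-registered, the operation is parked, and a retry deadline is logged (durationBeforeRetry 8s here; the backoff grows on repeated failures up to a cap). A minimal sketch, in Python, of pulling the volume name, pod UID, and retry deadline out of exactly these lines; the message format and the sample line are taken from the log above, everything else is illustrative:

    import re

    # Mirrors the nestedpendingoperations.go:348 message format quoted above.
    PAT = re.compile(
        r'Operation for "\{volumeName:(?P<volume>\S+) '
        r'podName:(?P<pod_uid>\S+) nodeName:\}" failed\. '
        r'No retries permitted until (?P<retry_at>[0-9-]+ [0-9:.]+)'
    )

    line = ('Operation for "{volumeName:kubernetes.io/configmap/'
            '5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf '
            'podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. '
            'No retries permitted until 2025-12-03 11:04:12.643935756')

    m = PAT.search(line)
    if m:
        # e.g. kubernetes.io/configmap/...-nginx-conf  5fe485a1-...  2025-12-03 11:04:12...
        print(m.group("volume"), m.group("pod_uid"), m.group("retry_at"))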
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:04:04 crc kubenswrapper[4702]: E1203 11:04:04.644274 4702 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 11:04:04 crc kubenswrapper[4702]: E1203 11:04:04.644294 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 11:04:12.644288836 +0000 UTC m=+36.480217300 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.645007 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.654840 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.667033 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.669672 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.669733 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.669743 4702 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.669788 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.669800 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:04Z","lastTransitionTime":"2025-12-03T11:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.679284 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.697510 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.716012 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/r
un/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.732052 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.751679 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.770864 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.772680 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.772829 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.772924 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.773023 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.773119 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:04Z","lastTransitionTime":"2025-12-03T11:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.821903 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-0
3T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54
f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.854176 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.875914 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.875944 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.875953 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.875968 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.875977 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:04Z","lastTransitionTime":"2025-12-03T11:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.894692 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.927474 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:04 crc kubenswrapper[4702]: E1203 11:04:04.927690 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.927506 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:04 crc kubenswrapper[4702]: E1203 11:04:04.927859 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.935701 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.974808 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:04Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.978876 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.978936 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.978948 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.978967 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:04 crc kubenswrapper[4702]: I1203 11:04:04.978996 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:04Z","lastTransitionTime":"2025-12-03T11:04:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.017154 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc
9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:05Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.049438 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:05Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.082398 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.082429 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.082440 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.082457 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.082469 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:05Z","lastTransitionTime":"2025-12-03T11:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.094163 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:05Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.184392 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.184447 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.184461 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.184481 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.184494 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:05Z","lastTransitionTime":"2025-12-03T11:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.287363 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.287427 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.287441 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.287463 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.287486 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:05Z","lastTransitionTime":"2025-12-03T11:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.390236 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.390294 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.390323 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.390345 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.390361 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:05Z","lastTransitionTime":"2025-12-03T11:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.428631 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerStarted","Data":"f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e"} Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.430294 4702 generic.go:334] "Generic (PLEG): container finished" podID="0bdf4071-59bc-4d40-80ee-20027ce42805" containerID="6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437" exitCode=0 Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.430333 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" event={"ID":"0bdf4071-59bc-4d40-80ee-20027ce42805","Type":"ContainerDied","Data":"6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437"} Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.449819 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:05Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.582764 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.582803 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.582811 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.582838 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.582849 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:05Z","lastTransitionTime":"2025-12-03T11:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.584130 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:05Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.632340 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"n
ame\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:05Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.660077 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:05Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.675165 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:05Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.685880 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.685916 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.685927 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.685944 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.685954 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:05Z","lastTransitionTime":"2025-12-03T11:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.690091 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:05Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.705405 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:05Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.723054 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:05Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.739255 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:05Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.753993 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:05Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.768115 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:05Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.785951 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:05Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.788344 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.788395 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.788410 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.788438 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.788452 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:05Z","lastTransitionTime":"2025-12-03T11:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.807437 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb034
18c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:05Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.822585 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:05Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.835485 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:05Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.891131 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.891179 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.891192 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.891214 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.891229 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:05Z","lastTransitionTime":"2025-12-03T11:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.927737 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:05 crc kubenswrapper[4702]: E1203 11:04:05.927916 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.994486 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.994540 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.994551 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.994569 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:05 crc kubenswrapper[4702]: I1203 11:04:05.994582 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:05Z","lastTransitionTime":"2025-12-03T11:04:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.097659 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.097750 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.097780 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.097807 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.097820 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:06Z","lastTransitionTime":"2025-12-03T11:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.200521 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.200572 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.200583 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.200600 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.200610 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:06Z","lastTransitionTime":"2025-12-03T11:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.304110 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.304171 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.304185 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.304209 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.304224 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:06Z","lastTransitionTime":"2025-12-03T11:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.407327 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.407387 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.407403 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.407461 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.407504 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:06Z","lastTransitionTime":"2025-12-03T11:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.437494 4702 generic.go:334] "Generic (PLEG): container finished" podID="0bdf4071-59bc-4d40-80ee-20027ce42805" containerID="f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838" exitCode=0 Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.437564 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" event={"ID":"0bdf4071-59bc-4d40-80ee-20027ce42805","Type":"ContainerDied","Data":"f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838"} Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.461731 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered 
and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:06Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.475161 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:06Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.500595 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaa
a7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:06Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.509478 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.509528 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.509540 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.509558 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.509571 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:06Z","lastTransitionTime":"2025-12-03T11:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.517976 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:06Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.530937 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:06Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.543180 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:06Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.554949 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:06Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.585444 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:06Z 
is after 2025-08-24T17:21:41Z" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.595900 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:06Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.606308 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:06Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.612361 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.612432 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.612445 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.612466 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.612481 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:06Z","lastTransitionTime":"2025-12-03T11:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.622374 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:06Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.639427 4702 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Dec 03 11:04:06 crc kubenswrapper[4702]: 
I1203 11:04:06.898210 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.898258 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.898268 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.898284 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.898295 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:06Z","lastTransitionTime":"2025-12-03T11:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.905270 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:06Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.919016 4702 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:06Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.927233 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.927287 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:06 crc kubenswrapper[4702]: E1203 11:04:06.927401 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:06 crc kubenswrapper[4702]: E1203 11:04:06.927562 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.937622 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:06Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.952275 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8
s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:06Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.966093 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:06Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.984886 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:06Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:06 crc kubenswrapper[4702]: I1203 11:04:06.999150 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:06Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.001010 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.001119 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.001137 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.001858 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.002154 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:07Z","lastTransitionTime":"2025-12-03T11:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.022218 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-0
3T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54
f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.037596 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.050520 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.064560 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.076424 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.099274 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z 
is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.104585 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.104635 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.104647 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.104665 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.104677 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:07Z","lastTransitionTime":"2025-12-03T11:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.108829 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-
03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.119987 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.132272 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.144823 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.158767 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.176179 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.207291 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.207331 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.207340 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.207355 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.207367 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:07Z","lastTransitionTime":"2025-12-03T11:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.310099 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.310147 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.310157 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.310172 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.310181 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:07Z","lastTransitionTime":"2025-12-03T11:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.413882 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.413931 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.413942 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.413959 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.413972 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:07Z","lastTransitionTime":"2025-12-03T11:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.444620 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" event={"ID":"0bdf4071-59bc-4d40-80ee-20027ce42805","Type":"ContainerStarted","Data":"84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91"} Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.462120 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mo
untPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.486333 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.502180 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.515336 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.517161 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.517206 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.517233 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.517250 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.517261 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:07Z","lastTransitionTime":"2025-12-03T11:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.533447 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.546409 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.560590 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.583401 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z 
is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.595004 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.610683 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.619949 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.619982 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.619991 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.620007 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.620016 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:07Z","lastTransitionTime":"2025-12-03T11:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.626882 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.644282 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.661184 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.674665 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77
3257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.687951 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:07Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:07 crc 
kubenswrapper[4702]: I1203 11:04:07.722530 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.722588 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.722604 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.722626 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.722640 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:07Z","lastTransitionTime":"2025-12-03T11:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.825945 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.825996 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.826012 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.826034 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.826055 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:07Z","lastTransitionTime":"2025-12-03T11:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.927208 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:07 crc kubenswrapper[4702]: E1203 11:04:07.927366 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.929093 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.929135 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.929159 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.929186 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:07 crc kubenswrapper[4702]: I1203 11:04:07.929199 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:07Z","lastTransitionTime":"2025-12-03T11:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.032348 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.032385 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.032395 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.032410 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.032420 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:08Z","lastTransitionTime":"2025-12-03T11:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.134602 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.134642 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.134654 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.134689 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.134703 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:08Z","lastTransitionTime":"2025-12-03T11:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.237731 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.237794 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.237804 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.237821 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.237839 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:08Z","lastTransitionTime":"2025-12-03T11:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.341007 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.341046 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.341057 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.341073 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.341086 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:08Z","lastTransitionTime":"2025-12-03T11:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.444231 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.444282 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.444296 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.444327 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.444350 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:08Z","lastTransitionTime":"2025-12-03T11:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.454237 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerStarted","Data":"4fcca5071a7bf8b0725a1624f1754469f242a08e8e915d7dafb403e6b3955e70"} Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.454973 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.476562 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.488772 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.508706 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaa
a7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.525643 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.543338 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.548506 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.548583 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.548595 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.548616 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.548631 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:08Z","lastTransitionTime":"2025-12-03T11:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.560334 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.562460 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.577966 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.595241 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.621545 4702 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fcca5071a7bf8b0725a1624f1754469f242a08e8e915d7dafb403e6b3955e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuberne
tes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.633985 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.647898 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.652192 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.652245 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.652255 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.652272 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.652282 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:08Z","lastTransitionTime":"2025-12-03T11:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.662464 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.680992 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.697700 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.712393 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77
3257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.730167 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.747483 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.764115 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.779489 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.792116 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.804930 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.804977 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.804985 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.805001 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.805011 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:08Z","lastTransitionTime":"2025-12-03T11:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.811164 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fcca5071a7bf8b0725a1624f1754469f242a08e8e915d7dafb403e6b3955e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkub
e-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.822470 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.833727 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.845969 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.856325 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.867606 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.879841 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"na
me\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.897117 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaa
a7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.907837 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.907920 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.907936 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.907960 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.907973 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:08Z","lastTransitionTime":"2025-12-03T11:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.911446 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.921577 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:08Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.927867 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:08 crc kubenswrapper[4702]: I1203 11:04:08.927962 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:08 crc kubenswrapper[4702]: E1203 11:04:08.927976 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:08 crc kubenswrapper[4702]: E1203 11:04:08.928178 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.010933 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.010977 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.010985 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.011001 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.011011 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:09Z","lastTransitionTime":"2025-12-03T11:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.113145 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.113192 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.113201 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.113215 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.113225 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:09Z","lastTransitionTime":"2025-12-03T11:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.215523 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.215604 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.215626 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.215659 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.215683 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:09Z","lastTransitionTime":"2025-12-03T11:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.382337 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.382399 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.382413 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.382436 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.382449 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:09Z","lastTransitionTime":"2025-12-03T11:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.463177 4702 generic.go:334] "Generic (PLEG): container finished" podID="0bdf4071-59bc-4d40-80ee-20027ce42805" containerID="84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91" exitCode=0 Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.463417 4702 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.463882 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" event={"ID":"0bdf4071-59bc-4d40-80ee-20027ce42805","Type":"ContainerDied","Data":"84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91"} Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.464600 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.482582 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\
"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.487419 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.487442 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.487450 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.487464 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.487473 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:09Z","lastTransitionTime":"2025-12-03T11:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.504108 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.506597 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.524467 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.537526 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.548997 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.563636 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.575879 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.590937 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.591223 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.591236 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.591253 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.591266 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:09Z","lastTransitionTime":"2025-12-03T11:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.595654 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fcca5071a7bf8b0725a1624f1754469f242a08e8e915d7dafb403e6b3955e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkub
e-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.610887 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.622941 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.638652 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.651872 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.668821 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.683178 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.694658 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.694712 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.694722 4702 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.694951 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.694966 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:09Z","lastTransitionTime":"2025-12-03T11:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.698298 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.712220 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.735097 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.750653 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.760725 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.771901 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.789465 4702 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fcca5071a7bf8b0725a1624f1754469f242a08e8e915d7dafb403e6b3955e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.798072 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.798291 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.798314 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.798323 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.798336 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.798345 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:09Z","lastTransitionTime":"2025-12-03T11:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.808322 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.820166 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.831671 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.843424 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.855006 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.864785 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.875787 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.896630 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84c
c8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:09Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.900366 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.900424 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.900439 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.900482 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.900500 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:09Z","lastTransitionTime":"2025-12-03T11:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:09 crc kubenswrapper[4702]: I1203 11:04:09.927792 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:09 crc kubenswrapper[4702]: E1203 11:04:09.927952 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.003819 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.003896 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.003919 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.003945 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.003967 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:10Z","lastTransitionTime":"2025-12-03T11:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.107746 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.107863 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.107897 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.107939 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.107962 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:10Z","lastTransitionTime":"2025-12-03T11:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.211252 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.211307 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.211322 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.211341 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.211354 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:10Z","lastTransitionTime":"2025-12-03T11:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.314890 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.314963 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.314986 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.315015 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.315035 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:10Z","lastTransitionTime":"2025-12-03T11:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.418172 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.418222 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.418236 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.418260 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.418273 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:10Z","lastTransitionTime":"2025-12-03T11:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.472004 4702 generic.go:334] "Generic (PLEG): container finished" podID="0bdf4071-59bc-4d40-80ee-20027ce42805" containerID="c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201" exitCode=0 Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.472225 4702 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.472683 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" event={"ID":"0bdf4071-59bc-4d40-80ee-20027ce42805","Type":"ContainerDied","Data":"c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201"} Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.506492 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/l
og/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:10Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.522084 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.522141 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.522152 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.522173 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.522187 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:10Z","lastTransitionTime":"2025-12-03T11:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.526833 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-api
server-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:10Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.544368 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:10Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.557690 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:10Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.576403 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:10Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.591989 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:10Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.607177 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:10Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.620108 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:10Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.624653 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.624678 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.624690 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.624707 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.624719 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:10Z","lastTransitionTime":"2025-12-03T11:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.647805 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fcca5071a7bf8b0725a1624f1754469f242a08e8e915d7dafb403e6b3955e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:10Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.659790 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:10Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.672501 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:10Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.683111 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:10Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.700112 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:10Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.727713 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.727779 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.727794 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.727812 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.727825 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:10Z","lastTransitionTime":"2025-12-03T11:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.733820 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff10537
6ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\
\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:10Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.766732 4702 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:10Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.830836 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.830873 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.830883 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.830896 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.830907 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:10Z","lastTransitionTime":"2025-12-03T11:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.927151 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.927240 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:10 crc kubenswrapper[4702]: E1203 11:04:10.927295 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:10 crc kubenswrapper[4702]: E1203 11:04:10.927387 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.933071 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.933097 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.933108 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.933122 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:10 crc kubenswrapper[4702]: I1203 11:04:10.933131 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:10Z","lastTransitionTime":"2025-12-03T11:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.035751 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.035808 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.035818 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.035833 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.035843 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:11Z","lastTransitionTime":"2025-12-03T11:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.138795 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.138843 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.138857 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.138874 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.138885 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:11Z","lastTransitionTime":"2025-12-03T11:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.241484 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.241520 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.241532 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.241547 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.241557 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:11Z","lastTransitionTime":"2025-12-03T11:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.344804 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.344869 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.344884 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.344910 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.344927 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:11Z","lastTransitionTime":"2025-12-03T11:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.448854 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.448895 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.448905 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.448922 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.448931 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:11Z","lastTransitionTime":"2025-12-03T11:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.477468 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mt92m_ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3/ovnkube-controller/0.log" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.481908 4702 generic.go:334] "Generic (PLEG): container finished" podID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerID="4fcca5071a7bf8b0725a1624f1754469f242a08e8e915d7dafb403e6b3955e70" exitCode=1 Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.482013 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerDied","Data":"4fcca5071a7bf8b0725a1624f1754469f242a08e8e915d7dafb403e6b3955e70"} Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.483016 4702 scope.go:117] "RemoveContainer" containerID="4fcca5071a7bf8b0725a1624f1754469f242a08e8e915d7dafb403e6b3955e70" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.486612 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" event={"ID":"0bdf4071-59bc-4d40-80ee-20027ce42805","Type":"ContainerStarted","Data":"6e6f7f89b6a8a311c3d34715c94ecdba4569f9226550362c187b46191514789d"} Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.502039 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.529674 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.548385 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.553982 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.554222 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.554337 4702 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.554453 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.554564 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:11Z","lastTransitionTime":"2025-12-03T11:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.559664 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.576345 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.597735 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.614182 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.625317 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.636948 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.650002 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z"
Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.657032 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.657071 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.657086 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.657114 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.657128 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:11Z","lastTransitionTime":"2025-12-03T11:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.661258 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.685082 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fcca5071a7bf8b0725a1624f1754469f242a08e8e915d7dafb403e6b3955e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fcca5071a7bf8b0725a1624f1754469f242a08e8e915d7dafb403e6b3955e70\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"message\\\":\\\"nfig.go:1031] Cluster endpoints for openshift-apiserver-operator/metrics for network=default are: map[]\\\\nI1203 11:04:11.371597 5954 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1203 11:04:11.371716 5954 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-b7lmv after 0 failed attempt(s)\\\\nI1203 11:04:11.371940 5954 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1203 11:04:11.371524 5954 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1203 11:04:11.372265 5954 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": 
fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.696060 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.711278 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117
eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.727258 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.743072 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.758726 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.759465 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.759507 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.759517 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.759532 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.759542 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:11Z","lastTransitionTime":"2025-12-03T11:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.770159 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.786437 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fcca5071a7bf8b0725a1624f1754469f242a08e8e915d7dafb403e6b3955e70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fcca5071a7bf8b0725a1624f1754469f242a08e8e915d7dafb403e6b3955e70\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"message\\\":\\\"nfig.go:1031] Cluster endpoints for openshift-apiserver-operator/metrics for network=default are: map[]\\\\nI1203 11:04:11.371597 5954 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1203 11:04:11.371716 5954 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-b7lmv after 0 failed attempt(s)\\\\nI1203 11:04:11.371940 5954 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1203 11:04:11.371524 5954 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1203 11:04:11.372265 5954 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": 
fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.798316 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.811413 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117
eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.827046 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.838737 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.852405 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6f7f89b6a8a311c3d34715c94ecdba4569f9226550362c187b46191514789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f
8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:09Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.862460 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.862509 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.862522 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.862543 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.862562 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:11Z","lastTransitionTime":"2025-12-03T11:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.870821 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.883078 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.916428 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.927611 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:11 crc kubenswrapper[4702]: E1203 11:04:11.927821 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.946838 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-0
3T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.961793 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.965677 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.965719 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.965732 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.965770 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.965786 4702 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:11Z","lastTransitionTime":"2025-12-03T11:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:11 crc kubenswrapper[4702]: I1203 11:04:11.971145 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:11Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.067912 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.067963 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.067974 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.067990 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.068002 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:12Z","lastTransitionTime":"2025-12-03T11:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.170251 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.170296 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.170306 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.170322 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.170332 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:12Z","lastTransitionTime":"2025-12-03T11:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.272384 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.272425 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.272435 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.272451 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.272462 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:12Z","lastTransitionTime":"2025-12-03T11:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.374730 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.374794 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.374806 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.374821 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.374831 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:12Z","lastTransitionTime":"2025-12-03T11:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.478007 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.478076 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.478090 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.478111 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.478125 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:12Z","lastTransitionTime":"2025-12-03T11:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.491994 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mt92m_ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3/ovnkube-controller/0.log"
Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.495079 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerStarted","Data":"bf05cff364db3f92ced8b2af261169a03fe6f757ba09767eb175ca7a43faebed"}
Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.495364 4702 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.509266 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.523592 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.537745 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.555328 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6f7f89b6a8a311c3d34715c94ecdba4569f9226550362c187b46191514789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.568729 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.580452 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.580498 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.580510 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.580527 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.580538 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:12Z","lastTransitionTime":"2025-12-03T11:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.586441 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb034
18c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.599488 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.607345 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.627022 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf05cff364db3f92ced8b2af261169a03fe6f757ba09767eb175ca7a43faebed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fcca5071a7bf8b0725a1624f1754469f242a08e8e915d7dafb403e6b3955e70\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"message\\\":\\\"nfig.go:1031] Cluster endpoints for openshift-apiserver-operator/metrics for network=default are: map[]\\\\nI1203 11:04:11.371597 5954 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1203 11:04:11.371716 5954 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-b7lmv after 0 failed attempt(s)\\\\nI1203 11:04:11.371940 5954 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1203 11:04:11.371524 5954 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1203 11:04:11.372265 5954 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": 
fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"c
ontainerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.638325 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.653718 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.666672 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.679722 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.683067 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.683118 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.683128 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.683145 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.683157 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:12Z","lastTransitionTime":"2025-12-03T11:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.695575 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.708046 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.711674 4702 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:04:12 crc kubenswrapper[4702]: E1203 11:04:12.711841 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:04:28.711814288 +0000 UTC m=+52.547742752 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.712082 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.712149 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.712184 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.712214 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:12 crc kubenswrapper[4702]: E1203 11:04:12.712326 4702 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 11:04:12 crc kubenswrapper[4702]: E1203 11:04:12.712333 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 11:04:12 crc kubenswrapper[4702]: E1203 11:04:12.712362 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 11:04:12 crc kubenswrapper[4702]: E1203 11:04:12.712374 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 11:04:28.712363704 +0000 UTC m=+52.548292178 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 11:04:12 crc kubenswrapper[4702]: E1203 11:04:12.712382 4702 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:04:12 crc kubenswrapper[4702]: E1203 11:04:12.712404 4702 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 11:04:12 crc kubenswrapper[4702]: E1203 11:04:12.712433 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 11:04:28.712416675 +0000 UTC m=+52.548345159 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:04:12 crc kubenswrapper[4702]: E1203 11:04:12.712471 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 11:04:28.712451876 +0000 UTC m=+52.548380360 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 11:04:12 crc kubenswrapper[4702]: E1203 11:04:12.712547 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 11:04:12 crc kubenswrapper[4702]: E1203 11:04:12.712569 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 11:04:12 crc kubenswrapper[4702]: E1203 11:04:12.712581 4702 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:04:12 crc kubenswrapper[4702]: E1203 11:04:12.712626 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 11:04:28.712612911 +0000 UTC m=+52.548541395 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.712744 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz"] Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.713513 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.715675 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.716891 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.727156 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name
\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.745269 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.761948 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.771279 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.779432 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84f4e3f0-5001-4730-a1d4-64407794e5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k9mzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.785462 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.785497 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.785507 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.785523 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.785535 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:12Z","lastTransitionTime":"2025-12-03T11:04:12Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.790851 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T1
1:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.802942 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.812978 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.813110 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84f4e3f0-5001-4730-a1d4-64407794e5a8-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-k9mzz\" (UID: \"84f4e3f0-5001-4730-a1d4-64407794e5a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.813156 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/84f4e3f0-5001-4730-a1d4-64407794e5a8-env-overrides\") pod \"ovnkube-control-plane-749d76644c-k9mzz\" (UID: \"84f4e3f0-5001-4730-a1d4-64407794e5a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.813180 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/84f4e3f0-5001-4730-a1d4-64407794e5a8-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-k9mzz\" (UID: \"84f4e3f0-5001-4730-a1d4-64407794e5a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.813194 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qchb9\" (UniqueName: \"kubernetes.io/projected/84f4e3f0-5001-4730-a1d4-64407794e5a8-kube-api-access-qchb9\") pod \"ovnkube-control-plane-749d76644c-k9mzz\" (UID: \"84f4e3f0-5001-4730-a1d4-64407794e5a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.825361 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.838263 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.863155 4702 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf05cff364db3f92ced8b2af261169a03fe6f757ba09767eb175ca7a43faebed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fcca5071a7bf8b0725a1624f1754469f242a08e8e915d7dafb403e6b3955e70\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"message\\\":\\\"nfig.go:1031] Cluster endpoints for openshift-apiserver-operator/metrics for network=default are: map[]\\\\nI1203 11:04:11.371597 5954 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1203 11:04:11.371716 5954 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-b7lmv after 0 failed attempt(s)\\\\nI1203 11:04:11.371940 5954 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1203 11:04:11.371524 5954 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1203 11:04:11.372265 5954 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": 
fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"c
ontainerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.876411 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.887551 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.887607 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.887622 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.887643 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.887657 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:12Z","lastTransitionTime":"2025-12-03T11:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.890996 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.909269 4702 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.914261 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84f4e3f0-5001-4730-a1d4-64407794e5a8-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-k9mzz\" (UID: \"84f4e3f0-5001-4730-a1d4-64407794e5a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.914317 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/84f4e3f0-5001-4730-a1d4-64407794e5a8-env-overrides\") pod \"ovnkube-control-plane-749d76644c-k9mzz\" (UID: \"84f4e3f0-5001-4730-a1d4-64407794e5a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.914346 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qchb9\" (UniqueName: \"kubernetes.io/projected/84f4e3f0-5001-4730-a1d4-64407794e5a8-kube-api-access-qchb9\") pod \"ovnkube-control-plane-749d76644c-k9mzz\" (UID: \"84f4e3f0-5001-4730-a1d4-64407794e5a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" Dec 03 
11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.914364 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/84f4e3f0-5001-4730-a1d4-64407794e5a8-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-k9mzz\" (UID: \"84f4e3f0-5001-4730-a1d4-64407794e5a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.915143 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/84f4e3f0-5001-4730-a1d4-64407794e5a8-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-k9mzz\" (UID: \"84f4e3f0-5001-4730-a1d4-64407794e5a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.915340 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/84f4e3f0-5001-4730-a1d4-64407794e5a8-env-overrides\") pod \"ovnkube-control-plane-749d76644c-k9mzz\" (UID: \"84f4e3f0-5001-4730-a1d4-64407794e5a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.921156 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/84f4e3f0-5001-4730-a1d4-64407794e5a8-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-k9mzz\" (UID: \"84f4e3f0-5001-4730-a1d4-64407794e5a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.928083 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.928118 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:12 crc kubenswrapper[4702]: E1203 11:04:12.928253 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:12 crc kubenswrapper[4702]: E1203 11:04:12.928425 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.934989 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.958241 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qchb9\" (UniqueName: \"kubernetes.io/projected/84f4e3f0-5001-4730-a1d4-64407794e5a8-kube-api-access-qchb9\") pod \"ovnkube-control-plane-749d76644c-k9mzz\" (UID: \"84f4e3f0-5001-4730-a1d4-64407794e5a8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.990477 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.990525 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.990533 4702 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.990548 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.990558 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:12Z","lastTransitionTime":"2025-12-03T11:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:12 crc kubenswrapper[4702]: I1203 11:04:12.994643 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6f7f89b6a8a311c3d34715c94ecdba4569f9226550362c187b46191514789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\
"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2ee
ade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.027028 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.094092 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.094134 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.094146 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.094165 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.094175 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:13Z","lastTransitionTime":"2025-12-03T11:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.162150 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.162193 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.162204 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.162219 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.162230 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:13Z","lastTransitionTime":"2025-12-03T11:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:13 crc kubenswrapper[4702]: E1203 11:04:13.175400 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:13Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.178506 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.178540 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.178548 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.178563 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.178575 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:13Z","lastTransitionTime":"2025-12-03T11:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:13 crc kubenswrapper[4702]: E1203 11:04:13.191786 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:13Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.195406 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.195445 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
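
The status-patch attempts above (11:04:13.175400 and 11:04:13.191786), like the earlier pod-status patch for "openshift-multus"/"multus-additional-cni-plugins-z8lld", all fail at the same point: the network-node-identity webhook on https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z while the node clock reads 2025-12-03. The payload content is irrelevant; the TLS handshake to the webhook fails before the admission request is ever evaluated. A minimal sketch for confirming the certificate window from the node, assuming the webhook endpoint taken from the log is reachable and that the third-party cryptography package (version 42 or newer, for the *_utc accessors) is installed:

import ssl
from datetime import datetime, timezone
from cryptography import x509  # assumption: cryptography >= 42 available

# Endpoint copied from the log lines above.
HOST, PORT = "127.0.0.1", 9743

# get_server_certificate() does not verify the peer, so it still returns
# the PEM even though the certificate is expired.
pem = ssl.get_server_certificate((HOST, PORT))
cert = x509.load_pem_x509_certificate(pem.encode())

now = datetime.now(timezone.utc)
print("subject:   ", cert.subject.rfc4514_string())
print("not before:", cert.not_valid_before_utc)
print("not after: ", cert.not_valid_after_utc)
print("expired:   ", now > cert.not_valid_after_utc)

Against this node it should report "not after 2025-08-24 17:21:41+00:00" and "expired True", matching the x509 error repeated verbatim in every failed patch.
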
event="NodeHasNoDiskPressure" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.195457 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.195475 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.195490 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:13Z","lastTransitionTime":"2025-12-03T11:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:13 crc kubenswrapper[4702]: E1203 11:04:13.207432 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:13Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.210909 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.210954 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
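
Interleaved with the failed patches, the kubelet keeps re-recording the same conditions: memory, disk, and PID are fine, but Ready stays False with reason KubeletNotReady because no CNI configuration file exists in /etc/kubernetes/cni/net.d/ yet. The network plugin here is OVN-Kubernetes (the ovnkube-control-plane pod above still has no sandbox), and it is expected to write that file only once its own pods are running, which cannot happen while admission is blocked by the expired webhook certificate. The readiness test the message describes reduces to "does a config file exist in that directory"; a minimal sketch, where the accepted suffix set is an assumption mirroring common CNI config loaders, not something stated in this log:

from pathlib import Path

# Directory named in the NetworkPluginNotReady message above.
CNI_DIR = Path("/etc/kubernetes/cni/net.d")
# Assumed suffixes; mirrors common CNI loaders, not taken from this log.
SUFFIXES = {".conf", ".conflist", ".json"}

configs = sorted(p for p in CNI_DIR.glob("*") if p.suffix in SUFFIXES) if CNI_DIR.is_dir() else []
if configs:
    print("CNI config present:", ", ".join(p.name for p in configs))
else:
    print(f"no CNI configuration file in {CNI_DIR}/ - network plugin not ready")
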
event="NodeHasNoDiskPressure" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.210964 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.210981 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.210990 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:13Z","lastTransitionTime":"2025-12-03T11:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:13 crc kubenswrapper[4702]: E1203 11:04:13.222363 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:13Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.226069 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.226107 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.226122 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.226141 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.226154 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:13Z","lastTransitionTime":"2025-12-03T11:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:13 crc kubenswrapper[4702]: E1203 11:04:13.236859 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:13Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:13 crc kubenswrapper[4702]: E1203 11:04:13.237020 4702 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.238530 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.238574 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.238585 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.238603 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.238616 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:13Z","lastTransitionTime":"2025-12-03T11:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.341643 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.341706 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.341723 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.341745 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.341791 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:13Z","lastTransitionTime":"2025-12-03T11:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.444202 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.444255 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.444272 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.444295 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.444314 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:13Z","lastTransitionTime":"2025-12-03T11:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.500061 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" event={"ID":"84f4e3f0-5001-4730-a1d4-64407794e5a8","Type":"ContainerStarted","Data":"44bf792815b6dcdd567211c6828a326219ddce478842fccb4230c667d273af4e"} Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.500261 4702 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.548146 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.548195 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.548204 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.548218 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.548228 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:13Z","lastTransitionTime":"2025-12-03T11:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.650345 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.650384 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.650396 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.650413 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.650425 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:13Z","lastTransitionTime":"2025-12-03T11:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.753376 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.753452 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.753477 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.753509 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.753533 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:13Z","lastTransitionTime":"2025-12-03T11:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.856438 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.856508 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.856522 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.856544 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.856560 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:13Z","lastTransitionTime":"2025-12-03T11:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.927962 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:13 crc kubenswrapper[4702]: E1203 11:04:13.928124 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.958818 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.958864 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.958873 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.958887 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:13 crc kubenswrapper[4702]: I1203 11:04:13.958897 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:13Z","lastTransitionTime":"2025-12-03T11:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.062192 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.062274 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.062309 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.062336 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.062354 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:14Z","lastTransitionTime":"2025-12-03T11:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.166005 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.166070 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.166088 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.166111 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.166129 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:14Z","lastTransitionTime":"2025-12-03T11:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.168456 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-6jzjr"] Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.169014 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:14 crc kubenswrapper[4702]: E1203 11:04:14.169105 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.201364 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd
9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.223930 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.231031 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs\") pod \"network-metrics-daemon-6jzjr\" (UID: \"11bb1bad-4b90-4366-9187-8d27480f670b\") " pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.231090 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrfsd\" (UniqueName: \"kubernetes.io/projected/11bb1bad-4b90-4366-9187-8d27480f670b-kube-api-access-qrfsd\") pod \"network-metrics-daemon-6jzjr\" (UID: \"11bb1bad-4b90-4366-9187-8d27480f670b\") " pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:14 
crc kubenswrapper[4702]: I1203 11:04:14.236005 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.250631 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84f4e3f0-5001-4730-a1d4-64407794e5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k9mzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.265527 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.268277 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.268334 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.268348 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.268366 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.268379 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:14Z","lastTransitionTime":"2025-12-03T11:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.282892 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.299104 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.318226 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.331945 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.332231 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs\") pod \"network-metrics-daemon-6jzjr\" (UID: \"11bb1bad-4b90-4366-9187-8d27480f670b\") " pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.332285 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrfsd\" (UniqueName: \"kubernetes.io/projected/11bb1bad-4b90-4366-9187-8d27480f670b-kube-api-access-qrfsd\") pod \"network-metrics-daemon-6jzjr\" (UID: \"11bb1bad-4b90-4366-9187-8d27480f670b\") " pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:14 crc kubenswrapper[4702]: E1203 11:04:14.332431 4702 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 11:04:14 crc kubenswrapper[4702]: E1203 11:04:14.332512 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs podName:11bb1bad-4b90-4366-9187-8d27480f670b nodeName:}" failed. No retries permitted until 2025-12-03 11:04:14.832489746 +0000 UTC m=+38.668418280 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs") pod "network-metrics-daemon-6jzjr" (UID: "11bb1bad-4b90-4366-9187-8d27480f670b") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.350696 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrfsd\" (UniqueName: \"kubernetes.io/projected/11bb1bad-4b90-4366-9187-8d27480f670b-kube-api-access-qrfsd\") pod \"network-metrics-daemon-6jzjr\" (UID: \"11bb1bad-4b90-4366-9187-8d27480f670b\") " pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.354012 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf05cff364db3f92ced8b2af261169a03fe6f757
ba09767eb175ca7a43faebed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fcca5071a7bf8b0725a1624f1754469f242a08e8e915d7dafb403e6b3955e70\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"message\\\":\\\"nfig.go:1031] Cluster endpoints for openshift-apiserver-operator/metrics for network=default are: map[]\\\\nI1203 11:04:11.371597 5954 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1203 11:04:11.371716 5954 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-b7lmv after 0 failed attempt(s)\\\\nI1203 11:04:11.371940 5954 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1203 11:04:11.371524 5954 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1203 11:04:11.372265 5954 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": 
fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"c
ontainerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.368373 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.371074 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.371141 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.371159 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.371181 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.371197 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:14Z","lastTransitionTime":"2025-12-03T11:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.385869 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.400439 4702 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.415814 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.430859 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6f7f89b6a8a311c3d34715c94ecdba4569f9226550362c187b46191514789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.444306 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.455704 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6jzjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11bb1bad-4b90-4366-9187-8d27480f670b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6jzjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc 
kubenswrapper[4702]: I1203 11:04:14.474709 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.474825 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.474846 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.474898 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.474916 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:14Z","lastTransitionTime":"2025-12-03T11:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.505545 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" event={"ID":"84f4e3f0-5001-4730-a1d4-64407794e5a8","Type":"ContainerStarted","Data":"39497a6176907f58fdff906f1529f469a849df1145dc5bb443d92532ff63fc4d"} Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.507885 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mt92m_ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3/ovnkube-controller/1.log" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.513620 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mt92m_ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3/ovnkube-controller/0.log" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.517167 4702 generic.go:334] "Generic (PLEG): container finished" podID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerID="bf05cff364db3f92ced8b2af261169a03fe6f757ba09767eb175ca7a43faebed" exitCode=1 Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.517218 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerDied","Data":"bf05cff364db3f92ced8b2af261169a03fe6f757ba09767eb175ca7a43faebed"} Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.517284 4702 scope.go:117] "RemoveContainer" containerID="4fcca5071a7bf8b0725a1624f1754469f242a08e8e915d7dafb403e6b3955e70" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.518527 4702 scope.go:117] "RemoveContainer" containerID="bf05cff364db3f92ced8b2af261169a03fe6f757ba09767eb175ca7a43faebed" Dec 03 11:04:14 crc kubenswrapper[4702]: E1203 11:04:14.518964 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-mt92m_openshift-ovn-kubernetes(ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.531560 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.543676 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6jzjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11bb1bad-4b90-4366-9187-8d27480f670b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6jzjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc 
kubenswrapper[4702]: I1203 11:04:14.564347 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.577631 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.577998 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.578035 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.578049 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.578068 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.578082 4702 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:14Z","lastTransitionTime":"2025-12-03T11:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.587699 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.599017 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84f4e3f0-5001-4730-a1d4-64407794e5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k9mzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.616716 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.636575 4702 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32
fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf05cff364db3f92ced8b2af261169a03fe6f757ba09767eb175ca7a43faebed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fcca5071a7bf8b0725a1624f1754469f242a08e8e915d7dafb403e6b3955e70\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"message\\\":\\\"nfig.go:1031] Cluster endpoints for openshift-apiserver-operator/metrics for network=default are: map[]\\\\nI1203 11:04:11.371597 5954 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1203 11:04:11.371716 5954 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-b7lmv after 0 failed attempt(s)\\\\nI1203 11:04:11.371940 5954 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1203 11:04:11.371524 5954 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1203 11:04:11.372265 5954 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf05cff364db3f92ced8b2af261169a03fe6f757ba09767eb175ca7a43faebed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"eck-source-55646444c4-trplf for pod on 
switch crc\\\\nI1203 11:04:12.284717 6127 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1203 11:04:12.284744 6127 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1203 11:04:12.284563 6127 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nF1203 11:04:12.284790 6127 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df848287
63e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.646677 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.659144 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.670859 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.681108 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.681145 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.681154 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.681170 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.681180 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:14Z","lastTransitionTime":"2025-12-03T11:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.684134 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.696208 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.709826 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.723504 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.737211 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.750837 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6f7f89b6a8a311c3d34715c94ecdba4569f9226550362c187b46191514789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:14Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.783609 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.783654 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:14 crc 
kubenswrapper[4702]: I1203 11:04:14.783667 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.783684 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.783698 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:14Z","lastTransitionTime":"2025-12-03T11:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.837181 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs\") pod \"network-metrics-daemon-6jzjr\" (UID: \"11bb1bad-4b90-4366-9187-8d27480f670b\") " pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:14 crc kubenswrapper[4702]: E1203 11:04:14.837334 4702 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 11:04:14 crc kubenswrapper[4702]: E1203 11:04:14.837394 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs podName:11bb1bad-4b90-4366-9187-8d27480f670b nodeName:}" failed. No retries permitted until 2025-12-03 11:04:15.837379181 +0000 UTC m=+39.673307645 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs") pod "network-metrics-daemon-6jzjr" (UID: "11bb1bad-4b90-4366-9187-8d27480f670b") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.886570 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.886724 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.886826 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.886947 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.887030 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:14Z","lastTransitionTime":"2025-12-03T11:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.928324 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.928379 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:14 crc kubenswrapper[4702]: E1203 11:04:14.928938 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:14 crc kubenswrapper[4702]: E1203 11:04:14.929033 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.990139 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.990217 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.990228 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.990248 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:14 crc kubenswrapper[4702]: I1203 11:04:14.990259 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:14Z","lastTransitionTime":"2025-12-03T11:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.093334 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.093392 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.093404 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.093422 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.093433 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:15Z","lastTransitionTime":"2025-12-03T11:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.196665 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.196715 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.196727 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.196748 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.196776 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:15Z","lastTransitionTime":"2025-12-03T11:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.299780 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.299828 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.299840 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.299856 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.299868 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:15Z","lastTransitionTime":"2025-12-03T11:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.402738 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.402884 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.402907 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.402934 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.402953 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:15Z","lastTransitionTime":"2025-12-03T11:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.505474 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.505509 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.505540 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.505553 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.505563 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:15Z","lastTransitionTime":"2025-12-03T11:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.523025 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" event={"ID":"84f4e3f0-5001-4730-a1d4-64407794e5a8","Type":"ContainerStarted","Data":"5d66df8c5621d1be46d42066c748a841bc2df96ffb398c79deb2b72dcab42fae"} Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.526387 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mt92m_ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3/ovnkube-controller/1.log" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.544835 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and 
discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.562501 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.581063 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84f4e3f0-5001-4730-a1d4-64407794e5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39497a6176907f58fdff906f1529f469a849df1145dc5bb443d92532ff63fc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d66df8c5621d1be46d42066c748a841bc2df96ffb398c79deb2b72dcab42fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k9mzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z" Dec 03 
11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.605992 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.608217 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.608278 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.608293 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.608311 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.608346 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:15Z","lastTransitionTime":"2025-12-03T11:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.618455 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.629209 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.641376 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.654478 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.676534 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf05cff364db3f92ced8b2af261169a03fe6f757
ba09767eb175ca7a43faebed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fcca5071a7bf8b0725a1624f1754469f242a08e8e915d7dafb403e6b3955e70\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"message\\\":\\\"nfig.go:1031] Cluster endpoints for openshift-apiserver-operator/metrics for network=default are: map[]\\\\nI1203 11:04:11.371597 5954 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1203 11:04:11.371716 5954 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-b7lmv after 0 failed attempt(s)\\\\nI1203 11:04:11.371940 5954 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1203 11:04:11.371524 5954 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1203 11:04:11.372265 5954 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf05cff364db3f92ced8b2af261169a03fe6f757ba09767eb175ca7a43faebed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"eck-source-55646444c4-trplf for pod on switch crc\\\\nI1203 11:04:12.284717 6127 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1203 11:04:12.284744 6127 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1203 11:04:12.284563 6127 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nF1203 11:04:12.284790 6127 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to 
call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.687969 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.700680 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.710884 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.710934 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.710947 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.710964 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.710974 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:15Z","lastTransitionTime":"2025-12-03T11:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.715008 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.726116 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.737891 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.752325 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6f7f89b6a8a311c3d34715c94ecdba4569f9226550362c187b46191514789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.762986 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6jzjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11bb1bad-4b90-4366-9187-8d27480f670b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6jzjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.775889 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.814073 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.814129 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.814142 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.814161 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.814172 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:15Z","lastTransitionTime":"2025-12-03T11:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.848274 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs\") pod \"network-metrics-daemon-6jzjr\" (UID: \"11bb1bad-4b90-4366-9187-8d27480f670b\") " pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:15 crc kubenswrapper[4702]: E1203 11:04:15.848540 4702 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 11:04:15 crc kubenswrapper[4702]: E1203 11:04:15.848683 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs podName:11bb1bad-4b90-4366-9187-8d27480f670b nodeName:}" failed. No retries permitted until 2025-12-03 11:04:17.848644134 +0000 UTC m=+41.684572748 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs") pod "network-metrics-daemon-6jzjr" (UID: "11bb1bad-4b90-4366-9187-8d27480f670b") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.862553 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.882651 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.899384 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.916568 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z"
Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.917451 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.917522 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.917535 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.917558 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.917571 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:15Z","lastTransitionTime":"2025-12-03T11:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.928165 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 11:04:15 crc kubenswrapper[4702]: E1203 11:04:15.928397 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.928996 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr"
Dec 03 11:04:15 crc kubenswrapper[4702]: E1203 11:04:15.929093 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.951103 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.980953 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b
81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf05cff364db3f92ced8b2af261169a03fe6f757ba09767eb175ca7a43faebed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fcca5071a7bf8b0725a1624f1754469f242a08e8e915d7dafb403e6b3955e70\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"message\\\":\\\"nfig.go:1031] Cluster endpoints for openshift-apiserver-operator/metrics for network=default are: map[]\\\\nI1203 11:04:11.371597 5954 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1203 11:04:11.371716 5954 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-b7lmv after 0 failed attempt(s)\\\\nI1203 11:04:11.371940 5954 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1203 11:04:11.371524 5954 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1203 11:04:11.372265 5954 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": 
fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf05cff364db3f92ced8b2af261169a03fe6f757ba09767eb175ca7a43faebed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"eck-source-55646444c4-trplf for pod on switch crc\\\\nI1203 11:04:12.284717 6127 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1203 11:04:12.284744 6127 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1203 11:04:12.284563 6127 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nF1203 11:04:12.284790 6127 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-03T11:04:12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:17
4f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:15 crc kubenswrapper[4702]: I1203 11:04:15.993797 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\
\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:15Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.025591 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce
0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:16Z is after 2025-08-24T17:21:41Z"
Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.026009 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.026080 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.026099 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.026126 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.026144 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:16Z","lastTransitionTime":"2025-12-03T11:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.049366 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:16Z is after 2025-08-24T17:21:41Z"
Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.067736 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:16Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.082127 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6f7f89b6a8a311c3d34715c94ecdba4569f9226550362c187b46191514789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:16Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.099542 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:16Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.114108 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:16Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.125576 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6jzjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11bb1bad-4b90-4366-9187-8d27480f670b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6jzjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:16Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:16 crc 
kubenswrapper[4702]: I1203 11:04:16.132355 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.132400 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.132411 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.132427 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.132437 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:16Z","lastTransitionTime":"2025-12-03T11:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.146106 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:16Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.158610 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84f4e3f0-5001-4730-a1d4-64407794e5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39497a6176907f58fdff906f1529f469a849df1145dc5bb443d92532ff63fc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d66df8c5621d1be46d42066c748a841bc2df96ffb398c79deb2b72dcab42fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k9mzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:16Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.182363 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa
3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731473
1ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:16Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.200311 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:16Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.235260 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.235300 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.235311 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.235329 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.235339 4702 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:16Z","lastTransitionTime":"2025-12-03T11:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.338509 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.338684 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.338703 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.338728 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.338745 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:16Z","lastTransitionTime":"2025-12-03T11:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.442193 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.442234 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.442246 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.442263 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.442274 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:16Z","lastTransitionTime":"2025-12-03T11:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.545220 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.545321 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.545351 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.545393 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.545430 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:16Z","lastTransitionTime":"2025-12-03T11:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.648646 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.648730 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.648751 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.649280 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.649303 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:16Z","lastTransitionTime":"2025-12-03T11:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.751907 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.751965 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.751980 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.751998 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.752011 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:16Z","lastTransitionTime":"2025-12-03T11:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.855383 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.855432 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.855446 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.855465 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.855485 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:16Z","lastTransitionTime":"2025-12-03T11:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.927969 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.928059 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:16 crc kubenswrapper[4702]: E1203 11:04:16.928170 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:16 crc kubenswrapper[4702]: E1203 11:04:16.928233 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.956380 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf05cff364db3f92ced8b2af261169a03fe6f757ba09767eb175ca7a43faebed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fcca5071a7bf8b0725a1624f1754469f242a08e8e915d7dafb403e6b3955e70\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"message\\\":\\\"nfig.go:1031] Cluster endpoints for openshift-apiserver-operator/metrics for network=default are: map[]\\\\nI1203 11:04:11.371597 5954 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1203 11:04:11.371716 5954 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-b7lmv after 0 failed attempt(s)\\\\nI1203 11:04:11.371940 5954 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1203 11:04:11.371524 5954 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1203 11:04:11.372265 5954 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": 
fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf05cff364db3f92ced8b2af261169a03fe6f757ba09767eb175ca7a43faebed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"eck-source-55646444c4-trplf for pod on switch crc\\\\nI1203 11:04:12.284717 6127 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1203 11:04:12.284744 6127 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1203 11:04:12.284563 6127 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nF1203 11:04:12.284790 6127 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-03T11:04:12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:17
4f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:16Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.958706 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.958783 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.958800 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.958827 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.958839 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:16Z","lastTransitionTime":"2025-12-03T11:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.968951 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:16Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:16 crc kubenswrapper[4702]: I1203 11:04:16.987557 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:16Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.001718 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:16Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.013634 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:17Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.025100 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:17Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.035527 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:17Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.046298 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:17Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.057248 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:17Z is after 
2025-08-24T17:21:41Z" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.060627 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.060677 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.060694 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.060718 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.060735 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:17Z","lastTransitionTime":"2025-12-03T11:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.072462 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:17Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.086187 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6f7f89b6a8a311c3d34715c94ecdba4569f9226550362c187b46191514789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5
db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mo
untPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:17Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.100598 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\
\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:17Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.110958 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6jzjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11bb1bad-4b90-4366-9187-8d27480f670b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6jzjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:17Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.136350 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaa
a7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:17Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.157718 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74
c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:17Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.162774 4702 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.162802 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.162810 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.162825 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.162837 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:17Z","lastTransitionTime":"2025-12-03T11:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.173391 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:17Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.186232 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84f4e3f0-5001-4730-a1d4-64407794e5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39497a6176907f58fdff906f1529f469a849df1145dc5bb443d92532ff63fc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d66df8c5621d1be46d42066c748a841bc2df96ffb398c79deb2b72dcab42fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:
12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k9mzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:17Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.265956 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.266021 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.266038 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.266061 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.266076 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:17Z","lastTransitionTime":"2025-12-03T11:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.368939 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.369190 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.369210 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.369239 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.369265 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:17Z","lastTransitionTime":"2025-12-03T11:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.472292 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.472384 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.472405 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.472433 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.472445 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:17Z","lastTransitionTime":"2025-12-03T11:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.575636 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.575697 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.575713 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.575732 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.575747 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:17Z","lastTransitionTime":"2025-12-03T11:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.678347 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.678383 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.678395 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.678409 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.678417 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:17Z","lastTransitionTime":"2025-12-03T11:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.781234 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.781280 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.781290 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.781303 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.781314 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:17Z","lastTransitionTime":"2025-12-03T11:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.871082 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs\") pod \"network-metrics-daemon-6jzjr\" (UID: \"11bb1bad-4b90-4366-9187-8d27480f670b\") " pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:17 crc kubenswrapper[4702]: E1203 11:04:17.871431 4702 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 11:04:17 crc kubenswrapper[4702]: E1203 11:04:17.871522 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs podName:11bb1bad-4b90-4366-9187-8d27480f670b nodeName:}" failed. No retries permitted until 2025-12-03 11:04:21.871497051 +0000 UTC m=+45.707425555 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs") pod "network-metrics-daemon-6jzjr" (UID: "11bb1bad-4b90-4366-9187-8d27480f670b") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.885032 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.885088 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.885102 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.885119 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.885133 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:17Z","lastTransitionTime":"2025-12-03T11:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.927472 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.927472 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:17 crc kubenswrapper[4702]: E1203 11:04:17.927727 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b" Dec 03 11:04:17 crc kubenswrapper[4702]: E1203 11:04:17.927744 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.987797 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.987853 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.987868 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.987889 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:17 crc kubenswrapper[4702]: I1203 11:04:17.987902 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:17Z","lastTransitionTime":"2025-12-03T11:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.091250 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.091307 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.091332 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.091347 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.091356 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:18Z","lastTransitionTime":"2025-12-03T11:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.194459 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.194503 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.194515 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.194531 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.194541 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:18Z","lastTransitionTime":"2025-12-03T11:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.298035 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.298109 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.298133 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.298165 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.298188 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:18Z","lastTransitionTime":"2025-12-03T11:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.401516 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.401614 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.401640 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.401678 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.401702 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:18Z","lastTransitionTime":"2025-12-03T11:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.504910 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.504997 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.505021 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.505053 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.505080 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:18Z","lastTransitionTime":"2025-12-03T11:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.608655 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.608697 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.608709 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.608730 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.608742 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:18Z","lastTransitionTime":"2025-12-03T11:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.711748 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.711853 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.711873 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.711905 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.711927 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:18Z","lastTransitionTime":"2025-12-03T11:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.815452 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.815538 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.815569 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.815598 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.815624 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:18Z","lastTransitionTime":"2025-12-03T11:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.918881 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.918938 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.918946 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.918965 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.918974 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:18Z","lastTransitionTime":"2025-12-03T11:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.927496 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:18 crc kubenswrapper[4702]: E1203 11:04:18.927684 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:18 crc kubenswrapper[4702]: I1203 11:04:18.927972 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:18 crc kubenswrapper[4702]: E1203 11:04:18.928076 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.021845 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.021905 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.021921 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.021944 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.021961 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:19Z","lastTransitionTime":"2025-12-03T11:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.125145 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.125198 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.125211 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.125231 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.125242 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:19Z","lastTransitionTime":"2025-12-03T11:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.227819 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.227895 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.227909 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.227927 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.227963 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:19Z","lastTransitionTime":"2025-12-03T11:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.330062 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.330115 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.330126 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.330143 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.330154 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:19Z","lastTransitionTime":"2025-12-03T11:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.432505 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.432568 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.432585 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.432604 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.432618 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:19Z","lastTransitionTime":"2025-12-03T11:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.535676 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.535751 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.535823 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.535854 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.535880 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:19Z","lastTransitionTime":"2025-12-03T11:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.639542 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.639620 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.639638 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.639665 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.639685 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:19Z","lastTransitionTime":"2025-12-03T11:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.743127 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.743216 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.743229 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.743251 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.743264 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:19Z","lastTransitionTime":"2025-12-03T11:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.927322 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr"
Dec 03 11:04:19 crc kubenswrapper[4702]: I1203 11:04:19.927448 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 11:04:19 crc kubenswrapper[4702]: E1203 11:04:19.927555 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b"
Dec 03 11:04:19 crc kubenswrapper[4702]: E1203 11:04:19.927860 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 11:04:20 crc kubenswrapper[4702]: I1203 11:04:20.928088 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 11:04:20 crc kubenswrapper[4702]: I1203 11:04:20.928154 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 11:04:20 crc kubenswrapper[4702]: E1203 11:04:20.928377 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 11:04:20 crc kubenswrapper[4702]: E1203 11:04:20.928571 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 11:04:21 crc kubenswrapper[4702]: I1203 11:04:21.918184 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs\") pod \"network-metrics-daemon-6jzjr\" (UID: \"11bb1bad-4b90-4366-9187-8d27480f670b\") " pod="openshift-multus/network-metrics-daemon-6jzjr"
Dec 03 11:04:21 crc kubenswrapper[4702]: E1203 11:04:21.918340 4702 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 03 11:04:21 crc kubenswrapper[4702]: E1203 11:04:21.918446 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs podName:11bb1bad-4b90-4366-9187-8d27480f670b nodeName:}" failed. No retries permitted until 2025-12-03 11:04:29.918420858 +0000 UTC m=+53.754349362 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs") pod "network-metrics-daemon-6jzjr" (UID: "11bb1bad-4b90-4366-9187-8d27480f670b") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 03 11:04:21 crc kubenswrapper[4702]: I1203 11:04:21.927253 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr"
Dec 03 11:04:21 crc kubenswrapper[4702]: I1203 11:04:21.927361 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 11:04:21 crc kubenswrapper[4702]: E1203 11:04:21.927452 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b"
Dec 03 11:04:21 crc kubenswrapper[4702]: E1203 11:04:21.927592 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 11:04:22 crc kubenswrapper[4702]: I1203 11:04:22.927983 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 11:04:22 crc kubenswrapper[4702]: I1203 11:04:22.928012 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 11:04:22 crc kubenswrapper[4702]: E1203 11:04:22.928133 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 11:04:22 crc kubenswrapper[4702]: E1203 11:04:22.928302 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 11:04:23 crc kubenswrapper[4702]: E1203 11:04:23.639192 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:23Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.646704 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.646802 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.646820 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.646849 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.646871 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:23Z","lastTransitionTime":"2025-12-03T11:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:23 crc kubenswrapper[4702]: E1203 11:04:23.666927 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:23Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.672614 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.672668 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.672681 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.672699 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.672709 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:23Z","lastTransitionTime":"2025-12-03T11:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:23 crc kubenswrapper[4702]: E1203 11:04:23.685093 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:23Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.689311 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.689341 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.689349 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.689366 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.689376 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:23Z","lastTransitionTime":"2025-12-03T11:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:23 crc kubenswrapper[4702]: E1203 11:04:23.702920 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:23Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.706291 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.706341 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.706351 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.706366 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.706378 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:23Z","lastTransitionTime":"2025-12-03T11:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:23 crc kubenswrapper[4702]: E1203 11:04:23.718991 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:23Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:23 crc kubenswrapper[4702]: E1203 11:04:23.719120 4702 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.720874 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.720912 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.720926 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.720944 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.720959 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:23Z","lastTransitionTime":"2025-12-03T11:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.823970 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.824017 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.824030 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.824047 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.824061 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:23Z","lastTransitionTime":"2025-12-03T11:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.927965 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.928074 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:23 crc kubenswrapper[4702]: E1203 11:04:23.928156 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:23 crc kubenswrapper[4702]: E1203 11:04:23.928276 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.933313 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.933369 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.933383 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.933404 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:23 crc kubenswrapper[4702]: I1203 11:04:23.933417 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:23Z","lastTransitionTime":"2025-12-03T11:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.036078 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.036144 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.036153 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.036169 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.036181 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:24Z","lastTransitionTime":"2025-12-03T11:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.139671 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.139708 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.139717 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.139731 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.139741 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:24Z","lastTransitionTime":"2025-12-03T11:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.241947 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.242031 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.242045 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.242063 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.242087 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:24Z","lastTransitionTime":"2025-12-03T11:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.345117 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.345168 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.345190 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.345225 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.345243 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:24Z","lastTransitionTime":"2025-12-03T11:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.448104 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.448180 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.448192 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.448216 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.448236 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:24Z","lastTransitionTime":"2025-12-03T11:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.556065 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.556120 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.556131 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.556151 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.556163 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:24Z","lastTransitionTime":"2025-12-03T11:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.659683 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.659790 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.659820 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.659851 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.659877 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:24Z","lastTransitionTime":"2025-12-03T11:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.763188 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.763257 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.763280 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.763308 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.763330 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:24Z","lastTransitionTime":"2025-12-03T11:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.799676 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.809724 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.823367 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf05cff364db3f92ced8b2af261169a03fe6f757
ba09767eb175ca7a43faebed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4fcca5071a7bf8b0725a1624f1754469f242a08e8e915d7dafb403e6b3955e70\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"message\\\":\\\"nfig.go:1031] Cluster endpoints for openshift-apiserver-operator/metrics for network=default are: map[]\\\\nI1203 11:04:11.371597 5954 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI1203 11:04:11.371716 5954 obj_retry.go:386] Retry successful for *v1.Pod openshift-dns/node-resolver-b7lmv after 0 failed attempt(s)\\\\nI1203 11:04:11.371940 5954 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1203 11:04:11.371524 5954 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/iptables-alerter-4ln5h\\\\nF1203 11:04:11.372265 5954 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": fail\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf05cff364db3f92ced8b2af261169a03fe6f757ba09767eb175ca7a43faebed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"eck-source-55646444c4-trplf for pod on switch crc\\\\nI1203 11:04:12.284717 6127 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1203 11:04:12.284744 6127 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1203 11:04:12.284563 6127 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nF1203 11:04:12.284790 6127 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to 
call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:24Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.837102 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:24Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.852023 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:24Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.866842 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.866881 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.866892 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.866908 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.866921 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:24Z","lastTransitionTime":"2025-12-03T11:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.868166 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:24Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.881040 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:24Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.896876 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:24Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.909032 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:24Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.922269 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\
\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:24Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.927560 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.927613 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:24 crc kubenswrapper[4702]: E1203 11:04:24.927724 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:24 crc kubenswrapper[4702]: E1203 11:04:24.927820 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.936516 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:24Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.951832 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:24Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.970702 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.970677 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6f7f89b6a8a311c3d34715c94ecdba4569f9226550362c187b46191514789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:24Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.970789 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.970814 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:24 crc kubenswrapper[4702]: 
I1203 11:04:24.970841 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.970862 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:24Z","lastTransitionTime":"2025-12-03T11:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.984236 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daem
on-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:24Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:24 crc kubenswrapper[4702]: I1203 11:04:24.995407 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6jzjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11bb1bad-4b90-4366-9187-8d27480f670b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6jzjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:24Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.013149 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaa
a7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:25Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.025926 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74
c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:25Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.037989 4702 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:25Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.049174 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84f4e3f0-5001-4730-a1d4-64407794e5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39497a6176907f58fdff906f1529f469a849df1145dc5bb443d92532ff63fc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d66df8c5621d1be46d42066c748a841bc2df96ffb398c79deb2b72dcab42fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k9mzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:25Z is after 2025-08-24T17:21:41Z" Dec 03 
11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.073783 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.073846 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.073865 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.073884 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.073898 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:25Z","lastTransitionTime":"2025-12-03T11:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.180749 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.180837 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.180848 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.180864 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.180873 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:25Z","lastTransitionTime":"2025-12-03T11:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.283992 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.284052 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.284070 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.284093 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.284110 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:25Z","lastTransitionTime":"2025-12-03T11:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.387185 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.387224 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.387236 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.387253 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.387266 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:25Z","lastTransitionTime":"2025-12-03T11:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.491704 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.491824 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.491844 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.492323 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.492388 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:25Z","lastTransitionTime":"2025-12-03T11:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.596176 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.596239 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.596256 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.596292 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.596321 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:25Z","lastTransitionTime":"2025-12-03T11:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.699910 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.700031 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.700057 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.700089 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.700108 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:25Z","lastTransitionTime":"2025-12-03T11:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.803160 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.803214 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.803231 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.803254 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.803270 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:25Z","lastTransitionTime":"2025-12-03T11:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.906305 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.906370 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.906397 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.906419 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.906432 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:25Z","lastTransitionTime":"2025-12-03T11:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.927782 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.927782 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 11:04:25 crc kubenswrapper[4702]: E1203 11:04:25.927938 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b"
Dec 03 11:04:25 crc kubenswrapper[4702]: E1203 11:04:25.928566 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.928777 4702 scope.go:117] "RemoveContainer" containerID="bf05cff364db3f92ced8b2af261169a03fe6f757ba09767eb175ca7a43faebed"
Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.946430 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afeac62-9097-4b7a-a5cb-3785275cb50f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269996cd0ce3b156195dd9d171328df7e7c02406316482eb33a4ba3fc9d79c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2491115c5325a34ad633eaad2ed3fb728c89279034c4955853558c6c24
9a6708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1109d8de27f2e277de0c5839d1ae7ff6bac66b7ab32cf2829ec620298d64782a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://814086c27751498e48edfdacb417f2b8364bc6918d6b9f1f408dccb031246fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://814086c27751498e48edfdacb417f2b8364bc6918d6b9f1f408dccb031246fbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:25Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.969457 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:25Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:25 crc kubenswrapper[4702]: I1203 11:04:25.991533 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:25Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.009185 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.009245 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.009180 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.009261 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.009394 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.009410 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:26Z","lastTransitionTime":"2025-12-03T11:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.021868 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.047852 4702 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf05cff364db3f92ced8b2af261169a03fe6f757ba09767eb175ca7a43faebed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf05cff364db3f92ced8b2af261169a03fe6f757ba09767eb175ca7a43faebed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"eck-source-55646444c4-trplf for pod on switch crc\\\\nI1203 11:04:12.284717 6127 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1203 11:04:12.284744 6127 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1203 11:04:12.284563 6127 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nF1203 11:04:12.284790 6127 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mt92m_openshift-ovn-kubernetes(ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.061410 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.077293 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-
manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.091839 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.104610 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.112546 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.112855 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.112998 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.113115 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.113217 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:26Z","lastTransitionTime":"2025-12-03T11:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.117283 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.130456 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6f7f89b6a8a311c3d34715c94ecdba4569f9226550362c187b46191514789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.146882 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6jzjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11bb1bad-4b90-4366-9187-8d27480f670b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6jzjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.160115 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.175453 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.187863 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.199735 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84f4e3f0-5001-4730-a1d4-64407794e5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39497a6176907f58fdff906f1529f469a849df1145dc5bb443d92532ff63fc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d66df8c5621d1be46d42066c748a841bc2df96ffb398c79deb2b72dcab42fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k9mzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 
11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.215739 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.216140 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.216373 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.216546 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.216724 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:26Z","lastTransitionTime":"2025-12-03T11:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.219054 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.319645 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.319679 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.319688 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.319704 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.319714 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:26Z","lastTransitionTime":"2025-12-03T11:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.422163 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.422198 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.422208 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.422222 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.422233 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:26Z","lastTransitionTime":"2025-12-03T11:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.525394 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.525479 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.525497 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.525525 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.525542 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:26Z","lastTransitionTime":"2025-12-03T11:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.570825 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mt92m_ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3/ovnkube-controller/1.log" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.580779 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerStarted","Data":"3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049"} Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.581904 4702 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.598402 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.614442 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.671037 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.671089 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.671216 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.671229 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.671270 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.671280 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:26Z","lastTransitionTime":"2025-12-03T11:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.694618 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf05cff364db3f92ced8b2af261169a03fe6f757ba09767eb175ca7a43faebed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"eck-source-55646444c4-trplf for pod on switch crc\\\\nI1203 11:04:12.284717 6127 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1203 11:04:12.284744 6127 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1203 11:04:12.284563 6127 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nF1203 11:04:12.284790 6127 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-03T11:04:12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatu
ses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.715294 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.735953 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.750819 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afeac62-9097-4b7a-a5cb-3785275cb50f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269996cd0ce3b156195dd9d171328df7e7c02406316482eb33a4ba3fc9d79c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2491115c5325a34ad633eaad2ed3fb728c89279034c4955853558c6c249a6708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1109d8de27f2e277de0c5839d1ae7ff6bac66b7ab32cf2829ec620298d64782a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://814086c27751498e48edfdacb417f2b8364bc6918d6b9f1f408dccb031246fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://814086c27751498e48edfdacb417f2b8364bc6918d6b9f1f408dccb031246fbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.763830 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.774590 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.774664 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.774676 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.774698 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.774714 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:26Z","lastTransitionTime":"2025-12-03T11:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.779539 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.796490 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6f7f89b6a8a311c3d34715c94ecdba4569f9226550362c187b46191514789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.809970 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.821599 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.834919 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.848170 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6jzjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11bb1bad-4b90-4366-9187-8d27480f670b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6jzjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc 
kubenswrapper[4702]: I1203 11:04:26.863403 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84f4e3f0-5001-4730-a1d4-64407794e5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39497a6176907f58fdff906f1529f469a849df1145dc5bb443d92532ff63fc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d66df8c5621d1be46d42066c748a841bc2df96ffb398c79deb2b72dcab42fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k9mzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.877341 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.877378 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.877390 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.877407 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.877417 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:26Z","lastTransitionTime":"2025-12-03T11:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.894361 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272
e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.918139 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.928040 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.928121 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:26 crc kubenswrapper[4702]: E1203 11:04:26.928191 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:26 crc kubenswrapper[4702]: E1203 11:04:26.928276 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.930121 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.944418 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.959548 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6f7f89b6a8a311c3d34715c94ecdba4569f9226550362c187b46191514789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.971870 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.978646 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.978670 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.978681 4702 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.978695 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.978704 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:26Z","lastTransitionTime":"2025-12-03T11:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.985096 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:26 crc kubenswrapper[4702]: I1203 11:04:26.997620 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:26Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.008598 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6jzjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11bb1bad-4b90-4366-9187-8d27480f670b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6jzjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc 
kubenswrapper[4702]: I1203 11:04:27.021615 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84f4e3f0-5001-4730-a1d4-64407794e5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39497a6176907f58fdff906f1529f469a849df1145dc5bb443d92532ff63fc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d66df8c5621d1be46d42066c748a841bc2df96ffb398c79deb2b72dcab42fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k9mzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.039360 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.052046 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.063546 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.075184 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.081156 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.081200 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.081212 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.081236 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.081253 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:27Z","lastTransitionTime":"2025-12-03T11:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.084954 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.097348 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.119077 4702 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\
\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf05cff364db3f92ced8b2af261169a03fe6f757ba09767eb175ca7a43faebed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"eck-source-55646444c4-trplf for pod on switch crc\\\\nI1203 11:04:12.284717 6127 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1203 11:04:12.284744 6127 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1203 11:04:12.284563 6127 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nF1203 11:04:12.284790 6127 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-03T11:04:12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatu
ses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.135940 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: E1203 11:04:27.149312 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffa7620d_1ec2_4a53_ad2e_df64bb9aeac3.slice/crio-3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049.scope\": RecentStats: unable to find data in memory cache]" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.149362 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.160127 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afeac62-9097-4b7a-a5cb-3785275cb50f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269996cd0ce3b156195dd9d171328df7e7c02406316482eb33a4ba3fc9d79c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2491115c5325a34ad633eaad2ed3fb728c89279034c4955853558c6c249a6708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1109d8de27f2e277de0c5839d1ae7ff6bac66b7ab32cf2829ec620298d64782a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://814086c27751498e48edfdacb417f2b8364bc6918d6b9f1f408dccb031246fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://814086c27751498e48edfdacb417f2b8364bc6918d6b9f1f408dccb031246fbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.173188 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.184968 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.185031 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.185048 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.185074 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.185092 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:27Z","lastTransitionTime":"2025-12-03T11:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.288287 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.288332 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.288341 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.288357 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.288367 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:27Z","lastTransitionTime":"2025-12-03T11:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.391914 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.392014 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.392036 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.392061 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.392079 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:27Z","lastTransitionTime":"2025-12-03T11:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.494600 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.495117 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.495129 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.495150 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.495163 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:27Z","lastTransitionTime":"2025-12-03T11:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.586534 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mt92m_ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3/ovnkube-controller/2.log" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.587636 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mt92m_ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3/ovnkube-controller/1.log" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.591259 4702 generic.go:334] "Generic (PLEG): container finished" podID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerID="3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049" exitCode=1 Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.591311 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerDied","Data":"3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049"} Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.591358 4702 scope.go:117] "RemoveContainer" containerID="bf05cff364db3f92ced8b2af261169a03fe6f757ba09767eb175ca7a43faebed" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.592511 4702 scope.go:117] "RemoveContainer" containerID="3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049" Dec 03 11:04:27 crc kubenswrapper[4702]: E1203 11:04:27.592882 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mt92m_openshift-ovn-kubernetes(ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.598539 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.598587 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.598596 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.598612 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.598621 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:27Z","lastTransitionTime":"2025-12-03T11:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.623521 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.638559 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.647656 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.657415 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84f4e3f0-5001-4730-a1d4-64407794e5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39497a6176907f58fdff906f1529f469a849df1145dc5bb443d92532ff63fc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d66df8c5621d1be46d42066c748a841bc2df96ffb398c79deb2b72dcab42fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k9mzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 
11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.675610 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2d
b18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf05cff364db3f92ced8b2af261169a03fe6f757ba09767eb175ca7a43faebed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"eck-source-55646444c4-trplf for pod on switch crc\\\\nI1203 11:04:12.284717 6127 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1203 11:04:12.284744 6127 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nI1203 11:04:12.284563 6127 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nF1203 11:04:12.284790 6127 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-03T11:04:12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:27Z\\\",\\\"message\\\":\\\"s:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 11:04:27.122188 6319 lb_config.go:1031] Cluster endpoints for openshift-route-controller-manager/route-controller-manager for network=default are: map[]\\\\nF1203 11:04:27.122208 6319 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z]\\\\nI1203 11:04:27.122206 6319 services_controller.go:443] Built service openshift-route-controller-manager/route-controller-manager LB cluster-wide configs for network=default: 
[]services.lbConfig{servi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.687573 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.700910 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:
8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.701844 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.701891 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.701901 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.701925 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.701937 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:27Z","lastTransitionTime":"2025-12-03T11:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.712900 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afeac62-9097-4b7a-a5cb-3785275cb50f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269996cd0ce3b156195dd9d171328df7e7c02406316482eb33a4ba3fc9d79c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2491115c5325a34ad633eaad2ed3fb728c89279034c4955853558c6c249a6708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1109d8de27f2e277de0c5839d1ae7ff6bac66b7ab32cf2829ec620298d64782a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://814086c27751498e48edfdacb417f2b8364bc6918d6b9f1f408dccb031246fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://814086c27751498e48edfdacb417f2b8364bc6918d6b9f1f408dccb031246fbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.724118 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.736147 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.748618 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.758713 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.770864 4702 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.784035 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.796086 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.804117 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.804152 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.804162 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.804179 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.804188 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:27Z","lastTransitionTime":"2025-12-03T11:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.810192 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6f7f89b6a8a311c3d34715c94ecdba4569f9226550362c187b46191514789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.822591 4702 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-pqn7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.832454 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6jzjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11bb1bad-4b90-4366-9187-8d27480f670b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6jzjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is 
after 2025-08-24T17:21:41Z" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.907026 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.907073 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.907084 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.907100 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.907110 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:27Z","lastTransitionTime":"2025-12-03T11:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.927614 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:27 crc kubenswrapper[4702]: E1203 11:04:27.928304 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:27 crc kubenswrapper[4702]: I1203 11:04:27.927687 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:27 crc kubenswrapper[4702]: E1203 11:04:27.928598 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b" Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.011737 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.011800 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.011809 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.011824 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.011835 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:28Z","lastTransitionTime":"2025-12-03T11:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.114523 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.114569 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.114579 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.114595 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.114607 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:28Z","lastTransitionTime":"2025-12-03T11:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.217114 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.217165 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.217175 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.217195 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.217207 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:28Z","lastTransitionTime":"2025-12-03T11:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.320233 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.320302 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.320320 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.320347 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.320367 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:28Z","lastTransitionTime":"2025-12-03T11:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.424494 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.424547 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.424559 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.424578 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.424590 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:28Z","lastTransitionTime":"2025-12-03T11:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
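The NodeNotReady records repeat roughly every 100 ms because the container runtime keeps reporting NetworkReady=false until a CNI configuration file appears in /etc/kubernetes/cni/net.d/, and nothing is writing one while ovnkube is down (its controller is in CrashLoopBackOff further below). The readiness check amounts to scanning that directory for a loadable network config; a standalone sketch of that idea, assuming the directory named in the message and the usual .conf/.conflist/.json extensions:

    package main

    import (
        "fmt"
        "log"
        "os"
        "path/filepath"
    )

    func main() {
        // Directory named in the kubelet message; the runtime's CNI layer
        // watches it for network configurations.
        const confDir = "/etc/kubernetes/cni/net.d"

        entries, err := os.ReadDir(confDir)
        if err != nil {
            log.Fatalf("reading %s: %v", confDir, err)
        }
        var found []string
        for _, e := range entries {
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json": // extensions CNI loaders accept
                found = append(found, e.Name())
            }
        }
        if len(found) == 0 {
            // The state this log shows: no plugin has written its config yet.
            fmt.Println("no CNI configuration file found; network plugin not ready")
            return
        }
        fmt.Printf("CNI configs present: %v\n", found)
    }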
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.528249 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.528302 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.528316 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.528336 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.528346 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:28Z","lastTransitionTime":"2025-12-03T11:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.597852 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mt92m_ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3/ovnkube-controller/2.log"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.631674 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.631716 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.631725 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.631744 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.631776 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:28Z","lastTransitionTime":"2025-12-03T11:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.734205 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.734249 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.734258 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.734272 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.734282 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:28Z","lastTransitionTime":"2025-12-03T11:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.800158 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 11:04:28 crc kubenswrapper[4702]: E1203 11:04:28.800436 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:00.800396445 +0000 UTC m=+84.636324929 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
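The "(durationBeforeRetry 32s)" above is the volume manager's exponential backoff in nestedpendingoperations: each consecutive failure of the same volume operation roughly doubles the wait before a retry is permitted, which is why no retry is allowed until 11:05:00 (the unmount keeps failing because the kubevirt.io.hostpath-provisioner CSI driver has not re-registered with the kubelet since the restart). A sketch of that backoff policy, assuming the conventional 500 ms initial delay, a factor of 2, and a cap of 2m2s; the exact constants live in kubelet's exponentialbackoff package and should be treated as assumptions here:

    package main

    import (
        "fmt"
        "time"
    )

    // durationBeforeRetry mirrors the doubling policy the kubelet's volume
    // manager uses to gate retries (initial delay, factor, and cap assumed).
    func durationBeforeRetry(failures int) time.Duration {
        const (
            initial  = 500 * time.Millisecond
            maxDelay = 2*time.Minute + 2*time.Second
        )
        d := initial
        for i := 1; i < failures; i++ {
            d *= 2
            if d > maxDelay {
                return maxDelay
            }
        }
        return d
    }

    func main() {
        for n := 1; n <= 8; n++ {
            fmt.Printf("failure %d -> wait %s\n", n, durationBeforeRetry(n))
        }
        // Under these assumptions, failure 7 -> wait 32s, matching the
        // "(durationBeforeRetry 32s)" in the log.
    }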
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.800525 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.800566 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 11:04:28 crc kubenswrapper[4702]: E1203 11:04:28.800728 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 03 11:04:28 crc kubenswrapper[4702]: E1203 11:04:28.800746 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 03 11:04:28 crc kubenswrapper[4702]: E1203 11:04:28.800801 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 03 11:04:28 crc kubenswrapper[4702]: E1203 11:04:28.800814 4702 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 03 11:04:28 crc kubenswrapper[4702]: E1203 11:04:28.800751 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 03 11:04:28 crc kubenswrapper[4702]: E1203 11:04:28.800954 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 11:05:00.80093478 +0000 UTC m=+84.636863244 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 03 11:04:28 crc kubenswrapper[4702]: E1203 11:04:28.801005 4702 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 03 11:04:28 crc kubenswrapper[4702]: E1203 11:04:28.800886 4702 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.800597 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 11:04:28 crc kubenswrapper[4702]: E1203 11:04:28.801166 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 11:05:00.801120706 +0000 UTC m=+84.637049360 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 03 11:04:28 crc kubenswrapper[4702]: E1203 11:04:28.801229 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 11:05:00.801219378 +0000 UTC m=+84.637147842 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.801250 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 11:04:28 crc kubenswrapper[4702]: E1203 11:04:28.801330 4702 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 03 11:04:28 crc kubenswrapper[4702]: E1203 11:04:28.801370 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 11:05:00.801357332 +0000 UTC m=+84.637285796 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.837899 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.837978 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.838005 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.838040 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.838097 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:28Z","lastTransitionTime":"2025-12-03T11:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.927840 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.927900 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
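The "object ... not registered" failures here generally do not mean the configmaps and secrets are missing from the API server: projected-volume sources are served through the kubelet's internal secret/configmap manager, and after a restart a mount fails until the pod's objects have been re-registered with that cache, at which point the retry permitted at 11:05:00 can succeed. Querying the API server directly shows whether the object itself exists; a minimal client-go sketch (a hypothetical standalone checker, in-cluster config assumed):

    package main

    import (
        "context"
        "fmt"
        "log"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/rest"
    )

    func main() {
        // In-cluster config; use clientcmd with a kubeconfig when running
        // outside the cluster instead.
        cfg, err := rest.InClusterConfig()
        if err != nil {
            log.Fatalf("config: %v", err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            log.Fatalf("client: %v", err)
        }
        // One of the objects the volume manager reported as "not registered".
        cm, err := cs.CoreV1().ConfigMaps("openshift-network-diagnostics").
            Get(context.TODO(), "kube-root-ca.crt", metav1.GetOptions{})
        if err != nil {
            log.Fatalf("get configmap: %v", err)
        }
        fmt.Printf("found %s/%s with %d keys\n", cm.Namespace, cm.Name, len(cm.Data))
    }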
Dec 03 11:04:28 crc kubenswrapper[4702]: E1203 11:04:28.928036 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 11:04:28 crc kubenswrapper[4702]: E1203 11:04:28.928248 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.940125 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.940168 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.940178 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.940195 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:28 crc kubenswrapper[4702]: I1203 11:04:28.940207 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:28Z","lastTransitionTime":"2025-12-03T11:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.043548 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.043978 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.044109 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.044222 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.044307 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:29Z","lastTransitionTime":"2025-12-03T11:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.147896 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.147979 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.147995 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.148026 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.148043 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:29Z","lastTransitionTime":"2025-12-03T11:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.252183 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.252257 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.252269 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.252289 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.252300 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:29Z","lastTransitionTime":"2025-12-03T11:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.355231 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.355267 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.355277 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.355402 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.355418 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:29Z","lastTransitionTime":"2025-12-03T11:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.458959 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.459006 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.459016 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.459036 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.459047 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:29Z","lastTransitionTime":"2025-12-03T11:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.516643 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m"
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.518074 4702 scope.go:117] "RemoveContainer" containerID="3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049"
Dec 03 11:04:29 crc kubenswrapper[4702]: E1203 11:04:29.518342 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mt92m_openshift-ovn-kubernetes(ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3"
Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.534317 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84f4e3f0-5001-4730-a1d4-64407794e5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39497a6176907f58fdff906f1529f469a849df1145dc5bb443d92532ff63fc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d66df8c5621d1be46d42066c748a841bc2df96ffb398c79deb2b72dcab42fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k9mzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:29Z is after 2025-08-24T17:21:41Z" Dec 03 
11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.554248 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:29Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.561049 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.561117 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.561135 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.561155 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.561167 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:29Z","lastTransitionTime":"2025-12-03T11:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.569449 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:29Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.582581 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:29Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.597131 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:29Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.615005 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:29Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.627209 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:29Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.649263 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff7791b972c3d04a7ace965c292a533b98f4780
47e0fc61de8edf0553911049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:27Z\\\",\\\"message\\\":\\\"s:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 11:04:27.122188 6319 lb_config.go:1031] Cluster endpoints for openshift-route-controller-manager/route-controller-manager for network=default are: map[]\\\\nF1203 11:04:27.122208 6319 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z]\\\\nI1203 11:04:27.122206 6319 services_controller.go:443] Built service openshift-route-controller-manager/route-controller-manager LB cluster-wide configs for network=default: []services.lbConfig{servi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mt92m_openshift-ovn-kubernetes(ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:29Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.662342 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:29Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.663767 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.663797 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.663812 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.663831 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.663842 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:29Z","lastTransitionTime":"2025-12-03T11:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.677035 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:29Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.692052 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afeac62-9097-4b7a-a5cb-3785275cb50f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269996cd0ce3b156195dd9d171328df7e7c02406316482eb33a4ba3fc9d79c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2491115c5325a34ad633eaad2ed3fb728c89279034c4955853558c6c249a6708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1109d8de27f2e277de0c5839d1ae7ff6bac66b7ab32cf2829ec620298d64782a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://814086c27751498e48edfdacb417f2b8364bc6918d6b9f1f408dccb031246fbc\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://814086c27751498e48edfdacb417f2b8364bc6918d6b9f1f408dccb031246fbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:29Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.708696 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:29Z is after 
2025-08-24T17:21:41Z" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.725203 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:29Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.744180 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6f7f89b6a8a311c3d34715c94ecdba4569f9226550362c187b46191514789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:29Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.763660 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:29Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.769094 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.769968 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.770050 4702 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.770126 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.770224 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:29Z","lastTransitionTime":"2025-12-03T11:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.788719 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:29Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.804951 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:29Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.820493 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6jzjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11bb1bad-4b90-4366-9187-8d27480f670b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6jzjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:29Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:29 crc 
kubenswrapper[4702]: I1203 11:04:29.873734 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.873843 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.873866 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.873896 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.873916 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:29Z","lastTransitionTime":"2025-12-03T11:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.927699 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:29 crc kubenswrapper[4702]: E1203 11:04:29.927896 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.927699 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:29 crc kubenswrapper[4702]: E1203 11:04:29.928545 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.976632 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.976720 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.976788 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.976825 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:29 crc kubenswrapper[4702]: I1203 11:04:29.976847 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:29Z","lastTransitionTime":"2025-12-03T11:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.015688 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs\") pod \"network-metrics-daemon-6jzjr\" (UID: \"11bb1bad-4b90-4366-9187-8d27480f670b\") " pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:30 crc kubenswrapper[4702]: E1203 11:04:30.015908 4702 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 11:04:30 crc kubenswrapper[4702]: E1203 11:04:30.015986 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs podName:11bb1bad-4b90-4366-9187-8d27480f670b nodeName:}" failed. No retries permitted until 2025-12-03 11:04:46.015965849 +0000 UTC m=+69.851894313 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs") pod "network-metrics-daemon-6jzjr" (UID: "11bb1bad-4b90-4366-9187-8d27480f670b") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.079752 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.079815 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.079827 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.079848 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.079861 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:30Z","lastTransitionTime":"2025-12-03T11:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.182775 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.182842 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.182867 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.182898 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.182938 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:30Z","lastTransitionTime":"2025-12-03T11:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.286066 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.286134 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.286151 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.286173 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.286187 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:30Z","lastTransitionTime":"2025-12-03T11:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.388272 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.388333 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.388345 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.388368 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.388383 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:30Z","lastTransitionTime":"2025-12-03T11:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.491162 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.491205 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.491213 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.491228 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.491241 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:30Z","lastTransitionTime":"2025-12-03T11:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.594089 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.594132 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.594142 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.594158 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.594168 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:30Z","lastTransitionTime":"2025-12-03T11:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.696697 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.696746 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.696776 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.696794 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.696809 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:30Z","lastTransitionTime":"2025-12-03T11:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.799960 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.800022 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.800042 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.800072 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.800095 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:30Z","lastTransitionTime":"2025-12-03T11:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.904319 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.904377 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.904388 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.904406 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.904417 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:30Z","lastTransitionTime":"2025-12-03T11:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.927569 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:30 crc kubenswrapper[4702]: I1203 11:04:30.927662 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:30 crc kubenswrapper[4702]: E1203 11:04:30.927795 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:30 crc kubenswrapper[4702]: E1203 11:04:30.927878 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.006668 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.006718 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.006729 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.006750 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.006789 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:31Z","lastTransitionTime":"2025-12-03T11:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.109681 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.109722 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.109736 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.109778 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.109790 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:31Z","lastTransitionTime":"2025-12-03T11:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.212133 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.212182 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.212201 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.212223 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.212240 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:31Z","lastTransitionTime":"2025-12-03T11:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.314906 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.314950 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.314962 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.314979 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.314990 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:31Z","lastTransitionTime":"2025-12-03T11:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.418216 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.418269 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.418285 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.418301 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.418311 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:31Z","lastTransitionTime":"2025-12-03T11:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.520593 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.520656 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.520673 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.520698 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.520719 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:31Z","lastTransitionTime":"2025-12-03T11:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.623829 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.623886 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.623901 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.623923 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.623941 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:31Z","lastTransitionTime":"2025-12-03T11:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.726678 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.726726 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.726735 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.726768 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.726777 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:31Z","lastTransitionTime":"2025-12-03T11:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.830122 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.830178 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.830201 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.830243 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.830285 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:31Z","lastTransitionTime":"2025-12-03T11:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.927058 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.927059 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:31 crc kubenswrapper[4702]: E1203 11:04:31.927253 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b" Dec 03 11:04:31 crc kubenswrapper[4702]: E1203 11:04:31.927508 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.933257 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.933291 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.933301 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.933312 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:31 crc kubenswrapper[4702]: I1203 11:04:31.933322 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:31Z","lastTransitionTime":"2025-12-03T11:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.036383 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.036424 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.036437 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.036458 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.036470 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:32Z","lastTransitionTime":"2025-12-03T11:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.140234 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.140300 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.140324 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.140355 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.140378 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:32Z","lastTransitionTime":"2025-12-03T11:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.243228 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.243293 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.243309 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.243335 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.243348 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:32Z","lastTransitionTime":"2025-12-03T11:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.346017 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.346077 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.346093 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.346111 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.346126 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:32Z","lastTransitionTime":"2025-12-03T11:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.448519 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.448599 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.448624 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.448657 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.448681 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:32Z","lastTransitionTime":"2025-12-03T11:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.552352 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.552411 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.552424 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.552445 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.552459 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:32Z","lastTransitionTime":"2025-12-03T11:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.656069 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.656128 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.656141 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.656163 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.656179 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:32Z","lastTransitionTime":"2025-12-03T11:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.758841 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.758908 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.758928 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.758993 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.759057 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:32Z","lastTransitionTime":"2025-12-03T11:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.862673 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.862746 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.862802 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.862834 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.862860 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:32Z","lastTransitionTime":"2025-12-03T11:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.927295 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.927300 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:32 crc kubenswrapper[4702]: E1203 11:04:32.927463 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:32 crc kubenswrapper[4702]: E1203 11:04:32.927622 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.965850 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.965887 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.965896 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.965910 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:32 crc kubenswrapper[4702]: I1203 11:04:32.965920 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:32Z","lastTransitionTime":"2025-12-03T11:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.068725 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.068795 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.068810 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.068828 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.068842 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:33Z","lastTransitionTime":"2025-12-03T11:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.171170 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.171235 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.171254 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.171283 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.171300 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:33Z","lastTransitionTime":"2025-12-03T11:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.274269 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.274346 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.274367 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.274393 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.274408 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:33Z","lastTransitionTime":"2025-12-03T11:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.377944 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.378020 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.378046 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.378076 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.378100 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:33Z","lastTransitionTime":"2025-12-03T11:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.481680 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.481734 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.481745 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.481785 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.481799 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:33Z","lastTransitionTime":"2025-12-03T11:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.585934 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.586026 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.586053 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.586082 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.586102 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:33Z","lastTransitionTime":"2025-12-03T11:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.689190 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.689253 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.689273 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.689299 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.689316 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:33Z","lastTransitionTime":"2025-12-03T11:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.791958 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.792035 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.792058 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.792083 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.792101 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:33Z","lastTransitionTime":"2025-12-03T11:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.895387 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.895497 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.895518 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.895549 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.895568 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:33Z","lastTransitionTime":"2025-12-03T11:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.927654 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.927734 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:33 crc kubenswrapper[4702]: E1203 11:04:33.927837 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:33 crc kubenswrapper[4702]: E1203 11:04:33.927956 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.998400 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.998480 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.998503 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.998533 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:33 crc kubenswrapper[4702]: I1203 11:04:33.998558 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:33Z","lastTransitionTime":"2025-12-03T11:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.101064 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.101140 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.101158 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.101183 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.101200 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:34Z","lastTransitionTime":"2025-12-03T11:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.108525 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.108566 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.108582 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.108607 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.108623 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:34Z","lastTransitionTime":"2025-12-03T11:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:34 crc kubenswrapper[4702]: E1203 11:04:34.136070 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:34Z is after 
2025-08-24T17:21:41Z" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.147622 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.147691 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.147716 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.147748 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.147805 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:34Z","lastTransitionTime":"2025-12-03T11:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:34 crc kubenswrapper[4702]: E1203 11:04:34.175605 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:34Z is after 
2025-08-24T17:21:41Z" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.179729 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.179794 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.179808 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.179826 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.179836 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:34Z","lastTransitionTime":"2025-12-03T11:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:34 crc kubenswrapper[4702]: E1203 11:04:34.194064 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:34Z is after 
2025-08-24T17:21:41Z" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.198399 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.198439 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.198447 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.198461 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.198471 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:34Z","lastTransitionTime":"2025-12-03T11:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:34 crc kubenswrapper[4702]: E1203 11:04:34.212081 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:34Z is after 
2025-08-24T17:21:41Z" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.215620 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.215659 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.215672 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.215690 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.215703 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:34Z","lastTransitionTime":"2025-12-03T11:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:34 crc kubenswrapper[4702]: E1203 11:04:34.230577 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:34Z is after 
2025-08-24T17:21:41Z" Dec 03 11:04:34 crc kubenswrapper[4702]: E1203 11:04:34.230741 4702 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.232822 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.232868 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.232884 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.232904 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.232919 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:34Z","lastTransitionTime":"2025-12-03T11:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.336050 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.336120 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.336134 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.336157 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.336172 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:34Z","lastTransitionTime":"2025-12-03T11:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.336050 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.336120 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.336134 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.336157 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.336172 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:34Z","lastTransitionTime":"2025-12-03T11:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.439731 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.439866 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.439885 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.439910 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.439926 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:34Z","lastTransitionTime":"2025-12-03T11:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.543093 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.543152 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.543169 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.543195 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.543211 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:34Z","lastTransitionTime":"2025-12-03T11:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.646228 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.646282 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.646296 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.646315 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.646331 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:34Z","lastTransitionTime":"2025-12-03T11:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.748811 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.748880 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.748899 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.748927 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.748947 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:34Z","lastTransitionTime":"2025-12-03T11:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.851124 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.851169 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.851179 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.851197 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.851210 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:34Z","lastTransitionTime":"2025-12-03T11:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.927941 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:34 crc kubenswrapper[4702]: E1203 11:04:34.928125 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.927966 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:34 crc kubenswrapper[4702]: E1203 11:04:34.928308 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.953346 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.953407 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.953424 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.953444 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:34 crc kubenswrapper[4702]: I1203 11:04:34.953455 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:34Z","lastTransitionTime":"2025-12-03T11:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.056139 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.056186 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.056198 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.056215 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.056227 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:35Z","lastTransitionTime":"2025-12-03T11:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
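[editor's note] The condition={...} payload that setters.go prints on each of these lines is plain JSON. A small sketch for pulling it out of a captured line and reading it structurally; the struct fields mirror the JSON keys visible in the log (the NodeCondition shape), and the sample string is copied verbatim from the entries above:

```go
// condparse.go — unmarshal the condition={...} payload from a log line.
package main

import (
	"encoding/json"
	"fmt"
)

// NodeCondition mirrors the keys seen in the log, not an imported k8s type.
type NodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:34Z","lastTransitionTime":"2025-12-03T11:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`
	var c NodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		panic(err)
	}
	fmt.Printf("%s=%s since %s: %s\n", c.Type, c.Status, c.LastTransitionTime, c.Reason)
}
```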
Has your network provider started?"} Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.159052 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.159102 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.159111 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.159128 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.159138 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:35Z","lastTransitionTime":"2025-12-03T11:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.263066 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.263134 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.263153 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.263177 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.263191 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:35Z","lastTransitionTime":"2025-12-03T11:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.366080 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.366145 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.366164 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.366193 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.366213 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:35Z","lastTransitionTime":"2025-12-03T11:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.468922 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.468961 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.468970 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.468991 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.469000 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:35Z","lastTransitionTime":"2025-12-03T11:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.570998 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.571043 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.571052 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.571071 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.571086 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:35Z","lastTransitionTime":"2025-12-03T11:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.674377 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.674434 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.674446 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.674469 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.674481 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:35Z","lastTransitionTime":"2025-12-03T11:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.777514 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.777588 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.777609 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.777635 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.777652 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:35Z","lastTransitionTime":"2025-12-03T11:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.884736 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.885242 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.885478 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.885659 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.885908 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:35Z","lastTransitionTime":"2025-12-03T11:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.927634 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.928440 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:35 crc kubenswrapper[4702]: E1203 11:04:35.928553 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b" Dec 03 11:04:35 crc kubenswrapper[4702]: E1203 11:04:35.928722 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.988798 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.988861 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.988872 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.988899 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:35 crc kubenswrapper[4702]: I1203 11:04:35.988912 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:35Z","lastTransitionTime":"2025-12-03T11:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.092318 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.092384 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.092398 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.092419 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.092433 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:36Z","lastTransitionTime":"2025-12-03T11:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.195725 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.195828 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.195852 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.195880 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.195900 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:36Z","lastTransitionTime":"2025-12-03T11:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.298690 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.298744 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.298772 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.298796 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.298813 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:36Z","lastTransitionTime":"2025-12-03T11:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.401729 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.401860 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.401875 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.401899 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.401917 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:36Z","lastTransitionTime":"2025-12-03T11:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.505017 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.505052 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.505061 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.505075 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.505084 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:36Z","lastTransitionTime":"2025-12-03T11:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.608360 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.608419 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.608438 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.608472 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.608489 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:36Z","lastTransitionTime":"2025-12-03T11:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.711772 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.711826 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.711842 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.711863 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.711876 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:36Z","lastTransitionTime":"2025-12-03T11:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.814528 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.815006 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.815018 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.815037 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.815048 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:36Z","lastTransitionTime":"2025-12-03T11:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.918547 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.918608 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.918625 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.918651 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.918671 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:36Z","lastTransitionTime":"2025-12-03T11:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.927601 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:36 crc kubenswrapper[4702]: E1203 11:04:36.927739 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.927971 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:36 crc kubenswrapper[4702]: E1203 11:04:36.928147 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.965800 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532
421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:36Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:36 crc kubenswrapper[4702]: I1203 11:04:36.987810 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
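[editor's note] Every one of these "Failed to update status for pod" entries fails for the same reason, stated at the tail of each patch error: the apiserver must call the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743, and that webhook's serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2025-12-03. The sketch below reproduces the exact x509 failure mode with a throwaway self-signed certificate; the NotAfter date and verification time are taken from the log, everything else (key type, subject) is hypothetical:

```go
// certcheck.go — reproduce the expired-certificate error from the log.
package main

import (
	"crypto/ecdsa"
	"crypto/elliptic"
	"crypto/rand"
	"crypto/x509"
	"crypto/x509/pkix"
	"fmt"
	"math/big"
	"time"
)

func main() {
	key, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
	if err != nil {
		panic(err)
	}
	tmpl := &x509.Certificate{
		SerialNumber:          big.NewInt(1),
		Subject:               pkix.Name{CommonName: "hypothetical webhook serving cert"},
		NotBefore:             time.Date(2024, 8, 24, 17, 21, 41, 0, time.UTC), // assumed issue date
		NotAfter:              time.Date(2025, 8, 24, 17, 21, 41, 0, time.UTC), // expiry from the log
		IsCA:                  true,
		BasicConstraintsValid: true,
	}
	der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &key.PublicKey, key)
	if err != nil {
		panic(err)
	}
	cert, err := x509.ParseCertificate(der)
	if err != nil {
		panic(err)
	}
	roots := x509.NewCertPool()
	roots.AddCert(cert)
	// Verify at the node's clock time from the log: 2025-12-03T11:04:36Z.
	_, err = cert.Verify(x509.VerifyOptions{
		Roots:       roots,
		CurrentTime: time.Date(2025, 12, 3, 11, 4, 36, 0, time.UTC),
	})
	fmt.Println(err)
	// Prints the same error text seen in the log:
	// x509: certificate has expired or is not yet valid:
	// current time 2025-12-03T11:04:36Z is after 2025-08-24T17:21:41Z
}
```

Because this gate sits in front of every pod status patch, the failures below for kube-apiserver-crc, node-resolver-b7lmv, ovnkube-control-plane, node-ca-lcdkx, and kube-controller-manager-crc are all the same expiry, not independent problems.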
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:36Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.004726 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:37Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.019988 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84f4e3f0-5001-4730-a1d4-64407794e5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39497a6176907f58fdff906f1529f469a849df1145dc5bb443d92532ff63fc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d66df8c5621d1be46d42066c748a841bc2df96ffb398c79deb2b72dcab42fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k9mzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:37Z is after 2025-08-24T17:21:41Z" Dec 03 
11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.021561 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.021686 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.021783 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.021898 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.021965 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:37Z","lastTransitionTime":"2025-12-03T11:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.038379 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod 
\"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:37Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.053428 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\
\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:37Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.066468 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afeac62-9097-4b7a-a5cb-3785275cb50f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269996cd0ce3b156195dd9d171328df7e7c02406316482eb33a4ba3fc9d79c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2491115c5325a34ad633eaad2ed3fb728c89279034c4955853558c6c249a6708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1109d8de27f2e277de0c5839d1ae7ff6bac66b7ab32cf2829ec620298d64782a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://814086c27751498e48edfdacb417f2b8364bc6918d6b9f1f408dccb031246fbc\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://814086c27751498e48edfdacb417f2b8364bc6918d6b9f1f408dccb031246fbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:37Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.078431 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:37Z is after 
2025-08-24T17:21:41Z" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.090477 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:37Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.102070 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:37Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.115741 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:37Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.124430 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.124505 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.124526 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.124550 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.124563 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:37Z","lastTransitionTime":"2025-12-03T11:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.139297 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:27Z\\\",\\\"message\\\":\\\"s:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 11:04:27.122188 6319 lb_config.go:1031] Cluster endpoints for openshift-route-controller-manager/route-controller-manager for network=default are: map[]\\\\nF1203 11:04:27.122208 6319 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z]\\\\nI1203 11:04:27.122206 6319 services_controller.go:443] Built service openshift-route-controller-manager/route-controller-manager LB cluster-wide configs for network=default: []services.lbConfig{servi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed 
container=ovnkube-controller pod=ovnkube-node-mt92m_openshift-ovn-kubernetes(ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:37Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.153465 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:37Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.166475 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:37Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:37 
crc kubenswrapper[4702]: I1203 11:04:37.181914 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:37Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.196900 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6f7f89b6a8a311c3d34715c94ecdba4569f9226550362c187b46191514789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:37Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.209502 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:37Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.221158 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6jzjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11bb1bad-4b90-4366-9187-8d27480f670b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6jzjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:37Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:37 crc 
kubenswrapper[4702]: I1203 11:04:37.228008 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.228041 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.228050 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.228066 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.228076 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:37Z","lastTransitionTime":"2025-12-03T11:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.331259 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.331310 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.331324 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.331345 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.331359 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:37Z","lastTransitionTime":"2025-12-03T11:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.435085 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.435171 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.435196 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.435225 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.435246 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:37Z","lastTransitionTime":"2025-12-03T11:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.537997 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.538041 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.538053 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.538071 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.538080 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:37Z","lastTransitionTime":"2025-12-03T11:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.640728 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.640820 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.640837 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.640860 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.640877 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:37Z","lastTransitionTime":"2025-12-03T11:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.743595 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.743640 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.743650 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.743664 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.743673 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:37Z","lastTransitionTime":"2025-12-03T11:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.845791 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.845827 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.845836 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.845855 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.845868 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:37Z","lastTransitionTime":"2025-12-03T11:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.985792 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.985943 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:37 crc kubenswrapper[4702]: E1203 11:04:37.986032 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.986039 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.986075 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.986099 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.986230 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:37 crc kubenswrapper[4702]: E1203 11:04:37.986234 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.986275 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:37 crc kubenswrapper[4702]: I1203 11:04:37.986305 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:37Z","lastTransitionTime":"2025-12-03T11:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:37 crc kubenswrapper[4702]: E1203 11:04:37.986360 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.088963 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.089008 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.089020 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.089037 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.089048 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:38Z","lastTransitionTime":"2025-12-03T11:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.191513 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.191550 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.191560 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.191575 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.191586 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:38Z","lastTransitionTime":"2025-12-03T11:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.294077 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.294157 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.294176 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.294197 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.294210 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:38Z","lastTransitionTime":"2025-12-03T11:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.397015 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.397066 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.397077 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.397094 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.397107 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:38Z","lastTransitionTime":"2025-12-03T11:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.499835 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.499890 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.499903 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.499921 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.499937 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:38Z","lastTransitionTime":"2025-12-03T11:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.602664 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.602745 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.602793 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.602823 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.602841 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:38Z","lastTransitionTime":"2025-12-03T11:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.706196 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.706260 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.706278 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.706302 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.706321 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:38Z","lastTransitionTime":"2025-12-03T11:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.808687 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.808729 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.808741 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.808782 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.808799 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:38Z","lastTransitionTime":"2025-12-03T11:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.912062 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.912108 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.912121 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.912140 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.912152 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:38Z","lastTransitionTime":"2025-12-03T11:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:38 crc kubenswrapper[4702]: I1203 11:04:38.930409 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:38 crc kubenswrapper[4702]: E1203 11:04:38.930543 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.015573 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.015606 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.015615 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.015632 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.015641 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:39Z","lastTransitionTime":"2025-12-03T11:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.119496 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.119576 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.119601 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.119633 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.119655 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:39Z","lastTransitionTime":"2025-12-03T11:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.223108 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.223164 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.223175 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.223195 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.223206 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:39Z","lastTransitionTime":"2025-12-03T11:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.325352 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.325389 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.325399 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.325419 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.325429 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:39Z","lastTransitionTime":"2025-12-03T11:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.428337 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.428373 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.428385 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.428401 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.428413 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:39Z","lastTransitionTime":"2025-12-03T11:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.531830 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.531919 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.531936 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.531960 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.531975 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:39Z","lastTransitionTime":"2025-12-03T11:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.634804 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.634863 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.634876 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.634897 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.634914 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:39Z","lastTransitionTime":"2025-12-03T11:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.738238 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.738286 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.738299 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.738317 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.738329 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:39Z","lastTransitionTime":"2025-12-03T11:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.841149 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.841205 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.841217 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.841235 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.841247 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:39Z","lastTransitionTime":"2025-12-03T11:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.927692 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:39 crc kubenswrapper[4702]: E1203 11:04:39.927902 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.927711 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:39 crc kubenswrapper[4702]: E1203 11:04:39.928013 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.927711 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:39 crc kubenswrapper[4702]: E1203 11:04:39.928132 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.944571 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.944658 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.944673 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.944697 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:39 crc kubenswrapper[4702]: I1203 11:04:39.944714 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:39Z","lastTransitionTime":"2025-12-03T11:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.047930 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.047982 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.047993 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.048010 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.048023 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:40Z","lastTransitionTime":"2025-12-03T11:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.151071 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.151136 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.151148 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.151175 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.151187 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:40Z","lastTransitionTime":"2025-12-03T11:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.254211 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.254293 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.254308 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.254325 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.254338 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:40Z","lastTransitionTime":"2025-12-03T11:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.357390 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.357443 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.357457 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.357476 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.357493 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:40Z","lastTransitionTime":"2025-12-03T11:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.460680 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.460749 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.460828 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.460853 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.460866 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:40Z","lastTransitionTime":"2025-12-03T11:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.564189 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.564249 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.564263 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.564292 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.564305 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:40Z","lastTransitionTime":"2025-12-03T11:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.666852 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.666897 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.666908 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.666925 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.666937 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:40Z","lastTransitionTime":"2025-12-03T11:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.770462 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.770532 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.770547 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.770596 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.770611 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:40Z","lastTransitionTime":"2025-12-03T11:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.874454 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.874516 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.874530 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.874554 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.874572 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:40Z","lastTransitionTime":"2025-12-03T11:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.928079 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:40 crc kubenswrapper[4702]: E1203 11:04:40.928268 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.978207 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.978255 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.978267 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.978287 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:40 crc kubenswrapper[4702]: I1203 11:04:40.978301 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:40Z","lastTransitionTime":"2025-12-03T11:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.081268 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.081318 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.081327 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.081344 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.081354 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:41Z","lastTransitionTime":"2025-12-03T11:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.184067 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.184108 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.184118 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.184137 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.184146 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:41Z","lastTransitionTime":"2025-12-03T11:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.286810 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.286856 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.286866 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.286883 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.286896 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:41Z","lastTransitionTime":"2025-12-03T11:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.389196 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.390275 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.390445 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.390622 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.390831 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:41Z","lastTransitionTime":"2025-12-03T11:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.493518 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.493841 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.493913 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.494019 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.494105 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:41Z","lastTransitionTime":"2025-12-03T11:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.597278 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.597329 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.597347 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.597365 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.597379 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:41Z","lastTransitionTime":"2025-12-03T11:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.699853 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.699892 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.699901 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.699916 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.699927 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:41Z","lastTransitionTime":"2025-12-03T11:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.802421 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.802455 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.802466 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.802480 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.802492 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:41Z","lastTransitionTime":"2025-12-03T11:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.905992 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.906050 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.906063 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.906087 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.906100 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:41Z","lastTransitionTime":"2025-12-03T11:04:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.927526 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.927548 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:41 crc kubenswrapper[4702]: E1203 11:04:41.927685 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:41 crc kubenswrapper[4702]: I1203 11:04:41.927549 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:41 crc kubenswrapper[4702]: E1203 11:04:41.927747 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b" Dec 03 11:04:41 crc kubenswrapper[4702]: E1203 11:04:41.927818 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.008333 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.008382 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.008391 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.008405 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.008416 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:42Z","lastTransitionTime":"2025-12-03T11:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.111570 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.111635 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.111653 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.111682 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.111704 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:42Z","lastTransitionTime":"2025-12-03T11:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.214447 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.214526 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.214551 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.214580 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.214601 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:42Z","lastTransitionTime":"2025-12-03T11:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.317491 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.317584 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.317601 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.317662 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.317679 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:42Z","lastTransitionTime":"2025-12-03T11:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.421604 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.421664 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.421682 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.421706 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.421723 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:42Z","lastTransitionTime":"2025-12-03T11:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.524277 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.524333 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.524343 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.524365 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.524378 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:42Z","lastTransitionTime":"2025-12-03T11:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.627085 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.627159 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.627171 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.627189 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.627201 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:42Z","lastTransitionTime":"2025-12-03T11:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.730094 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.730140 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.730153 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.730171 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.730183 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:42Z","lastTransitionTime":"2025-12-03T11:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.832558 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.832633 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.832645 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.832675 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.832686 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:42Z","lastTransitionTime":"2025-12-03T11:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.928222 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:42 crc kubenswrapper[4702]: E1203 11:04:42.928378 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.935457 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.935490 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.935504 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.935524 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:42 crc kubenswrapper[4702]: I1203 11:04:42.935536 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:42Z","lastTransitionTime":"2025-12-03T11:04:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.038301 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.038362 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.038377 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.038396 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.038408 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:43Z","lastTransitionTime":"2025-12-03T11:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.141105 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.141175 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.141193 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.141219 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.141232 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:43Z","lastTransitionTime":"2025-12-03T11:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.244345 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.244407 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.244419 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.244438 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.244453 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:43Z","lastTransitionTime":"2025-12-03T11:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.347472 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.347528 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.347545 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.347568 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.347585 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:43Z","lastTransitionTime":"2025-12-03T11:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.451066 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.451123 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.451138 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.451164 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.451180 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:43Z","lastTransitionTime":"2025-12-03T11:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.554521 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.554589 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.554603 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.554623 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.554634 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:43Z","lastTransitionTime":"2025-12-03T11:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.656647 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.656686 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.656701 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.656721 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.656735 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:43Z","lastTransitionTime":"2025-12-03T11:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.759576 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.759611 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.759621 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.759639 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.759649 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:43Z","lastTransitionTime":"2025-12-03T11:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.862020 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.862365 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.862444 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.862537 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.862619 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:43Z","lastTransitionTime":"2025-12-03T11:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.927655 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.927687 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.927801 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:43 crc kubenswrapper[4702]: E1203 11:04:43.928358 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:43 crc kubenswrapper[4702]: E1203 11:04:43.928270 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b" Dec 03 11:04:43 crc kubenswrapper[4702]: E1203 11:04:43.928455 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.965021 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.965409 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.965502 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.965613 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:43 crc kubenswrapper[4702]: I1203 11:04:43.965695 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:43Z","lastTransitionTime":"2025-12-03T11:04:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.068128 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.068190 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.068206 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.068235 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.068252 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:44Z","lastTransitionTime":"2025-12-03T11:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.171544 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.171596 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.171607 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.171625 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.171639 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:44Z","lastTransitionTime":"2025-12-03T11:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.274443 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.274501 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.274514 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.274536 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.274550 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:44Z","lastTransitionTime":"2025-12-03T11:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.290079 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.290127 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.290140 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.290158 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.290172 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:44Z","lastTransitionTime":"2025-12-03T11:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:44 crc kubenswrapper[4702]: E1203 11:04:44.305580 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:44Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.309410 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.309459 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.309473 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.309492 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.309505 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:44Z","lastTransitionTime":"2025-12-03T11:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:44 crc kubenswrapper[4702]: E1203 11:04:44.324176 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:44Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.328316 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.328458 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.328540 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.328649 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.328798 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:44Z","lastTransitionTime":"2025-12-03T11:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:44 crc kubenswrapper[4702]: E1203 11:04:44.344496 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:44Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.348734 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.348800 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
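The rejected payload is a strategic-merge patch of the node's .status. A minimal sketch of its shape in Python, with the capacity/allocatable figures and the Ready condition copied from the log and the long images list elided; the `$setElementOrder/conditions` directive pins the order of the conditions array while each entry is merged by its `type` key:

```python
# Sketch only: the shape of the "failed to patch status" payload above, with
# values copied from the log. The images list (dozens of names plus sizeBytes)
# is elided; nothing here is generated by kubelet itself.
import json

patch = {
    "status": {
        # Strategic-merge directive: keep conditions in this order, merging
        # each list entry by its "type" key rather than replacing the list.
        "$setElementOrder/conditions": [
            {"type": "MemoryPressure"},
            {"type": "DiskPressure"},
            {"type": "PIDPressure"},
            {"type": "Ready"},
        ],
        "allocatable": {"cpu": "11800m", "ephemeral-storage": "76396645454", "memory": "32404560Ki"},
        "capacity": {"cpu": "12", "ephemeral-storage": "83293888Ki", "memory": "32865360Ki"},
        "conditions": [
            {
                "type": "Ready",
                "status": "False",
                "reason": "KubeletNotReady",
                "lastHeartbeatTime": "2025-12-03T11:04:44Z",
                "lastTransitionTime": "2025-12-03T11:04:44Z",
                "message": "container runtime network not ready: NetworkReady=false "
                           "reason:NetworkPluginNotReady message:Network plugin returns error: "
                           "no CNI configuration file in /etc/kubernetes/cni/net.d/. "
                           "Has your network provider started?",
            },
            # MemoryPressure, DiskPressure and PIDPressure entries elided; see the log above.
        ],
    }
}
print(json.dumps(patch, indent=2))  # roughly what the kubelet sends as the PATCH body
```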
event="NodeHasNoDiskPressure" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.348811 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.348828 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.348840 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:44Z","lastTransitionTime":"2025-12-03T11:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:44 crc kubenswrapper[4702]: E1203 11:04:44.365286 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:44Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.370284 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.370347 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
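Every attempt dies on the same TLS failure: the node-identity webhook at 127.0.0.1:9743 serves a certificate that expired on 2025-08-24T17:21:41Z, months before the node's current clock of 2025-12-03. A minimal sketch of the same validity check, assuming the third-party `cryptography` package and a reachable endpoint (neither is part of this journal); `ssl.get_server_certificate` skips peer verification, so it can still fetch a certificate that a normal handshake, like the webhook POST above, rejects:

```python
# Sketch: fetch the webhook's serving certificate without verifying it and
# compare its validity window to the current time, mirroring the
# "x509: certificate has expired or is not yet valid" error above.
import ssl
from datetime import datetime, timezone

from cryptography import x509  # assumption: pip install cryptography

HOST, PORT = "127.0.0.1", 9743  # webhook endpoint taken from the log

pem = ssl.get_server_certificate((HOST, PORT))  # no verification -> works on expired certs
cert = x509.load_pem_x509_certificate(pem.encode())

now = datetime.now(timezone.utc)
not_after = cert.not_valid_after_utc  # use .not_valid_after on older cryptography releases
if now > not_after:
    print(f"certificate has expired: current time {now:%Y-%m-%dT%H:%M:%SZ} "
          f"is after {not_after:%Y-%m-%dT%H:%M:%SZ}")
else:
    print(f"certificate valid until {not_after:%Y-%m-%dT%H:%M:%SZ}")
```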
event="NodeHasNoDiskPressure" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.370361 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.370388 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.370405 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:44Z","lastTransitionTime":"2025-12-03T11:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:44 crc kubenswrapper[4702]: E1203 11:04:44.384901 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:44Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:44 crc kubenswrapper[4702]: E1203 11:04:44.385085 4702 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.386992 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
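What the journal shows is a bounded retry loop: repeated kubelet_node_status.go:585 "will retry" errors, then kubelet_node_status.go:572 giving up once the budget is spent. A minimal sketch of that pattern, assuming `failing_patch` as a stand-in for the PATCH call the webhook keeps rejecting (the bound of 5 matches kubelet's `nodeStatusUpdateRetry` constant):

```python
# Sketch of the retry behavior behind the kubelet_node_status.go:585/572
# messages above; failing_patch stands in for the rejected status PATCH.
NODE_STATUS_UPDATE_RETRY = 5  # kubelet's nodeStatusUpdateRetry

def update_node_status(patch_node_status) -> None:
    for _ in range(NODE_STATUS_UPDATE_RETRY):
        try:
            patch_node_status()  # raises while the webhook certificate is expired
            return               # success ends the loop early
        except RuntimeError as err:
            print(f'"Error updating node status, will retry" err="{err}"')
    # every attempt failed: the terminal error logged at kubelet_node_status.go:572
    print('"Unable to update node status" err="update node status exceeds retry count"')

def failing_patch() -> None:
    raise RuntimeError('failed calling webhook "node.network-node-identity.openshift.io": '
                       'tls: failed to verify certificate: x509: certificate has expired')

update_node_status(failing_patch)
```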
event="NodeHasSufficientMemory" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.387034 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.387044 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.387070 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.387087 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:44Z","lastTransitionTime":"2025-12-03T11:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.489647 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.489692 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.489702 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.489721 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.489733 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:44Z","lastTransitionTime":"2025-12-03T11:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.592741 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.592836 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.592850 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.592873 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.592887 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:44Z","lastTransitionTime":"2025-12-03T11:04:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
[the same event cycle and Ready=False condition repeat at 11:04:44.696153, 11:04:44.798596 and 11:04:44.901257]
Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.927742 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:44 crc kubenswrapper[4702]: E1203 11:04:44.927914 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:44 crc kubenswrapper[4702]: I1203 11:04:44.928649 4702 scope.go:117] "RemoveContainer" containerID="3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049" Dec 03 11:04:44 crc kubenswrapper[4702]: E1203 11:04:44.928954 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mt92m_openshift-ovn-kubernetes(ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" Dec 03 11:04:45 crc kubenswrapper[4702]: I1203 11:04:45.003987 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:45 crc kubenswrapper[4702]: I1203 11:04:45.004046 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:45 crc kubenswrapper[4702]: I1203 11:04:45.004064 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:45 crc kubenswrapper[4702]: I1203 11:04:45.004091 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:45 crc kubenswrapper[4702]: I1203 11:04:45.004111 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:45Z","lastTransitionTime":"2025-12-03T11:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
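The "back-off 20s" for ovnkube-controller matches kubelet's exponential restart back-off for crash-looping containers, commonly described as a 10s base that doubles per failure up to a 5-minute cap. A sketch of that schedule, with the constants stated as assumptions rather than values read from this log:

from datetime import timedelta

# Assumed kubelet restart back-off parameters: 10s base, doubling, 5m cap.
BASE, FACTOR, CAP = timedelta(seconds=10), 2, timedelta(minutes=5)

def restart_backoff(restart_count: int) -> timedelta:
    """Delay before the next restart attempt after `restart_count` failures."""
    return min(BASE * (FACTOR ** max(restart_count - 1, 0)), CAP)

for n in range(1, 7):
    print(n, restart_backoff(n))   # 10s, 20s, 40s, 80s, 160s, capped at 5m

# restart_count 2 -> 20s, consistent with "back-off 20s" here and with the
# restartCount of 2 reported for ovnkube-controller in the status patch below.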
Dec 03 11:04:45 crc kubenswrapper[4702]: I1203 11:04:45.927824 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:45 crc kubenswrapper[4702]: I1203 11:04:45.927860 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:45 crc kubenswrapper[4702]: I1203 11:04:45.927950 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:45 crc kubenswrapper[4702]: E1203 11:04:45.928008 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:45 crc kubenswrapper[4702]: E1203 11:04:45.928172 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b" Dec 03 11:04:45 crc kubenswrapper[4702]: E1203 11:04:45.928266 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:45 crc kubenswrapper[4702]: I1203 11:04:45.933844 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:45 crc kubenswrapper[4702]: I1203 11:04:45.933884 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:45 crc kubenswrapper[4702]: I1203 11:04:45.933897 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:45 crc kubenswrapper[4702]: I1203 11:04:45.933916 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:45 crc kubenswrapper[4702]: I1203 11:04:45.933929 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:45Z","lastTransitionTime":"2025-12-03T11:04:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:46 crc kubenswrapper[4702]: I1203 11:04:46.036961 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:46 crc kubenswrapper[4702]: I1203 11:04:46.037039 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:46 crc kubenswrapper[4702]: I1203 11:04:46.037056 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:46 crc kubenswrapper[4702]: I1203 11:04:46.037084 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:46 crc kubenswrapper[4702]: I1203 11:04:46.037103 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:46Z","lastTransitionTime":"2025-12-03T11:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:46 crc kubenswrapper[4702]: I1203 11:04:46.093332 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs\") pod \"network-metrics-daemon-6jzjr\" (UID: \"11bb1bad-4b90-4366-9187-8d27480f670b\") " pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:46 crc kubenswrapper[4702]: E1203 11:04:46.093547 4702 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 11:04:46 crc kubenswrapper[4702]: E1203 11:04:46.093631 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs podName:11bb1bad-4b90-4366-9187-8d27480f670b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:18.093597976 +0000 UTC m=+101.929526440 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs") pod "network-metrics-daemon-6jzjr" (UID: "11bb1bad-4b90-4366-9187-8d27480f670b") : object "openshift-multus"/"metrics-daemon-secret" not registered
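The "durationBeforeRetry 32s" in the failed MountVolume operation is consistent with an exponential back-off on repeated volume-mount failures: doubling from a 500 ms base reaches 32s on the seventh consecutive failure. The base, factor, and cap below are assumptions chosen to reproduce that arithmetic, not values read from kubelet configuration:

# Sketch of an exponential mount-retry schedule (assumed constants).
def mount_retry_delays(base: float = 0.5, factor: float = 2.0,
                       cap: float = 128.0, attempts: int = 8):
    delay = base
    for _ in range(attempts):
        yield delay
        delay = min(delay * factor, cap)

print(list(mount_retry_delays()))   # [0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 32.0, 64.0]
# The 7th failure yields 32s, matching "(durationBeforeRetry 32s)" in the record above.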
Dec 03 11:04:46 crc kubenswrapper[4702]: I1203 11:04:46.928210 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:46 crc kubenswrapper[4702]: E1203 11:04:46.928415 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:46 crc kubenswrapper[4702]: I1203 11:04:46.952510 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:27Z\\\",\\\"message\\\":\\\"s:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 11:04:27.122188 6319 lb_config.go:1031] Cluster endpoints for openshift-route-controller-manager/route-controller-manager for network=default are: map[]\\\\nF1203 11:04:27.122208 6319 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z]\\\\nI1203 11:04:27.122206 6319 services_controller.go:443] Built service openshift-route-controller-manager/route-controller-manager LB cluster-wide configs for network=default: 
[]services.lbConfig{servi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mt92m_openshift-ovn-kubernetes(ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:46Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:46 crc kubenswrapper[4702]: I1203 11:04:46.965071 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:46Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:46 crc kubenswrapper[4702]: I1203 11:04:46.966137 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:46 crc kubenswrapper[4702]: I1203 11:04:46.966282 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:46 crc kubenswrapper[4702]: I1203 11:04:46.966636 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:46 crc kubenswrapper[4702]: I1203 11:04:46.966918 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:46 crc kubenswrapper[4702]: I1203 11:04:46.967084 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:46Z","lastTransitionTime":"2025-12-03T11:04:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:46 crc kubenswrapper[4702]: I1203 11:04:46.978329 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:46Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:46 crc kubenswrapper[4702]: I1203 11:04:46.994048 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afeac62-9097-4b7a-a5cb-3785275cb50f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269996cd0ce3b156195dd9d171328df7e7c02406316482eb33a4ba3fc9d79c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2491115c5325a34ad633eaad2ed3fb728c89279034c4955853558c6c249a6708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1109d8de27f2e277de0c5839d1ae7ff6bac66b7ab32cf2829ec620298d64782a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://814086c27751498e48edfdacb417f2b8364bc6918d6b9f1f408dccb031246fbc\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://814086c27751498e48edfdacb417f2b8364bc6918d6b9f1f408dccb031246fbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:46Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.008343 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:47Z is after 
2025-08-24T17:21:41Z" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.023046 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:47Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.037597 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:47Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.050781 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":
\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:47Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.065736 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:47Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.070445 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.070493 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.070507 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.070534 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.070550 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:47Z","lastTransitionTime":"2025-12-03T11:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.079087 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:47Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.094739 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:47Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.109060 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6f7f89b6a8a311c3d34715c94ecdba4569f9226550362c187b46191514789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:47Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.124671 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:47Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.138527 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6jzjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11bb1bad-4b90-4366-9187-8d27480f670b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6jzjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:47Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:47 crc 
kubenswrapper[4702]: I1203 11:04:47.164507 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:47Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.173987 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.174079 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.174108 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.174155 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.174182 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:47Z","lastTransitionTime":"2025-12-03T11:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.186024 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:47Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.199045 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:47Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.213122 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84f4e3f0-5001-4730-a1d4-64407794e5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39497a6176907f58fdff906f1529f469a849df1145dc5bb443d92532ff63fc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d66df8c5621d1be46d42066c748a841bc2df96ffb398c79deb2b72dcab42fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k9mzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:47Z is after 2025-08-24T17:21:41Z" Dec 03 
11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.277105 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.277169 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.277183 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.277206 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.277219 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:47Z","lastTransitionTime":"2025-12-03T11:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.379980 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.380040 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.380054 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.380078 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.380091 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:47Z","lastTransitionTime":"2025-12-03T11:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.482890 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.482936 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.482949 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.482970 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.482984 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:47Z","lastTransitionTime":"2025-12-03T11:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.585882 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.585935 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.585945 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.585962 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.585975 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:47Z","lastTransitionTime":"2025-12-03T11:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.688408 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.688453 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.688467 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.688484 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.688498 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:47Z","lastTransitionTime":"2025-12-03T11:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.791399 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.791472 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.791490 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.791522 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.791544 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:47Z","lastTransitionTime":"2025-12-03T11:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.894163 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.894231 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.894244 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.894274 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.894290 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:47Z","lastTransitionTime":"2025-12-03T11:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.927223 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.927318 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:47 crc kubenswrapper[4702]: E1203 11:04:47.927390 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.927244 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:47 crc kubenswrapper[4702]: E1203 11:04:47.927470 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b" Dec 03 11:04:47 crc kubenswrapper[4702]: E1203 11:04:47.927591 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.996405 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.996461 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.996475 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.996490 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:47 crc kubenswrapper[4702]: I1203 11:04:47.996500 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:47Z","lastTransitionTime":"2025-12-03T11:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.099247 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.099286 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.099300 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.099318 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.099331 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:48Z","lastTransitionTime":"2025-12-03T11:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.205064 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.205147 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.205180 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.205204 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.205219 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:48Z","lastTransitionTime":"2025-12-03T11:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.307740 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.307802 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.307815 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.307830 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.307840 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:48Z","lastTransitionTime":"2025-12-03T11:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.409994 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.410050 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.410064 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.410083 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.410095 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:48Z","lastTransitionTime":"2025-12-03T11:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.513203 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.513242 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.513256 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.513274 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.513286 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:48Z","lastTransitionTime":"2025-12-03T11:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.615995 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.616045 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.616062 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.616086 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.616111 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:48Z","lastTransitionTime":"2025-12-03T11:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.719318 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.719377 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.719395 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.719420 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.719439 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:48Z","lastTransitionTime":"2025-12-03T11:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.822226 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.822297 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.822315 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.822340 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.822357 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:48Z","lastTransitionTime":"2025-12-03T11:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.925286 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.925324 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.925334 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.925352 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.925363 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:48Z","lastTransitionTime":"2025-12-03T11:04:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:48 crc kubenswrapper[4702]: I1203 11:04:48.929034 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:48 crc kubenswrapper[4702]: E1203 11:04:48.929237 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.028630 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.028686 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.028707 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.028729 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.028742 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:49Z","lastTransitionTime":"2025-12-03T11:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.130937 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.130973 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.130983 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.130999 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.131010 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:49Z","lastTransitionTime":"2025-12-03T11:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.233701 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.233740 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.233749 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.233782 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.233795 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:49Z","lastTransitionTime":"2025-12-03T11:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.335681 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.335729 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.335738 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.335775 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.335787 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:49Z","lastTransitionTime":"2025-12-03T11:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.438150 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.438212 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.438226 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.438249 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.438264 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:49Z","lastTransitionTime":"2025-12-03T11:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.540788 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.540828 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.540838 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.540856 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.540867 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:49Z","lastTransitionTime":"2025-12-03T11:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.643774 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.643818 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.643829 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.643844 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.643855 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:49Z","lastTransitionTime":"2025-12-03T11:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.746069 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.746113 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.746126 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.746143 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.746155 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:49Z","lastTransitionTime":"2025-12-03T11:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.848948 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.848996 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.849005 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.849023 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.849033 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:49Z","lastTransitionTime":"2025-12-03T11:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.927622 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.927707 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.927738 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:49 crc kubenswrapper[4702]: E1203 11:04:49.928600 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:49 crc kubenswrapper[4702]: E1203 11:04:49.928171 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b" Dec 03 11:04:49 crc kubenswrapper[4702]: E1203 11:04:49.929046 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.951744 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.951831 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.951841 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.951862 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:49 crc kubenswrapper[4702]: I1203 11:04:49.951874 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:49Z","lastTransitionTime":"2025-12-03T11:04:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.055292 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.055382 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.055401 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.055427 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.055445 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:50Z","lastTransitionTime":"2025-12-03T11:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.157671 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.157719 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.157731 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.157750 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.157790 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:50Z","lastTransitionTime":"2025-12-03T11:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.260526 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.260578 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.260590 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.260611 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.260625 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:50Z","lastTransitionTime":"2025-12-03T11:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.364291 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.364359 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.364370 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.364388 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.364402 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:50Z","lastTransitionTime":"2025-12-03T11:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.466968 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.467019 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.467032 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.467052 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.467065 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:50Z","lastTransitionTime":"2025-12-03T11:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.570229 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.570288 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.570298 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.570313 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.570327 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:50Z","lastTransitionTime":"2025-12-03T11:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.673314 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.673354 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.673366 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.673383 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.673396 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:50Z","lastTransitionTime":"2025-12-03T11:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.677876 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pqn7q_72be0494-b56e-4d46-8300-decd11c66d66/kube-multus/0.log" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.677938 4702 generic.go:334] "Generic (PLEG): container finished" podID="72be0494-b56e-4d46-8300-decd11c66d66" containerID="4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e" exitCode=1 Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.677983 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pqn7q" event={"ID":"72be0494-b56e-4d46-8300-decd11c66d66","Type":"ContainerDied","Data":"4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e"} Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.678449 4702 scope.go:117] "RemoveContainer" containerID="4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.690857 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:50Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.705513 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:50Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.742095 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff7791b972c3d04a7ace965c292a533b98f4780
47e0fc61de8edf0553911049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:27Z\\\",\\\"message\\\":\\\"s:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 11:04:27.122188 6319 lb_config.go:1031] Cluster endpoints for openshift-route-controller-manager/route-controller-manager for network=default are: map[]\\\\nF1203 11:04:27.122208 6319 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z]\\\\nI1203 11:04:27.122206 6319 services_controller.go:443] Built service openshift-route-controller-manager/route-controller-manager LB cluster-wide configs for network=default: []services.lbConfig{servi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mt92m_openshift-ovn-kubernetes(ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:50Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.756223 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:50Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.769006 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-
manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:50Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.778976 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.779060 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.779075 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.779091 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.779101 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:50Z","lastTransitionTime":"2025-12-03T11:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.780375 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afeac62-9097-4b7a-a5cb-3785275cb50f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269996cd0ce3b156195dd9d171328df7e7c02406316482eb33a4ba3fc9d79c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2491115c5325a34ad633eaad2ed3fb728c89279034c4955853558c6c249a6708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1109d8de27f2e277de0c5839d1ae7ff6bac66b7ab32cf2829ec620298d64782a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://814086c27751498e48edfdacb417f2b8364bc6918d6b9f1f408dccb031246fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://814086c27751498e48edfdacb417f2b8364bc6918d6b9f1f408dccb031246fbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:50Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.793260 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:50Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.804942 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:50Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.820581 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6f7f89b6a8a311c3d34715c94ecdba4569f9226550362c187b46191514789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:50Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.833064 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:50Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.845158 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:50Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.858144 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:50Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.870435 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:50Z\\\",\\\"message\\\":\\\"2025-12-03T11:04:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_077ff955-c6e3-43d6-85bc-6a16dfdb7395\\\\n2025-12-03T11:04:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_077ff955-c6e3-43d6-85bc-6a16dfdb7395 to /host/opt/cni/bin/\\\\n2025-12-03T11:04:05Z [verbose] multus-daemon 
started\\\\n2025-12-03T11:04:05Z [verbose] Readiness Indicator file check\\\\n2025-12-03T11:04:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:50Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.881499 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6jzjr" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"11bb1bad-4b90-4366-9187-8d27480f670b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6jzjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:50Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.881974 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.882017 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.882033 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.882054 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.882070 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:50Z","lastTransitionTime":"2025-12-03T11:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.899176 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\
\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":
\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:50Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.913196 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:50Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.924150 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:50Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.927488 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:50 crc kubenswrapper[4702]: E1203 11:04:50.927626 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.933862 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84f4e3f0-5001-4730-a1d4-64407794e5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39497a6176907f58fdff906f1529f469a849df1145dc5bb443d92532ff63fc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d66df8c5621d1be46d42066c748a841bc2df96ffb398c79deb2b72dcab42fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:12Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k9mzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:50Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.985047 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.985091 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.985106 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.985125 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:50 crc kubenswrapper[4702]: I1203 11:04:50.985136 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:50Z","lastTransitionTime":"2025-12-03T11:04:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.088470 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.088552 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.088567 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.088599 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.088616 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:51Z","lastTransitionTime":"2025-12-03T11:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.191461 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.191499 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.191508 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.191522 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.191531 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:51Z","lastTransitionTime":"2025-12-03T11:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.294544 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.294601 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.294614 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.294632 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.294645 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:51Z","lastTransitionTime":"2025-12-03T11:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.397708 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.397774 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.397787 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.397810 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.397822 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:51Z","lastTransitionTime":"2025-12-03T11:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.500791 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.500859 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.500879 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.500906 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.500924 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:51Z","lastTransitionTime":"2025-12-03T11:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.604468 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.604555 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.604587 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.604622 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.604643 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:51Z","lastTransitionTime":"2025-12-03T11:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.685568 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pqn7q_72be0494-b56e-4d46-8300-decd11c66d66/kube-multus/0.log" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.685676 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pqn7q" event={"ID":"72be0494-b56e-4d46-8300-decd11c66d66","Type":"ContainerStarted","Data":"e471d9a027dca8cb83cc35128b6bfbae033ceaedc00e539a0fd45d154da78c4c"} Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.708427 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"
imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:51Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.708699 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.708735 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.708753 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.708832 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.708861 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:51Z","lastTransitionTime":"2025-12-03T11:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.725071 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afeac62-9097-4b7a-a5cb-3785275cb50f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269996cd0ce3b156195dd9d171328df7e7c02406316482eb33a4ba3fc9d79c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2491115c5325a34ad633eaad2ed3fb728c89279034c4955853558c6c249a6708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1109d8de27f2e277de0c5839d1ae7ff6bac66b7ab32cf2829ec620298d64782a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://814086c27751498e48edfdacb417f2b8364bc6918d6b9f1f408dccb031246fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://814086c27751498e48edfdacb417f2b8364bc6918d6b9f1f408dccb031246fbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:51Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.743673 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:51Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.761551 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:51Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.780005 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:51Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.791425 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:51Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.811615 4702 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.811695 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.811714 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.811749 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.811794 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:51Z","lastTransitionTime":"2025-12-03T11:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.812399 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff7791b972c3d04a7ace965c292a533b98f4780
47e0fc61de8edf0553911049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:27Z\\\",\\\"message\\\":\\\"s:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 11:04:27.122188 6319 lb_config.go:1031] Cluster endpoints for openshift-route-controller-manager/route-controller-manager for network=default are: map[]\\\\nF1203 11:04:27.122208 6319 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z]\\\\nI1203 11:04:27.122206 6319 services_controller.go:443] Built service openshift-route-controller-manager/route-controller-manager LB cluster-wide configs for network=default: []services.lbConfig{servi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mt92m_openshift-ovn-kubernetes(ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:51Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.824299 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:51Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.841938 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" 
for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:51Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.854255 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:51Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.867675 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:51Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.882865 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6f7f89b6a8a311c3d34715c94ecdba4569f9226550362c187b46191514789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:51Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.897713 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e471d9a027dca8cb83cc35128b6bfbae033ceaedc00e539a0fd45d154da78c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:50Z\\\",\\\"message\\\":\\\"2025-12-03T11:04:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_077ff955-c6e3-43d6-85bc-6a16dfdb7395\\\\n2025-12-03T11:04:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_077ff955-c6e3-43d6-85bc-6a16dfdb7395 to /host/opt/cni/bin/\\\\n2025-12-03T11:04:05Z [verbose] multus-daemon started\\\\n2025-12-03T11:04:05Z [verbose] Readiness Indicator file check\\\\n2025-12-03T11:04:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:51Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.909726 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6jzjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11bb1bad-4b90-4366-9187-8d27480f670b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6jzjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:51Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.914460 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.914491 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.914501 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.914516 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.914524 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:51Z","lastTransitionTime":"2025-12-03T11:04:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.928135 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.928164 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.928202 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:51 crc kubenswrapper[4702]: E1203 11:04:51.928568 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:51 crc kubenswrapper[4702]: E1203 11:04:51.928686 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:51 crc kubenswrapper[4702]: E1203 11:04:51.928439 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.936613 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-
12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaaa7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd
9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:51Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.952976 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:51Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.966875 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:51Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:51 crc kubenswrapper[4702]: I1203 11:04:51.980229 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84f4e3f0-5001-4730-a1d4-64407794e5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39497a6176907f58fdff906f1529f469a849df1145dc5bb443d92532ff63fc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d66df8c5621d1be46d42066c748a841bc2df96ffb398c79deb2b72dcab42fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k9mzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:51Z is after 2025-08-24T17:21:41Z" Dec 03 
11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.016653 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.016694 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.016703 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.016718 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.016727 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:52Z","lastTransitionTime":"2025-12-03T11:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.119830 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.119896 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.119910 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.119929 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.119942 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:52Z","lastTransitionTime":"2025-12-03T11:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.223280 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.223356 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.223374 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.223400 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.223419 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:52Z","lastTransitionTime":"2025-12-03T11:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.326920 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.326997 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.327020 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.327053 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.327078 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:52Z","lastTransitionTime":"2025-12-03T11:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.430687 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.430800 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.430827 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.430857 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.430879 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:52Z","lastTransitionTime":"2025-12-03T11:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.533465 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.533517 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.533532 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.533553 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.533567 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:52Z","lastTransitionTime":"2025-12-03T11:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.637032 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.637382 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.637493 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.637582 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.637657 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:52Z","lastTransitionTime":"2025-12-03T11:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.740508 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.740569 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.740583 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.740607 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.740622 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:52Z","lastTransitionTime":"2025-12-03T11:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.843214 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.843270 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.843287 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.843309 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.843324 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:52Z","lastTransitionTime":"2025-12-03T11:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.928154 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:52 crc kubenswrapper[4702]: E1203 11:04:52.928359 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.945718 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.945804 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.945821 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.945845 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:52 crc kubenswrapper[4702]: I1203 11:04:52.945859 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:52Z","lastTransitionTime":"2025-12-03T11:04:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.048425 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.048473 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.048485 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.048508 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.048520 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:53Z","lastTransitionTime":"2025-12-03T11:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.151700 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.151788 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.151808 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.151853 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.151881 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:53Z","lastTransitionTime":"2025-12-03T11:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.255403 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.255488 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.255513 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.255543 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.255568 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:53Z","lastTransitionTime":"2025-12-03T11:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.359124 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.359192 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.359218 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.359248 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.359270 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:53Z","lastTransitionTime":"2025-12-03T11:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.462424 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.462493 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.462516 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.462546 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.462572 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:53Z","lastTransitionTime":"2025-12-03T11:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.565655 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.565723 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.565741 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.565807 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.565828 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:53Z","lastTransitionTime":"2025-12-03T11:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.669003 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.669048 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.669060 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.669078 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.669090 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:53Z","lastTransitionTime":"2025-12-03T11:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.772362 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.772419 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.772443 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.772472 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.772495 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:53Z","lastTransitionTime":"2025-12-03T11:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.875501 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.875565 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.875586 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.875611 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.875628 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:53Z","lastTransitionTime":"2025-12-03T11:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.927958 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.927995 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:53 crc kubenswrapper[4702]: E1203 11:04:53.928169 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.928440 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:53 crc kubenswrapper[4702]: E1203 11:04:53.928536 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:53 crc kubenswrapper[4702]: E1203 11:04:53.928816 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.983362 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.983440 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.983481 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.983514 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:53 crc kubenswrapper[4702]: I1203 11:04:53.983537 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:53Z","lastTransitionTime":"2025-12-03T11:04:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.087711 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.088232 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.088462 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.088693 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.088951 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:54Z","lastTransitionTime":"2025-12-03T11:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.192888 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.192972 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.192997 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.193028 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.193050 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:54Z","lastTransitionTime":"2025-12-03T11:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.296807 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.296872 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.296890 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.296917 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.296939 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:54Z","lastTransitionTime":"2025-12-03T11:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.400293 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.400333 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.400343 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.400360 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.400371 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:54Z","lastTransitionTime":"2025-12-03T11:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.501100 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.502099 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.502126 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.502157 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.502178 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:54Z","lastTransitionTime":"2025-12-03T11:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:54 crc kubenswrapper[4702]: E1203 11:04:54.517710 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:54Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.521228 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.521272 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.521282 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.521297 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.521308 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:54Z","lastTransitionTime":"2025-12-03T11:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:54 crc kubenswrapper[4702]: E1203 11:04:54.532902 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:54Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.536656 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.536693 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.536706 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.536727 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.536741 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:54Z","lastTransitionTime":"2025-12-03T11:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:54 crc kubenswrapper[4702]: E1203 11:04:54.548676 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:54Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.551813 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.551844 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.551854 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.551869 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.551878 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:54Z","lastTransitionTime":"2025-12-03T11:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:54 crc kubenswrapper[4702]: E1203 11:04:54.562636 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:54Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.565963 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.565999 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.566014 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.566031 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.566042 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:54Z","lastTransitionTime":"2025-12-03T11:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:54 crc kubenswrapper[4702]: E1203 11:04:54.576141 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:54Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d83e9c9f-fe89-474f-892c-403dd3951eb1\\\",\\\"systemUUID\\\":\\\"6a3f38b6-c08e-4968-a85f-e1166e8e8498\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:54Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:54 crc kubenswrapper[4702]: E1203 11:04:54.576252 4702 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.577560 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.577591 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.577605 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.577617 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.577626 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:54Z","lastTransitionTime":"2025-12-03T11:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.680733 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.680808 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.680827 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.680849 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.680866 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:54Z","lastTransitionTime":"2025-12-03T11:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.784279 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.784350 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.784362 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.784379 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.784389 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:54Z","lastTransitionTime":"2025-12-03T11:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.887584 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.887660 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.887680 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.887705 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.887725 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:54Z","lastTransitionTime":"2025-12-03T11:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.927553 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:54 crc kubenswrapper[4702]: E1203 11:04:54.927719 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.991233 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.991639 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.991735 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.991864 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:54 crc kubenswrapper[4702]: I1203 11:04:54.991938 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:54Z","lastTransitionTime":"2025-12-03T11:04:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.095164 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.095255 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.095276 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.095305 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.095322 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:55Z","lastTransitionTime":"2025-12-03T11:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.198431 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.198503 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.198520 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.198548 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.198566 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:55Z","lastTransitionTime":"2025-12-03T11:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.300900 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.300980 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.300997 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.301022 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.301035 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:55Z","lastTransitionTime":"2025-12-03T11:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.403463 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.403525 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.403542 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.403561 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.403576 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:55Z","lastTransitionTime":"2025-12-03T11:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.506200 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.506245 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.506254 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.506268 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.506279 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:55Z","lastTransitionTime":"2025-12-03T11:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.608942 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.609000 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.609016 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.609037 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.609051 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:55Z","lastTransitionTime":"2025-12-03T11:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.711918 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.711995 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.712016 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.712046 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.712068 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:55Z","lastTransitionTime":"2025-12-03T11:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.815414 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.815472 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.815484 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.815501 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.815513 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:55Z","lastTransitionTime":"2025-12-03T11:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.918771 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.918833 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.918843 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.918857 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.918867 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:55Z","lastTransitionTime":"2025-12-03T11:04:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.927357 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.927378 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:04:55 crc kubenswrapper[4702]: E1203 11:04:55.927596 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b" Dec 03 11:04:55 crc kubenswrapper[4702]: I1203 11:04:55.927406 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:04:55 crc kubenswrapper[4702]: E1203 11:04:55.927754 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:55 crc kubenswrapper[4702]: E1203 11:04:55.927887 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.020988 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.021050 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.021063 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.021077 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.021086 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:56Z","lastTransitionTime":"2025-12-03T11:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.124485 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.124560 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.124576 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.124611 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.124628 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:56Z","lastTransitionTime":"2025-12-03T11:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.227085 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.227497 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.227600 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.227708 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.227821 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:56Z","lastTransitionTime":"2025-12-03T11:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.331780 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.331828 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.331841 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.331861 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.331878 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:56Z","lastTransitionTime":"2025-12-03T11:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.435645 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.435722 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.435746 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.435809 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.435834 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:56Z","lastTransitionTime":"2025-12-03T11:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.538693 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.538787 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.538804 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.538822 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.538843 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:56Z","lastTransitionTime":"2025-12-03T11:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.641812 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.641864 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.641876 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.641894 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.641913 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:56Z","lastTransitionTime":"2025-12-03T11:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.743861 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.743907 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.743919 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.743941 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.743956 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:56Z","lastTransitionTime":"2025-12-03T11:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.847297 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.847388 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.847401 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.847418 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.847449 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:56Z","lastTransitionTime":"2025-12-03T11:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.927253 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:56 crc kubenswrapper[4702]: E1203 11:04:56.927421 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.943641 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aae254b9d9e85dbec4e628c23dde87301b57cae80ee0c460a346ca22ef1f417c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51047cac1180919e48306fb1d844f05b6bf9cafa4861d260f3b104777dde49ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:56Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.949571 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.949620 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.949638 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.949659 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.949674 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:56Z","lastTransitionTime":"2025-12-03T11:04:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.959273 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11a3062eee7f921998879157371abf70717b50d77b3982f44ffe160822201aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:56Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.972840 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:56Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:56 crc kubenswrapper[4702]: I1203 11:04:56.993093 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-z8lld" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bdf4071-59bc-4d40-80ee-20027ce42805\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6e6f7f89b6a8a311c3d34715c94ecdba4569f9226550362c187b46191514789d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b720b54969376c69d31a54f3439ce8adf2b9935bfd0d74464666f71471420039\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6b682b89ba9ff105376ecb84bb50627ba1fb86da5e7fab441a00e3753fbeb912\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6f28e9c3b21d8c950adbd9fabe6e27f9ee595316afed77d6819dbc4929ecd437\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f30b58bf28093ad9fbec54c91cc871f415a83179ef6558a1712d228290b6f838\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84cc8a2653c06334aa142c688cc905315e4422c9e8f09bd6670b975b0ad39d91\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c0292668eafa18c82603903719cfd1aac4335fa9b5ee2a07e760a2cccfcb6201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cf6j8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-z8lld\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:56Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.005898 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pqn7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"72be0494-b56e-4d46-8300-decd11c66d66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e471d9a027dca8cb83cc35128b6bfbae033ceaedc00e539a0fd45d154da78c4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:50Z\\\",\\\"message\\\":\\\"2025-12-03T11:04:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_077ff955-c6e3-43d6-85bc-6a16dfdb7395\\\\n2025-12-03T11:04:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_077ff955-c6e3-43d6-85bc-6a16dfdb7395 to /host/opt/cni/bin/\\\\n2025-12-03T11:04:05Z [verbose] multus-daemon started\\\\n2025-12-03T11:04:05Z [verbose] Readiness Indicator file check\\\\n2025-12-03T11:04:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:01Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6vq2s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pqn7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:57Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.015746 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-6jzjr" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"11bb1bad-4b90-4366-9187-8d27480f670b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:14Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qrfsd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:14Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-6jzjr\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:57Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.041805 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"568c1dba-d524-4be3-b3e0-f594070999cb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819a48d04516e215a4e6917f404cf97a47f8c49eed888d3d789d069ed25f6978\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f9679a6adb82d736c62f01f7c40ae634ceb0d0e8adb03418c12a3e75698788b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://65664b2c2d95ecf90d4ad1214824a2ddeff5dca6ab7532421591eaddd7b5bac9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://031fa1d137175dc00c267f4a13314c9f68cdcaa
a7c326a6028b201560c4e78b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa6c07b8bd838ebee0fb5809374a98112b8ee6450513aea6b0ec6f52a20c6c25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c228d528a4ea24c4972054750125093b823fd2effffc8197119bfdc4446bc68b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d26c0b4a3f76188d26a8a89c9c7f08aadb67acd9c54f84da55bd51b7ab72c004\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d2af6b6b2c8c884f8563974b8878ed077e6f1e81a5ce96b646b8995dd55a7e18\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:57Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.052731 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.052853 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.052896 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.052912 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.052922 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:57Z","lastTransitionTime":"2025-12-03T11:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.056636 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea51e23b-8c79-4010-a539-e0f35cceefde\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 11:03:50.486629 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 11:03:50.489330 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2005926602/tls.crt::/tmp/serving-cert-2005926602/tls.key\\\\\\\"\\\\nI1203 11:03:56.372671 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 11:03:56.375557 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 11:03:56.375586 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 11:03:56.375619 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 11:03:56.375671 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 11:03:56.381482 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1203 11:03:56.381509 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381515 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 11:03:56.381520 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 11:03:56.381524 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 11:03:56.381528 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 11:03:56.381532 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1203 11:03:56.381572 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1203 11:03:56.384648 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:39Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:57Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.066795 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-b7lmv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"418b87ff-9c02-4b43-9bc3-3ce38c1df3a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebde427e348b8fb1f0a6acff717a9397f367e3ecf9cc3b2a1ed98302784dc65e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9sbgn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-b7lmv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:57Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.076525 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84f4e3f0-5001-4730-a1d4-64407794e5a8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39497a6176907f58fdff906f1529f469a849df1145dc5bb443d92532ff63fc4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5d66df8c5621d1be46d42066c748a841bc2df96ffb398c79deb2b72dcab42fae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qchb9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:12Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-k9mzz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:57Z is after 2025-08-24T17:21:41Z" Dec 03 
11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.090581 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d2e03cb6-21dc-460c-a68e-17aafd79e258\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79df66dc63763499420c4ba6323f8680b2485f5124b90fcaf08062f18c95e9b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jp2zm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:59Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qf5sd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-03T11:04:57Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.112990 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994
82919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T11:04:27Z\\\",\\\"message\\\":\\\"s:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {ba175bbe-5cc4-47e6-a32d-57693e1320bd}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1203 11:04:27.122188 6319 lb_config.go:1031] Cluster endpoints for openshift-route-controller-manager/route-controller-manager for network=default are: map[]\\\\nF1203 11:04:27.122208 6319 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:27Z is after 2025-08-24T17:21:41Z]\\\\nI1203 11:04:27.122206 6319 services_controller.go:443] Built service openshift-route-controller-manager/route-controller-manager LB cluster-wide configs for network=default: 
[]services.lbConfig{servi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:26Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mt92m_openshift-ovn-kubernetes(ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:04:00Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:04:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lx4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mt92m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:57Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.124115 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-lcdkx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14795f18-bfe4-4ea9-b2a7-329e83234c68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38be109a19c4a94b91a2f1426a5249a37910a5e4aa0dedf0adc228081d7cec73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:04:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k4mwm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:04:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-lcdkx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:57Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.137210 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"564273ce-9ce2-489b-ab97-d39f7c580f32\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65c725e724a45091af5d015e8fad7496bdeedd1f0ede1d69661ac47afd7e7985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b7a7fb2bc03579ea224462d115c521c411ac741ea73f916ab665e910e21f6b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44adc2481cddca40f9b98aa25b6b8a381d6053a93c2289709dc1aefbb51fd05f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:57Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.148062 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3afeac62-9097-4b7a-a5cb-3785275cb50f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:04:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://269996cd0ce3b156195dd9d171328df7e7c02406316482eb33a4ba3fc9d79c15\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2491115c5325a34ad633eaad2ed3fb728c89279034c4955853558c6c249a6708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1109d8de27f2e277de0c5839d1ae7ff6bac66b7ab32cf2829ec620298d64782a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://814086c27751498e48edfdacb417f2b8364bc6918d6b9f1f408dccb031246fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://814086c27751498e48edfdacb417f2b8364bc6918d6b9f1f408dccb031246fbc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T11:03:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T11:03:37Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T11:03:37Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:57Z is after 2025-08-24T17:21:41Z" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.155698 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.155733 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.155744 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.155773 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 
11:04:57.155782 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:57Z","lastTransitionTime":"2025-12-03T11:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.162885 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:58Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bdb2ca6aba88bf910306e9f3f58732873bae986f20bb8d304efb05935bc5075\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T11:03:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:57Z is after 2025-08-24T17:21:41Z"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.174120 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:57Z is after 2025-08-24T17:21:41Z"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.187679 4702 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T11:03:56Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T11:04:57Z is after 2025-08-24T17:21:41Z"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.260620 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.260657 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.260667 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.260681 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.260690 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:57Z","lastTransitionTime":"2025-12-03T11:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.364394 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.364456 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.364473 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.364494 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.364507 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:57Z","lastTransitionTime":"2025-12-03T11:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
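The three status_manager.go failures above share a single root cause: the pod.network-node-identity.openshift.io webhook at 127.0.0.1:9743 serves a certificate whose validity window ended 2025-08-24T17:21:41Z, months before the node's clock time, so every status patch dies in the TLS handshake. Below is a minimal sketch of the same validity-window check Go's TLS stack performs; the PEM path is a hypothetical placeholder, not the webhook's actual certificate location.

```go
// certcheck.go - a minimal sketch: report whether a PEM certificate's
// validity window contains the current time, mirroring the
// "certificate has expired or is not yet valid" failure in the log.
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	// Hypothetical path for illustration only.
	data, err := os.ReadFile("/tmp/webhook-serving-cert.pem")
	if err != nil {
		log.Fatal(err)
	}
	block, _ := pem.Decode(data)
	if block == nil || block.Type != "CERTIFICATE" {
		log.Fatal("no CERTIFICATE block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		log.Fatal(err)
	}
	now := time.Now().UTC()
	switch {
	case now.After(cert.NotAfter):
		fmt.Printf("expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	case now.Before(cert.NotBefore):
		fmt.Printf("not yet valid: current time %s is before %s\n",
			now.Format(time.RFC3339), cert.NotBefore.Format(time.RFC3339))
	default:
		fmt.Printf("valid until %s\n", cert.NotAfter.Format(time.RFC3339))
	}
}
```

On a CRC cluster that has been powered off past its certificate lifetimes, this window routinely lapses between runs; the patch failures would be expected to stop once the cluster rotates the webhook's serving certificate after startup.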
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.467108 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.467154 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.467166 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.467186 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.467200 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:57Z","lastTransitionTime":"2025-12-03T11:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.569602 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.569644 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.569654 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.569672 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.569685 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:57Z","lastTransitionTime":"2025-12-03T11:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.673260 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.673681 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.673874 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.673964 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.674048 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:57Z","lastTransitionTime":"2025-12-03T11:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.777168 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.777218 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.777227 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.777242 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.777252 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:57Z","lastTransitionTime":"2025-12-03T11:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.880922 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.880995 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.881007 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.881028 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.881040 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:57Z","lastTransitionTime":"2025-12-03T11:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.927622 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.927723 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.927742 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr"
Dec 03 11:04:57 crc kubenswrapper[4702]: E1203 11:04:57.927859 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
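Interleaved with the heartbeats, the util.go/pod_workers.go pairs above show why the node stays NotReady: sandbox creation is refused until a CNI network config exists under /etc/kubernetes/cni/net.d/. A rough sketch of the check implied by the error text follows, assuming the conventional .conf/.conflist/.json extensions that CNI config loaders accept (the real loader is libcni; this is an illustration, not its implementation).

```go
// cnicheck.go - a rough sketch of the readiness probe implied by the log:
// does the CNI config directory contain any network configuration file?
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d"
	entries, err := os.ReadDir(dir)
	if err != nil && !os.IsNotExist(err) {
		fmt.Println("read error:", err)
		return
	}
	var configs []string
	for _, e := range entries {
		// Extension list follows CNI convention (assumption, see lead-in).
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			configs = append(configs, e.Name())
		}
	}
	if len(configs) == 0 {
		fmt.Printf("no CNI configuration file in %s/. Has your network provider started?\n", dir)
		return
	}
	fmt.Println("found:", configs)
}
```

In this log the file appears only once ovn-kubernetes finishes coming up, which is why the Ready condition flips later rather than on any kubelet-side change.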
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:04:57 crc kubenswrapper[4702]: E1203 11:04:57.928038 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b" Dec 03 11:04:57 crc kubenswrapper[4702]: E1203 11:04:57.928103 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.984190 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.984247 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.984265 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.984283 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:57 crc kubenswrapper[4702]: I1203 11:04:57.984295 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:57Z","lastTransitionTime":"2025-12-03T11:04:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.087023 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.087095 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.087113 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.087137 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.087150 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:58Z","lastTransitionTime":"2025-12-03T11:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.190275 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.190321 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.190335 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.190355 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.190368 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:58Z","lastTransitionTime":"2025-12-03T11:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.293886 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.293957 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.293980 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.294013 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.294039 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:58Z","lastTransitionTime":"2025-12-03T11:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.397067 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.397138 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.397158 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.397187 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.397212 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:58Z","lastTransitionTime":"2025-12-03T11:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.499959 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.500006 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.500024 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.500051 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.500064 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:58Z","lastTransitionTime":"2025-12-03T11:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.603211 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.603275 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.603289 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.603311 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.603325 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:58Z","lastTransitionTime":"2025-12-03T11:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.707033 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.707165 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.707230 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.707261 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.707326 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:58Z","lastTransitionTime":"2025-12-03T11:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.810674 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.810736 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.810749 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.810794 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.810806 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:58Z","lastTransitionTime":"2025-12-03T11:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.913549 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.913627 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.913653 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.913684 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.913709 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:58Z","lastTransitionTime":"2025-12-03T11:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.928098 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:04:58 crc kubenswrapper[4702]: E1203 11:04:58.928772 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
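The condition={...} payload repeated by setters.go on every heartbeat above is a plain JSON object. A small sketch that decodes one such payload into a local stand-in struct follows; the field names are copied from the log, and the struct is a local illustration, not the kubelet's own k8s.io/api/core/v1 type.

```go
// condition.go - decode the Ready condition payload logged by setters.go.
package main

import (
	"encoding/json"
	"fmt"
	"log"
)

// NodeCondition mirrors the JSON fields in the log (local stand-in type).
type NodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Payload copied verbatim from one of the heartbeat records above.
	payload := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:58Z","lastTransitionTime":"2025-12-03T11:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`
	var c NodeCondition
	if err := json.Unmarshal([]byte(payload), &c); err != nil {
		log.Fatal(err)
	}
	fmt.Printf("%s=%s (%s): %s\n", c.Type, c.Status, c.Reason, c.Message)
}
```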
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:04:58 crc kubenswrapper[4702]: I1203 11:04:58.929131 4702 scope.go:117] "RemoveContainer" containerID="3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.016593 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.016957 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.016970 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.016987 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.016998 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:59Z","lastTransitionTime":"2025-12-03T11:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.119699 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.119745 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.119776 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.119795 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.119807 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:59Z","lastTransitionTime":"2025-12-03T11:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.223345 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.223385 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.223404 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.223419 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.223429 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:59Z","lastTransitionTime":"2025-12-03T11:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.326323 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.326364 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.326377 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.326392 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.326403 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:59Z","lastTransitionTime":"2025-12-03T11:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.430008 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.430053 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.430066 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.430091 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.430100 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:59Z","lastTransitionTime":"2025-12-03T11:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.532807 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.532858 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.532871 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.532890 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.532901 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:59Z","lastTransitionTime":"2025-12-03T11:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.636621 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.636672 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.636683 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.636701 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.636717 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:59Z","lastTransitionTime":"2025-12-03T11:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.715399 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mt92m_ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3/ovnkube-controller/2.log"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.718284 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerStarted","Data":"34373fab2cbb28447d05dc7a38a6700cb01fa6cd07c07f1640dcff6d6e2e97f8"}
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.718989 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.740300 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.740418 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.740432 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.740453 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.740478 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:59Z","lastTransitionTime":"2025-12-03T11:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.775946 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=62.775893494 podStartE2EDuration="1m2.775893494s" podCreationTimestamp="2025-12-03 11:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:04:59.75126448 +0000 UTC m=+83.587192944" watchObservedRunningTime="2025-12-03 11:04:59.775893494 +0000 UTC m=+83.611821998"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.776478 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=62.776470995 podStartE2EDuration="1m2.776470995s" podCreationTimestamp="2025-12-03 11:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:04:59.774936129 +0000 UTC m=+83.610864643" watchObservedRunningTime="2025-12-03 11:04:59.776470995 +0000 UTC m=+83.612399459"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.783914 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6jzjr"]
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.784101 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr"
Dec 03 11:04:59 crc kubenswrapper[4702]: E1203 11:04:59.784238 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.792841 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-b7lmv" podStartSLOduration=61.792811065 podStartE2EDuration="1m1.792811065s" podCreationTimestamp="2025-12-03 11:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:04:59.792739792 +0000 UTC m=+83.628668276" watchObservedRunningTime="2025-12-03 11:04:59.792811065 +0000 UTC m=+83.628739529"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.813541 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-k9mzz" podStartSLOduration=60.813520355 podStartE2EDuration="1m0.813520355s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:04:59.812958894 +0000 UTC m=+83.648887358" watchObservedRunningTime="2025-12-03 11:04:59.813520355 +0000 UTC m=+83.649448819"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.832029 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podStartSLOduration=61.832006633 podStartE2EDuration="1m1.832006633s" podCreationTimestamp="2025-12-03 11:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:04:59.831892179 +0000 UTC m=+83.667820653" watchObservedRunningTime="2025-12-03 11:04:59.832006633 +0000 UTC m=+83.667935097"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.843874 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.843947 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.843962 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.843981 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.843996 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:59Z","lastTransitionTime":"2025-12-03T11:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.863162 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" podStartSLOduration=60.863124975 podStartE2EDuration="1m0.863124975s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:04:59.862681299 +0000 UTC m=+83.698609753" watchObservedRunningTime="2025-12-03 11:04:59.863124975 +0000 UTC m=+83.699053439"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.876079 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-lcdkx" podStartSLOduration=61.876051479 podStartE2EDuration="1m1.876051479s" podCreationTimestamp="2025-12-03 11:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:04:59.87524263 +0000 UTC m=+83.711171094" watchObservedRunningTime="2025-12-03 11:04:59.876051479 +0000 UTC m=+83.711979943"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.912778 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=35.912727625 podStartE2EDuration="35.912727625s" podCreationTimestamp="2025-12-03 11:04:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:04:59.909503537 +0000 UTC m=+83.745432011" watchObservedRunningTime="2025-12-03 11:04:59.912727625 +0000 UTC m=+83.748656089"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.913606 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=62.913593227 podStartE2EDuration="1m2.913593227s" podCreationTimestamp="2025-12-03 11:03:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:04:59.894239697 +0000 UTC m=+83.730168161" watchObservedRunningTime="2025-12-03 11:04:59.913593227 +0000 UTC m=+83.749521701"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.928887 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 11:04:59 crc kubenswrapper[4702]: E1203 11:04:59.929015 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.929168 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 11:04:59 crc kubenswrapper[4702]: E1203 11:04:59.929224 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.948035 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.948067 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.948076 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.948088 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 11:04:59 crc kubenswrapper[4702]: I1203 11:04:59.948114 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:04:59Z","lastTransitionTime":"2025-12-03T11:04:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.031679 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-z8lld" podStartSLOduration=62.031659859 podStartE2EDuration="1m2.031659859s" podCreationTimestamp="2025-12-03 11:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:00.031071318 +0000 UTC m=+83.866999782" watchObservedRunningTime="2025-12-03 11:05:00.031659859 +0000 UTC m=+83.867588323"
Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.050120 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pqn7q" podStartSLOduration=62.050100276 podStartE2EDuration="1m2.050100276s" podCreationTimestamp="2025-12-03 11:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:00.049478363 +0000 UTC m=+83.885406827" watchObservedRunningTime="2025-12-03 11:05:00.050100276 +0000 UTC m=+83.886028750"
Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.050656 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.050693 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.050705 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.050721 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
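The pod_startup_latency_tracker.go records above carry their own arithmetic: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and the m=+83.6... suffixes are monotonic-clock offsets from kubelet start. A sketch that reproduces the etcd-crc figure from the timestamps in the log follows; the parse layout matches Go's default time.Time string format used in these records.

```go
// startup.go - reproduce podStartE2EDuration for the etcd-crc record:
// watchObservedRunningTime minus podCreationTimestamp.
package main

import (
	"fmt"
	"log"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	// Timestamps copied verbatim from the etcd-crc record above.
	created, err := time.Parse(layout, "2025-12-03 11:03:57 +0000 UTC")
	if err != nil {
		log.Fatal(err)
	}
	observed, err := time.Parse(layout, "2025-12-03 11:04:59.775893494 +0000 UTC")
	if err != nil {
		log.Fatal(err)
	}
	// Prints 1m2.775893494s, i.e. the logged podStartE2EDuration
	// (podStartSLOduration=62.775893494 is the same value in seconds).
	fmt.Println(observed.Sub(created))
}
```

The ~62s figures line up with the node having just restarted: these pods were created around 11:03:57-11:03:59 and only observed running once the kubelet finished its first sync passes.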
event="NodeNotReady" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.050731 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:05:00Z","lastTransitionTime":"2025-12-03T11:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.152786 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.152830 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.152842 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.152860 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.152871 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:05:00Z","lastTransitionTime":"2025-12-03T11:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.255419 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.255502 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.255523 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.255551 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.255574 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:05:00Z","lastTransitionTime":"2025-12-03T11:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.358585 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.358646 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.358659 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.358677 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.358687 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:05:00Z","lastTransitionTime":"2025-12-03T11:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.461671 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.461733 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.461745 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.461780 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.461795 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:05:00Z","lastTransitionTime":"2025-12-03T11:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.566596 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.566650 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.566668 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.566693 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.566711 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:05:00Z","lastTransitionTime":"2025-12-03T11:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.668682 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.668780 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.668801 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.668828 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.668845 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:05:00Z","lastTransitionTime":"2025-12-03T11:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.772437 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.772511 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.772530 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.772559 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.772578 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:05:00Z","lastTransitionTime":"2025-12-03T11:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.866665 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:00 crc kubenswrapper[4702]: E1203 11:05:00.866939 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:06:04.866905699 +0000 UTC m=+148.702834163 (durationBeforeRetry 1m4s). 
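The UnmountVolume failure above is parked until 11:06:04, 64 seconds out, because the kubevirt.io.hostpath-provisioner CSI driver has not re-registered with the kubelet yet. A durationBeforeRetry of 1m4s is consistent with a per-operation doubling backoff; the 500ms initial delay and roughly two-minute cap in the sketch below are assumptions based on kubelet's conventional exponential-backoff defaults, not values read from this log.

```go
// backoff.go - a sketch of the doubling retry schedule implied by
// "durationBeforeRetry 1m4s". Initial delay and cap are assumptions
// (see lead-in), not constants taken from this log.
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond
	maxDelay := 2*time.Minute + 2*time.Second // assumed cap
	for attempt := 1; attempt <= 10; attempt++ {
		// attempt 8 prints 1m4s, matching durationBeforeRetry in the log.
		fmt.Printf("attempt %2d: wait %v\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```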
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.867064 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.867096 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.867147 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.867204 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:05:00 crc kubenswrapper[4702]: E1203 11:05:00.867208 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 11:05:00 crc kubenswrapper[4702]: E1203 11:05:00.867236 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 11:05:00 crc kubenswrapper[4702]: E1203 11:05:00.867254 4702 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:05:00 crc kubenswrapper[4702]: E1203 11:05:00.867273 4702 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 11:05:00 crc kubenswrapper[4702]: E1203 11:05:00.867306 4702 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 
11:05:00 crc kubenswrapper[4702]: E1203 11:05:00.867314 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 11:06:04.867292464 +0000 UTC m=+148.703220968 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:05:00 crc kubenswrapper[4702]: E1203 11:05:00.867336 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 11:06:04.867327955 +0000 UTC m=+148.703256519 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 11:05:00 crc kubenswrapper[4702]: E1203 11:05:00.867214 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 11:05:00 crc kubenswrapper[4702]: E1203 11:05:00.867351 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 11:06:04.867342145 +0000 UTC m=+148.703270699 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 11:05:00 crc kubenswrapper[4702]: E1203 11:05:00.867492 4702 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 11:05:00 crc kubenswrapper[4702]: E1203 11:05:00.867530 4702 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:05:00 crc kubenswrapper[4702]: E1203 11:05:00.867637 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 11:06:04.867611055 +0000 UTC m=+148.703539529 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.875563 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.875598 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.875613 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.875633 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.875645 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:05:00Z","lastTransitionTime":"2025-12-03T11:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.927784 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:05:00 crc kubenswrapper[4702]: E1203 11:05:00.927955 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.979148 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.979227 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.979252 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.979276 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:05:00 crc kubenswrapper[4702]: I1203 11:05:00.979289 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:05:00Z","lastTransitionTime":"2025-12-03T11:05:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
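The volume-manager entries above show two distinct failures being parked in backoff: the CSI teardown fails because the kubevirt.io.hostpath-provisioner driver has not yet re-registered with this kubelet after restart, and the projected/secret/configmap mounts fail because the kubelet's object caches for those namespaces are not yet populated (the "not registered" errors). All five operations are deferred to the same deadline, 11:06:04 (m=+148). A minimal Python sketch, assuming this excerpt is saved to a local file (the name kubelet.log is hypothetical), that lists each parked operation and its retry deadline:

    import re

    LOG = "kubelet.log"  # hypothetical: a saved copy of this journal excerpt

    # nestedpendingoperations.go:348 lines carry the volume, the pod UID, and
    # the "No retries permitted until" deadline in a fixed format.
    pat = re.compile(
        r'Operation for "\{volumeName:(?P<vol>\S+) podName:(?P<pod>\S*) nodeName:\}" failed\. '
        r'No retries permitted until (?P<until>\S+ \S+)'
    )

    with open(LOG) as f:
        for line in f:
            m = pat.search(line)
            if m:
                print(f"{m['vol']}\n    pod={m['pod'] or '-'}  retry-after={m['until']}")

Against this excerpt it would print one UnmountVolume.TearDown (the hostpath-provisioner PVC) and four MountVolume.SetUp operations, all with an 11:06:04 deadline.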
Has your network provider started?"} Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.082600 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.082650 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.082672 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.082690 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.082708 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:05:01Z","lastTransitionTime":"2025-12-03T11:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.185127 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.185179 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.185192 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.185208 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.185219 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:05:01Z","lastTransitionTime":"2025-12-03T11:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.288137 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.288211 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.288229 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.288250 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.288262 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:05:01Z","lastTransitionTime":"2025-12-03T11:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.391628 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.391675 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.391684 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.391700 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.391711 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:05:01Z","lastTransitionTime":"2025-12-03T11:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.494139 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.494261 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.494274 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.494291 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.494300 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:05:01Z","lastTransitionTime":"2025-12-03T11:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.597450 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.597500 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.597512 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.597530 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.597544 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:05:01Z","lastTransitionTime":"2025-12-03T11:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.699985 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.700041 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.700058 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.700076 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.700088 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:05:01Z","lastTransitionTime":"2025-12-03T11:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.803369 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.803470 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.803488 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.803514 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.803530 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:05:01Z","lastTransitionTime":"2025-12-03T11:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.907197 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.907270 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.907290 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.907315 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.907332 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:05:01Z","lastTransitionTime":"2025-12-03T11:05:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.927400 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.927917 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:05:01 crc kubenswrapper[4702]: E1203 11:05:01.928532 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 11:05:01 crc kubenswrapper[4702]: E1203 11:05:01.928650 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.928898 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:05:01 crc kubenswrapper[4702]: E1203 11:05:01.929118 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6jzjr" podUID="11bb1bad-4b90-4366-9187-8d27480f670b" Dec 03 11:05:01 crc kubenswrapper[4702]: I1203 11:05:01.941355 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.010810 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.010851 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.010863 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.010884 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.010897 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:05:02Z","lastTransitionTime":"2025-12-03T11:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.114167 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.114345 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.114389 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.114477 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.114507 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:05:02Z","lastTransitionTime":"2025-12-03T11:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.218204 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.218254 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.218263 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.218279 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.218288 4702 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T11:05:02Z","lastTransitionTime":"2025-12-03T11:05:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.321342 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.321415 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.321427 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.321444 4702 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.321606 4702 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.359829 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-gzc7n"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.360419 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-gzc7n" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.361472 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-qzzlq"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.362129 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qzzlq" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.362958 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x5ghh"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.363255 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.364486 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.365114 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.365233 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.365426 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.365534 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.365737 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.365864 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.365979 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.366205 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.366321 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.366621 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.366730 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.368376 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5xk7q"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.369053 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.369733 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.370192 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.370916 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.371573 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-g8wwh"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.371939 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.372358 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.376549 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qxpq4"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.377113 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.377395 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6tdv"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.378068 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6tdv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.378339 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qxpq4" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.378614 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.379364 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-trxrx"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.380190 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-trxrx" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.387742 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nmmt2"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.410600 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fppws"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.415315 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.416924 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.417781 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.428141 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.428396 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.428503 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.428636 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.428672 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.428849 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.429091 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.430835 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.431263 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.431713 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.431768 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.431826 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.431890 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 
11:05:02.432048 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.432173 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.432179 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.432827 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fppws" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.433119 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nmmt2" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.433195 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.433292 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.433457 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.433565 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.433608 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.433579 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.433732 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.433744 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.433809 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.433899 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.433973 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.434012 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.433984 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.434117 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 
11:05:02.434214 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.434231 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.434325 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.434421 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.434436 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.434522 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.434556 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.434587 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.434643 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.434364 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.434672 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.434593 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.434786 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.434371 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.434920 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.434427 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.434791 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.435043 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pg9xx"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.435163 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.435310 4702 
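Once the API becomes reachable, reflector.go:368 emits one "Caches populated" line per watched ConfigMap or Secret; these informer caches are exactly what the earlier "object ... not registered" mount errors were waiting on. A sketch (same hypothetical kubelet.log) that tallies them per namespace and kind, to confirm which namespaces have recovered:

    import re
    from collections import Counter

    LOG = "kubelet.log"  # hypothetical: a saved copy of this journal excerpt

    # reflector.go:368 lines record which namespaced object's cache is ready.
    pat = re.compile(r'Caches populated for \*v1\.(\w+) from object-"([^"]+)"/"([^"]+)"')
    tally = Counter()

    with open(LOG) as f:
        for line in f:
            m = pat.search(line)
            if m:
                kind, ns, _name = m.groups()
                tally[(ns, kind)] += 1

    for (ns, kind), n in sorted(tally.items()):
        print(f"{ns:45s} {kind:9s} {n}")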
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.435391 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pg9xx"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.435449 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.435564 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.435725 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-hndf6"]
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.436226 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-hndf6"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.440129 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.440357 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.440502 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.440875 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v6p66"]
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.441385 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-ccdtg"]
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.441699 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b8s4f"]
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.442120 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b8s4f"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.442472 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dw786"]
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.442830 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dw786"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.443306 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-njlsm"]
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.442120 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.443489 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ccdtg"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.443706 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.444093 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.444238 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.444258 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.444387 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.444410 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.444496 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.444610 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.444863 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.445479 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.445518 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.447672 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.448221 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.452617 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.453095 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pp87r"]
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.453615 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pp87r"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.454513 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-x85q2"]
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.455656 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-x85q2"
Need to start a new one" pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.457518 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-gzc7n"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.478170 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.478433 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.478649 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.479652 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.482390 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zthg4"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.486672 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zthg4" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.493327 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/afd002b9-3309-4286-a7cb-29c2f4817feb-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fppws\" (UID: \"afd002b9-3309-4286-a7cb-29c2f4817feb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fppws" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.493371 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a456460b-47a3-48ef-a98b-4f67709d5939-serving-cert\") pod \"openshift-config-operator-7777fb866f-fv5c5\" (UID: \"a456460b-47a3-48ef-a98b-4f67709d5939\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.493412 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-etcd-client\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.493435 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jstnx\" (UniqueName: \"kubernetes.io/projected/be3d3f8d-f407-4b54-8e9b-a5b526babb52-kube-api-access-jstnx\") pod \"route-controller-manager-6576b87f9c-s62lk\" (UID: \"be3d3f8d-f407-4b54-8e9b-a5b526babb52\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.493459 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d7be37fc-7374-46ae-a0e3-1cafab3430ec-audit-policies\") pod 
\"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.493480 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d49febe3-b867-419d-955c-a9a7b0a658c3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-trxrx\" (UID: \"d49febe3-b867-419d-955c-a9a7b0a658c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-trxrx" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.493511 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-image-import-ca\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.493534 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d7be37fc-7374-46ae-a0e3-1cafab3430ec-encryption-config\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.493554 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.493905 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.494883 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.493559 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.495074 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.495101 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfvbr\" (UniqueName: \"kubernetes.io/projected/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-kube-api-access-qfvbr\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.493559 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 03 11:05:02 crc 
kubenswrapper[4702]: I1203 11:05:02.497062 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.494117 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.497358 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.494221 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.494624 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.498059 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.498229 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.500917 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.495154 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7be37fc-7374-46ae-a0e3-1cafab3430ec-serving-cert\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.504652 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.503634 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/01e24add-292c-4a3c-8a32-75ceb16ced89-etcd-client\") pod \"etcd-operator-b45778765-g8wwh\" (UID: \"01e24add-292c-4a3c-8a32-75ceb16ced89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.511783 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.511838 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.512233 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 
11:05:02.512276 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/906f2138-1584-4399-8c87-d03f3231ddc7-config\") pod \"machine-approver-56656f9798-qzzlq\" (UID: \"906f2138-1584-4399-8c87-d03f3231ddc7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qzzlq" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.512314 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2c99e1fd-b0d0-418c-bb67-f638f06978f2-audit-dir\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.512338 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkxwh\" (UniqueName: \"kubernetes.io/projected/b8e73047-6376-4bd9-8ec8-5966f8786e5d-kube-api-access-vkxwh\") pod \"machine-api-operator-5694c8668f-gzc7n\" (UID: \"b8e73047-6376-4bd9-8ec8-5966f8786e5d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gzc7n" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.512363 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d7be37fc-7374-46ae-a0e3-1cafab3430ec-etcd-client\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.512390 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-config\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.512412 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37b44be4-8fc9-4b74-9339-8bb658d866fc-config\") pod \"kube-controller-manager-operator-78b949d7b-m6tdv\" (UID: \"37b44be4-8fc9-4b74-9339-8bb658d866fc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6tdv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.512537 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.512605 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc476661-200c-4feb-8b45-fe203009356a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nmmt2\" (UID: \"fc476661-200c-4feb-8b45-fe203009356a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nmmt2" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.512664 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
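The kubenswrapper records above are klog-formatted: a severity-and-date header (I1203 = Info, Dec 03), a microsecond timestamp, the kubelet PID (4702), the emitting source location (reconciler_common.go:245, reflector.go:368, ...), and a structured message of key="value" pairs. Timestamps inside a burst are not strictly monotonic (.493559 appears after .495101 above) because concurrent kubelet goroutines emit independently and the journal keeps arrival order. For triage it can help to tally records by call site; below is a minimal sketch in Go using only the standard library. The regular expression and its group layout are assumptions of this note, keyed to the lines above, not anything shipped with the kubelet.

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// klogLine matches journald-wrapped klog output such as:
//   Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.493480 4702 reconciler_common.go:245] "..."
// The capture groups (severity, date, time, source location, message) are
// this sketch's own naming; only the header shape comes from klog.
var klogLine = regexp.MustCompile(
	`kubenswrapper\[\d+\]: ([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+\d+ ([\w.]+:\d+)\] (.*)$`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // individual records can be long
	counts := map[string]int{}
	for sc.Scan() {
		m := klogLine.FindStringSubmatch(sc.Text())
		if m == nil {
			continue // not a kubenswrapper/klog record
		}
		counts[m[4]]++ // tally by source file:line, e.g. reconciler_common.go:245
	}
	for loc, n := range counts {
		fmt.Printf("%6d %s\n", n, loc)
	}
}

Fed this section on stdin (one record per line), such a tally would show reconciler_common.go:245 dominating, which is exactly the VerifyControllerAttachedVolume burst running above and below.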
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.512664 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.512676 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.512812 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.513038 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.513515 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/906f2138-1584-4399-8c87-d03f3231ddc7-auth-proxy-config\") pod \"machine-approver-56656f9798-qzzlq\" (UID: \"906f2138-1584-4399-8c87-d03f3231ddc7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qzzlq"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.514127 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e5c1c8f7-5c07-4735-aa10-a7885f5bec8f-metrics-tls\") pod \"dns-operator-744455d44c-qxpq4\" (UID: \"e5c1c8f7-5c07-4735-aa10-a7885f5bec8f\") " pod="openshift-dns-operator/dns-operator-744455d44c-qxpq4"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.514161 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfcgt\" (UniqueName: \"kubernetes.io/projected/afd002b9-3309-4286-a7cb-29c2f4817feb-kube-api-access-tfcgt\") pod \"cluster-samples-operator-665b6dd947-fppws\" (UID: \"afd002b9-3309-4286-a7cb-29c2f4817feb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fppws"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.514228 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfjp2\" (UniqueName: \"kubernetes.io/projected/a456460b-47a3-48ef-a98b-4f67709d5939-kube-api-access-jfjp2\") pod \"openshift-config-operator-7777fb866f-fv5c5\" (UID: \"a456460b-47a3-48ef-a98b-4f67709d5939\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.514377 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b8e73047-6376-4bd9-8ec8-5966f8786e5d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-gzc7n\" (UID: \"b8e73047-6376-4bd9-8ec8-5966f8786e5d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gzc7n"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.514409 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-audit\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.514450 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxjp7\" (UniqueName: \"kubernetes.io/projected/906f2138-1584-4399-8c87-d03f3231ddc7-kube-api-access-pxjp7\") pod \"machine-approver-56656f9798-qzzlq\" (UID: \"906f2138-1584-4399-8c87-d03f3231ddc7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qzzlq"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.514479 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-node-pullsecrets\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.514545 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be3d3f8d-f407-4b54-8e9b-a5b526babb52-client-ca\") pod \"route-controller-manager-6576b87f9c-s62lk\" (UID: \"be3d3f8d-f407-4b54-8e9b-a5b526babb52\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.514608 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d7be37fc-7374-46ae-a0e3-1cafab3430ec-audit-dir\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.514637 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a456460b-47a3-48ef-a98b-4f67709d5939-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fv5c5\" (UID: \"a456460b-47a3-48ef-a98b-4f67709d5939\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.514697 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-serving-cert\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.514816 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.514943 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-encryption-config\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.514967 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/906f2138-1584-4399-8c87-d03f3231ddc7-machine-approver-tls\") pod \"machine-approver-56656f9798-qzzlq\" (UID: \"906f2138-1584-4399-8c87-d03f3231ddc7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qzzlq"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.514989 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515005 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-etcd-serving-ca\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515019 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37b44be4-8fc9-4b74-9339-8bb658d866fc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-m6tdv\" (UID: \"37b44be4-8fc9-4b74-9339-8bb658d866fc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6tdv"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515034 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01e24add-292c-4a3c-8a32-75ceb16ced89-config\") pod \"etcd-operator-b45778765-g8wwh\" (UID: \"01e24add-292c-4a3c-8a32-75ceb16ced89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515099 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515114 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8e73047-6376-4bd9-8ec8-5966f8786e5d-config\") pod \"machine-api-operator-5694c8668f-gzc7n\" (UID: \"b8e73047-6376-4bd9-8ec8-5966f8786e5d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gzc7n"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515131 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37b44be4-8fc9-4b74-9339-8bb658d866fc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-m6tdv\" (UID: \"37b44be4-8fc9-4b74-9339-8bb658d866fc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6tdv"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515187 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/01e24add-292c-4a3c-8a32-75ceb16ced89-etcd-ca\") pod \"etcd-operator-b45778765-g8wwh\" (UID: \"01e24add-292c-4a3c-8a32-75ceb16ced89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515225 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d49febe3-b867-419d-955c-a9a7b0a658c3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-trxrx\" (UID: \"d49febe3-b867-419d-955c-a9a7b0a658c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-trxrx"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515243 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01e24add-292c-4a3c-8a32-75ceb16ced89-serving-cert\") pod \"etcd-operator-b45778765-g8wwh\" (UID: \"01e24add-292c-4a3c-8a32-75ceb16ced89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515261 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d7be37fc-7374-46ae-a0e3-1cafab3430ec-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515281 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515347 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5qgm\" (UniqueName: \"kubernetes.io/projected/2c99e1fd-b0d0-418c-bb67-f638f06978f2-kube-api-access-r5qgm\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515368 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7be37fc-7374-46ae-a0e3-1cafab3430ec-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515388 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc476661-200c-4feb-8b45-fe203009356a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nmmt2\" (UID: \"fc476661-200c-4feb-8b45-fe203009356a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nmmt2"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515415 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515468 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515539 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-audit-policies\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515598 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b8e73047-6376-4bd9-8ec8-5966f8786e5d-images\") pod \"machine-api-operator-5694c8668f-gzc7n\" (UID: \"b8e73047-6376-4bd9-8ec8-5966f8786e5d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gzc7n"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515621 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be3d3f8d-f407-4b54-8e9b-a5b526babb52-config\") pod \"route-controller-manager-6576b87f9c-s62lk\" (UID: \"be3d3f8d-f407-4b54-8e9b-a5b526babb52\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515647 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d49febe3-b867-419d-955c-a9a7b0a658c3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-trxrx\" (UID: \"d49febe3-b867-419d-955c-a9a7b0a658c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-trxrx"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515676 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be3d3f8d-f407-4b54-8e9b-a5b526babb52-serving-cert\") pod \"route-controller-manager-6576b87f9c-s62lk\" (UID: \"be3d3f8d-f407-4b54-8e9b-a5b526babb52\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515696 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8sbd\" (UniqueName: \"kubernetes.io/projected/d7be37fc-7374-46ae-a0e3-1cafab3430ec-kube-api-access-v8sbd\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515715 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfm8t\" (UniqueName: \"kubernetes.io/projected/e5c1c8f7-5c07-4735-aa10-a7885f5bec8f-kube-api-access-xfm8t\") pod \"dns-operator-744455d44c-qxpq4\" (UID: \"e5c1c8f7-5c07-4735-aa10-a7885f5bec8f\") " pod="openshift-dns-operator/dns-operator-744455d44c-qxpq4"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515743 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/01e24add-292c-4a3c-8a32-75ceb16ced89-etcd-service-ca\") pod \"etcd-operator-b45778765-g8wwh\" (UID: \"01e24add-292c-4a3c-8a32-75ceb16ced89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515779 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj6d5\" (UniqueName: \"kubernetes.io/projected/01e24add-292c-4a3c-8a32-75ceb16ced89-kube-api-access-hj6d5\") pod \"etcd-operator-b45778765-g8wwh\" (UID: \"01e24add-292c-4a3c-8a32-75ceb16ced89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515804 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-audit-dir\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515828 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d49febe3-b867-419d-955c-a9a7b0a658c3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-trxrx\" (UID: \"d49febe3-b867-419d-955c-a9a7b0a658c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-trxrx"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515849 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d49febe3-b867-419d-955c-a9a7b0a658c3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-trxrx\" (UID: \"d49febe3-b867-419d-955c-a9a7b0a658c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-trxrx"
Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.515870 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc476661-200c-4feb-8b45-fe203009356a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nmmt2\" (UID: \"fc476661-200c-4feb-8b45-fe203009356a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nmmt2"
object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.519878 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.521174 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rh69s"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.521701 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rh69s" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.523319 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.525651 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nnrsp"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.526351 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-x4pq9"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.526696 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.526898 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x4pq9" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.528850 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-x92bs"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.529572 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-x92bs" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.529803 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lnnbr"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.530210 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.530654 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.531081 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.534528 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6wcg"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.535051 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6wcg" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.538562 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.541970 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.542295 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-54zxt"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.543173 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.543333 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rk4kf"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.543406 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-54zxt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.547214 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rk4kf" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.547550 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.548088 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.556509 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.558385 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-r54j5"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.558724 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.559545 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-r54j5" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.567952 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.568434 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-bnkx7"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.571022 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.571971 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ql9kq"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.572075 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-bnkx7" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.572989 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.575837 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.578676 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x5ghh"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.584817 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.585930 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-f8vb8"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.586941 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-f8vb8" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.587191 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.590864 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5xk7q"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.592149 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-bpfvn"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.593097 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bpfvn" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.593520 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nmmt2"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.597409 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-nqtkf"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.598295 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-nqtkf" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.601354 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6tdv"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.602835 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.610533 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-g8wwh"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.612975 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fppws"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.613889 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.616293 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pscld"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.616732 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d49febe3-b867-419d-955c-a9a7b0a658c3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-trxrx\" (UID: \"d49febe3-b867-419d-955c-a9a7b0a658c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-trxrx" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.616789 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01e24add-292c-4a3c-8a32-75ceb16ced89-serving-cert\") pod \"etcd-operator-b45778765-g8wwh\" (UID: \"01e24add-292c-4a3c-8a32-75ceb16ced89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.616820 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1a56343e-6342-4982-9ff1-8bae70d5771a-proxy-tls\") pod \"machine-config-operator-74547568cd-rh69s\" (UID: \"1a56343e-6342-4982-9ff1-8bae70d5771a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rh69s" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617007 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d7be37fc-7374-46ae-a0e3-1cafab3430ec-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617066 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af4eb68c-2986-486a-9b9a-4905bd264322-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-pg9xx\" (UID: \"af4eb68c-2986-486a-9b9a-4905bd264322\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pg9xx" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617090 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
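The kubelet.go:2421 and kubelet.go:2428 records are the kubelet sync loop ingesting pod ADD and UPDATE events from the API server; for each newly added pod, util.go:30 observes that no sandbox exists on this node yet, so a new sandbox must be created before any container can start. A minimal sketch of that dispatch shape follows, with the event type and sandbox bookkeeping invented for illustration; the real sync loop handles many more event sources and states.

package main

import "fmt"

// Toy sync-loop dispatch, loosely shaped like the "SyncLoop ADD"/"SyncLoop
// UPDATE" records above; podEvent and the sandboxes map are made up here.
type podEvent struct {
	op  string // "ADD" or "UPDATE"
	pod string // namespace/name
}

func syncLoop(events <-chan podEvent, sandboxes map[string]bool) {
	for ev := range events {
		fmt.Printf("SyncLoop %s source=api pod=%q\n", ev.op, ev.pod)
		if ev.op == "ADD" && !sandboxes[ev.pod] {
			// Mirrors util.go:30: "No sandbox for pod can be found. Need to start a new one"
			fmt.Printf("no sandbox for %q; starting a new one\n", ev.pod)
			sandboxes[ev.pod] = true
		}
	}
}

func main() {
	ch := make(chan podEvent, 2)
	ch <- podEvent{"ADD", "openshift-dns/dns-default-f8vb8"}
	ch <- podEvent{"UPDATE", "openshift-dns/dns-default-f8vb8"}
	close(ch)
	syncLoop(ch, map[string]bool{})
}

The UPDATE records interleaved with the volume activity are expected during startup: as volumes mount and status changes, the API object is patched and the sync loop sees the pod again.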
\"kubernetes.io/configmap/2b38e6ac-e12a-4798-b0e9-6321dc926487-service-ca-bundle\") pod \"router-default-5444994796-x85q2\" (UID: \"2b38e6ac-e12a-4798-b0e9-6321dc926487\") " pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617109 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617128 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5qgm\" (UniqueName: \"kubernetes.io/projected/2c99e1fd-b0d0-418c-bb67-f638f06978f2-kube-api-access-r5qgm\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617155 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7be37fc-7374-46ae-a0e3-1cafab3430ec-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617174 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc476661-200c-4feb-8b45-fe203009356a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nmmt2\" (UID: \"fc476661-200c-4feb-8b45-fe203009356a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nmmt2" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617192 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-config\") pod \"controller-manager-879f6c89f-njlsm\" (UID: \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617210 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617225 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617246 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b8e73047-6376-4bd9-8ec8-5966f8786e5d-images\") pod \"machine-api-operator-5694c8668f-gzc7n\" 
(UID: \"b8e73047-6376-4bd9-8ec8-5966f8786e5d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gzc7n" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617265 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be3d3f8d-f407-4b54-8e9b-a5b526babb52-config\") pod \"route-controller-manager-6576b87f9c-s62lk\" (UID: \"be3d3f8d-f407-4b54-8e9b-a5b526babb52\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617282 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-trusted-ca-bundle\") pod \"console-f9d7485db-ccdtg\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617298 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3895be1f-db04-45b3-bd8c-cf2ab8c2aa43-profile-collector-cert\") pod \"catalog-operator-68c6474976-fvzsf\" (UID: \"3895be1f-db04-45b3-bd8c-cf2ab8c2aa43\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617316 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4vqb\" (UniqueName: \"kubernetes.io/projected/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-kube-api-access-x4vqb\") pod \"controller-manager-879f6c89f-njlsm\" (UID: \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617345 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-audit-policies\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617365 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d49febe3-b867-419d-955c-a9a7b0a658c3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-trxrx\" (UID: \"d49febe3-b867-419d-955c-a9a7b0a658c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-trxrx" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617386 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2b38e6ac-e12a-4798-b0e9-6321dc926487-default-certificate\") pod \"router-default-5444994796-x85q2\" (UID: \"2b38e6ac-e12a-4798-b0e9-6321dc926487\") " pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617406 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/761a509d-5cb8-4506-901c-614a7d633d39-console-serving-cert\") pod \"console-f9d7485db-ccdtg\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " 
pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617425 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw4kb\" (UniqueName: \"kubernetes.io/projected/db0547ad-edd5-4b70-af0f-9f606793e6a9-kube-api-access-pw4kb\") pod \"openshift-controller-manager-operator-756b6f6bc6-b8s4f\" (UID: \"db0547ad-edd5-4b70-af0f-9f606793e6a9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b8s4f" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617443 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfm8t\" (UniqueName: \"kubernetes.io/projected/e5c1c8f7-5c07-4735-aa10-a7885f5bec8f-kube-api-access-xfm8t\") pod \"dns-operator-744455d44c-qxpq4\" (UID: \"e5c1c8f7-5c07-4735-aa10-a7885f5bec8f\") " pod="openshift-dns-operator/dns-operator-744455d44c-qxpq4" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617460 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fflsr\" (UniqueName: \"kubernetes.io/projected/d5cab780-360c-48fa-9c88-36e15f37da47-kube-api-access-fflsr\") pod \"cluster-image-registry-operator-dc59b4c8b-pp87r\" (UID: \"d5cab780-360c-48fa-9c88-36e15f37da47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pp87r" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617482 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bad766d-e524-4670-b353-56e92df2f744-config\") pod \"authentication-operator-69f744f599-v6p66\" (UID: \"5bad766d-e524-4670-b353-56e92df2f744\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617500 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be3d3f8d-f407-4b54-8e9b-a5b526babb52-serving-cert\") pod \"route-controller-manager-6576b87f9c-s62lk\" (UID: \"be3d3f8d-f407-4b54-8e9b-a5b526babb52\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617516 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8sbd\" (UniqueName: \"kubernetes.io/projected/d7be37fc-7374-46ae-a0e3-1cafab3430ec-kube-api-access-v8sbd\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617534 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2b38e6ac-e12a-4798-b0e9-6321dc926487-stats-auth\") pod \"router-default-5444994796-x85q2\" (UID: \"2b38e6ac-e12a-4798-b0e9-6321dc926487\") " pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617554 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8lvp\" (UniqueName: \"kubernetes.io/projected/5bad766d-e524-4670-b353-56e92df2f744-kube-api-access-q8lvp\") pod \"authentication-operator-69f744f599-v6p66\" (UID: \"5bad766d-e524-4670-b353-56e92df2f744\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617574 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj6d5\" (UniqueName: \"kubernetes.io/projected/01e24add-292c-4a3c-8a32-75ceb16ced89-kube-api-access-hj6d5\") pod \"etcd-operator-b45778765-g8wwh\" (UID: \"01e24add-292c-4a3c-8a32-75ceb16ced89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617590 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/01e24add-292c-4a3c-8a32-75ceb16ced89-etcd-service-ca\") pod \"etcd-operator-b45778765-g8wwh\" (UID: \"01e24add-292c-4a3c-8a32-75ceb16ced89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617609 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-audit-dir\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617625 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d49febe3-b867-419d-955c-a9a7b0a658c3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-trxrx\" (UID: \"d49febe3-b867-419d-955c-a9a7b0a658c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-trxrx" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617646 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d49febe3-b867-419d-955c-a9a7b0a658c3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-trxrx\" (UID: \"d49febe3-b867-419d-955c-a9a7b0a658c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-trxrx" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617669 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc476661-200c-4feb-8b45-fe203009356a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nmmt2\" (UID: \"fc476661-200c-4feb-8b45-fe203009356a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nmmt2" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617687 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/afd002b9-3309-4286-a7cb-29c2f4817feb-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fppws\" (UID: \"afd002b9-3309-4286-a7cb-29c2f4817feb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fppws" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617707 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6edf33c-728d-482f-ad5c-ceb85dae3b75-serving-cert\") pod \"console-operator-58897d9998-nnrsp\" (UID: \"c6edf33c-728d-482f-ad5c-ceb85dae3b75\") " pod="openshift-console-operator/console-operator-58897d9998-nnrsp" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617725 4702 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bad766d-e524-4670-b353-56e92df2f744-serving-cert\") pod \"authentication-operator-69f744f599-v6p66\" (UID: \"5bad766d-e524-4670-b353-56e92df2f744\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617744 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af4eb68c-2986-486a-9b9a-4905bd264322-config\") pod \"openshift-apiserver-operator-796bbdcf4f-pg9xx\" (UID: \"af4eb68c-2986-486a-9b9a-4905bd264322\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pg9xx" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617779 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0bce3a1f-cd2e-41b9-b768-322f3ce72ed9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zthg4\" (UID: \"0bce3a1f-cd2e-41b9-b768-322f3ce72ed9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zthg4" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617802 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1a56343e-6342-4982-9ff1-8bae70d5771a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rh69s\" (UID: \"1a56343e-6342-4982-9ff1-8bae70d5771a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rh69s" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617821 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-client-ca\") pod \"controller-manager-879f6c89f-njlsm\" (UID: \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617842 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a456460b-47a3-48ef-a98b-4f67709d5939-serving-cert\") pod \"openshift-config-operator-7777fb866f-fv5c5\" (UID: \"a456460b-47a3-48ef-a98b-4f67709d5939\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617858 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-etcd-client\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617875 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-console-config\") pod \"console-f9d7485db-ccdtg\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617924 4702 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-pscld" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.617939 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db0547ad-edd5-4b70-af0f-9f606793e6a9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-b8s4f\" (UID: \"db0547ad-edd5-4b70-af0f-9f606793e6a9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b8s4f" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.618266 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ccdtg"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.618875 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-audit-policies\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.618890 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d7be37fc-7374-46ae-a0e3-1cafab3430ec-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.618997 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-serving-cert\") pod \"controller-manager-879f6c89f-njlsm\" (UID: \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.619009 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/d7be37fc-7374-46ae-a0e3-1cafab3430ec-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.618947 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/d49febe3-b867-419d-955c-a9a7b0a658c3-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-trxrx\" (UID: \"d49febe3-b867-419d-955c-a9a7b0a658c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-trxrx" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.619060 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jstnx\" (UniqueName: \"kubernetes.io/projected/be3d3f8d-f407-4b54-8e9b-a5b526babb52-kube-api-access-jstnx\") pod \"route-controller-manager-6576b87f9c-s62lk\" (UID: \"be3d3f8d-f407-4b54-8e9b-a5b526babb52\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.619089 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d7be37fc-7374-46ae-a0e3-1cafab3430ec-audit-policies\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: 
\"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.619133 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-image-import-ca\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.619160 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d7be37fc-7374-46ae-a0e3-1cafab3430ec-encryption-config\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.619173 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-audit-dir\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.619185 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d49febe3-b867-419d-955c-a9a7b0a658c3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-trxrx\" (UID: \"d49febe3-b867-419d-955c-a9a7b0a658c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-trxrx" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.619213 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/d49febe3-b867-419d-955c-a9a7b0a658c3-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-trxrx\" (UID: \"d49febe3-b867-419d-955c-a9a7b0a658c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-trxrx" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.619270 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d5cab780-360c-48fa-9c88-36e15f37da47-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pp87r\" (UID: \"d5cab780-360c-48fa-9c88-36e15f37da47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pp87r" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.619316 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.619342 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.619372 4702 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfvbr\" (UniqueName: \"kubernetes.io/projected/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-kube-api-access-qfvbr\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.619742 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/01e24add-292c-4a3c-8a32-75ceb16ced89-etcd-service-ca\") pod \"etcd-operator-b45778765-g8wwh\" (UID: \"01e24add-292c-4a3c-8a32-75ceb16ced89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.620094 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d7be37fc-7374-46ae-a0e3-1cafab3430ec-audit-policies\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.620394 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d49febe3-b867-419d-955c-a9a7b0a658c3-service-ca\") pod \"cluster-version-operator-5c965bbfc6-trxrx\" (UID: \"d49febe3-b867-419d-955c-a9a7b0a658c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-trxrx" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.620500 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lnnbr"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.621247 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-image-import-ca\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.622103 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.623906 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be3d3f8d-f407-4b54-8e9b-a5b526babb52-config\") pod \"route-controller-manager-6576b87f9c-s62lk\" (UID: \"be3d3f8d-f407-4b54-8e9b-a5b526babb52\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.624226 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/afd002b9-3309-4286-a7cb-29c2f4817feb-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fppws\" (UID: \"afd002b9-3309-4286-a7cb-29c2f4817feb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fppws" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.624303 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7be37fc-7374-46ae-a0e3-1cafab3430ec-serving-cert\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.624349 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.624406 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/906f2138-1584-4399-8c87-d03f3231ddc7-config\") pod \"machine-approver-56656f9798-qzzlq\" (UID: \"906f2138-1584-4399-8c87-d03f3231ddc7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qzzlq" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.624433 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/01e24add-292c-4a3c-8a32-75ceb16ced89-etcd-client\") pod \"etcd-operator-b45778765-g8wwh\" (UID: \"01e24add-292c-4a3c-8a32-75ceb16ced89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.624463 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3895be1f-db04-45b3-bd8c-cf2ab8c2aa43-srv-cert\") pod \"catalog-operator-68c6474976-fvzsf\" (UID: \"3895be1f-db04-45b3-bd8c-cf2ab8c2aa43\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.624477 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.624492 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2c99e1fd-b0d0-418c-bb67-f638f06978f2-audit-dir\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.624523 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkxwh\" (UniqueName: \"kubernetes.io/projected/b8e73047-6376-4bd9-8ec8-5966f8786e5d-kube-api-access-vkxwh\") pod \"machine-api-operator-5694c8668f-gzc7n\" (UID: \"b8e73047-6376-4bd9-8ec8-5966f8786e5d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gzc7n" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.624551 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d7be37fc-7374-46ae-a0e3-1cafab3430ec-etcd-client\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.624650 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/761a509d-5cb8-4506-901c-614a7d633d39-console-oauth-config\") pod \"console-f9d7485db-ccdtg\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.624677 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-service-ca\") pod \"console-f9d7485db-ccdtg\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.624703 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-njlsm\" (UID: \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.624727 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bad766d-e524-4670-b353-56e92df2f744-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v6p66\" (UID: \"5bad766d-e524-4670-b353-56e92df2f744\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.624780 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-config\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.624807 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37b44be4-8fc9-4b74-9339-8bb658d866fc-config\") pod \"kube-controller-manager-operator-78b949d7b-m6tdv\" (UID: \"37b44be4-8fc9-4b74-9339-8bb658d866fc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6tdv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.624995 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d49febe3-b867-419d-955c-a9a7b0a658c3-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-trxrx\" (UID: \"d49febe3-b867-419d-955c-a9a7b0a658c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-trxrx" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.625344 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01e24add-292c-4a3c-8a32-75ceb16ced89-serving-cert\") pod \"etcd-operator-b45778765-g8wwh\" (UID: \"01e24add-292c-4a3c-8a32-75ceb16ced89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.625374 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/d7be37fc-7374-46ae-a0e3-1cafab3430ec-encryption-config\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.625894 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/906f2138-1584-4399-8c87-d03f3231ddc7-config\") pod \"machine-approver-56656f9798-qzzlq\" (UID: \"906f2138-1584-4399-8c87-d03f3231ddc7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qzzlq" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.626122 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.626188 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.626269 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-etcd-client\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.626551 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be3d3f8d-f407-4b54-8e9b-a5b526babb52-serving-cert\") pod \"route-controller-manager-6576b87f9c-s62lk\" (UID: \"be3d3f8d-f407-4b54-8e9b-a5b526babb52\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.628121 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/01e24add-292c-4a3c-8a32-75ceb16ced89-etcd-client\") pod \"etcd-operator-b45778765-g8wwh\" (UID: \"01e24add-292c-4a3c-8a32-75ceb16ced89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.626840 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.627003 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b8e73047-6376-4bd9-8ec8-5966f8786e5d-images\") pod \"machine-api-operator-5694c8668f-gzc7n\" (UID: \"b8e73047-6376-4bd9-8ec8-5966f8786e5d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gzc7n" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.627216 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/37b44be4-8fc9-4b74-9339-8bb658d866fc-config\") pod \"kube-controller-manager-operator-78b949d7b-m6tdv\" (UID: \"37b44be4-8fc9-4b74-9339-8bb658d866fc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6tdv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.627461 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.627466 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-config\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.627521 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.627540 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2c99e1fd-b0d0-418c-bb67-f638f06978f2-audit-dir\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.627641 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.626655 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.628407 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.628412 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qxpq4"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.628447 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc476661-200c-4feb-8b45-fe203009356a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nmmt2\" (UID: \"fc476661-200c-4feb-8b45-fe203009356a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nmmt2" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.628467 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nnrsp"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.628483 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zthg4"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.628496 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-oauth-serving-cert\") pod \"console-f9d7485db-ccdtg\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.628732 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1a56343e-6342-4982-9ff1-8bae70d5771a-images\") pod \"machine-config-operator-74547568cd-rh69s\" (UID: \"1a56343e-6342-4982-9ff1-8bae70d5771a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rh69s" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.628808 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.628851 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.628870 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/906f2138-1584-4399-8c87-d03f3231ddc7-auth-proxy-config\") pod \"machine-approver-56656f9798-qzzlq\" (UID: \"906f2138-1584-4399-8c87-d03f3231ddc7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qzzlq" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.628996 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc476661-200c-4feb-8b45-fe203009356a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nmmt2\" (UID: \"fc476661-200c-4feb-8b45-fe203009356a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nmmt2" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.629065 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6edf33c-728d-482f-ad5c-ceb85dae3b75-config\") pod \"console-operator-58897d9998-nnrsp\" (UID: \"c6edf33c-728d-482f-ad5c-ceb85dae3b75\") " pod="openshift-console-operator/console-operator-58897d9998-nnrsp" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.629133 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfjp2\" (UniqueName: \"kubernetes.io/projected/a456460b-47a3-48ef-a98b-4f67709d5939-kube-api-access-jfjp2\") pod \"openshift-config-operator-7777fb866f-fv5c5\" (UID: \"a456460b-47a3-48ef-a98b-4f67709d5939\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.629185 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b8e73047-6376-4bd9-8ec8-5966f8786e5d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-gzc7n\" (UID: \"b8e73047-6376-4bd9-8ec8-5966f8786e5d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gzc7n" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.629211 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-audit\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.629235 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e5c1c8f7-5c07-4735-aa10-a7885f5bec8f-metrics-tls\") pod \"dns-operator-744455d44c-qxpq4\" (UID: \"e5c1c8f7-5c07-4735-aa10-a7885f5bec8f\") " pod="openshift-dns-operator/dns-operator-744455d44c-qxpq4" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.629309 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfcgt\" (UniqueName: \"kubernetes.io/projected/afd002b9-3309-4286-a7cb-29c2f4817feb-kube-api-access-tfcgt\") pod \"cluster-samples-operator-665b6dd947-fppws\" (UID: \"afd002b9-3309-4286-a7cb-29c2f4817feb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fppws" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.629448 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-hndf6"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.629620 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.630192 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/906f2138-1584-4399-8c87-d03f3231ddc7-auth-proxy-config\") pod \"machine-approver-56656f9798-qzzlq\" (UID: \"906f2138-1584-4399-8c87-d03f3231ddc7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qzzlq" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.630325 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc476661-200c-4feb-8b45-fe203009356a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nmmt2\" (UID: \"fc476661-200c-4feb-8b45-fe203009356a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nmmt2" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.630329 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-node-pullsecrets\") pod \"apiserver-76f77b778f-5xk7q\" (UID: 
\"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.631045 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxjp7\" (UniqueName: \"kubernetes.io/projected/906f2138-1584-4399-8c87-d03f3231ddc7-kube-api-access-pxjp7\") pod \"machine-approver-56656f9798-qzzlq\" (UID: \"906f2138-1584-4399-8c87-d03f3231ddc7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qzzlq" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.631079 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6edf33c-728d-482f-ad5c-ceb85dae3b75-trusted-ca\") pod \"console-operator-58897d9998-nnrsp\" (UID: \"c6edf33c-728d-482f-ad5c-ceb85dae3b75\") " pod="openshift-console-operator/console-operator-58897d9998-nnrsp" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.631109 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5cab780-360c-48fa-9c88-36e15f37da47-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pp87r\" (UID: \"d5cab780-360c-48fa-9c88-36e15f37da47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pp87r" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.631138 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d7be37fc-7374-46ae-a0e3-1cafab3430ec-audit-dir\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.631163 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsk45\" (UniqueName: \"kubernetes.io/projected/05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8-kube-api-access-rsk45\") pod \"downloads-7954f5f757-hndf6\" (UID: \"05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8\") " pod="openshift-console/downloads-7954f5f757-hndf6" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.630377 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-node-pullsecrets\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.631185 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bad766d-e524-4670-b353-56e92df2f744-service-ca-bundle\") pod \"authentication-operator-69f744f599-v6p66\" (UID: \"5bad766d-e524-4670-b353-56e92df2f744\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.630394 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-x92bs"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.631216 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/be3d3f8d-f407-4b54-8e9b-a5b526babb52-client-ca\") pod \"route-controller-manager-6576b87f9c-s62lk\" (UID: \"be3d3f8d-f407-4b54-8e9b-a5b526babb52\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.630567 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7be37fc-7374-46ae-a0e3-1cafab3430ec-serving-cert\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.631243 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a456460b-47a3-48ef-a98b-4f67709d5939-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fv5c5\" (UID: \"a456460b-47a3-48ef-a98b-4f67709d5939\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.630779 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a456460b-47a3-48ef-a98b-4f67709d5939-serving-cert\") pod \"openshift-config-operator-7777fb866f-fv5c5\" (UID: \"a456460b-47a3-48ef-a98b-4f67709d5939\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.631269 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-serving-cert\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.631299 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.631307 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d7be37fc-7374-46ae-a0e3-1cafab3430ec-audit-dir\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.631320 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-encryption-config\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.631345 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rb9v\" (UniqueName: \"kubernetes.io/projected/761a509d-5cb8-4506-901c-614a7d633d39-kube-api-access-5rb9v\") pod \"console-f9d7485db-ccdtg\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 
11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.631802 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.631849 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a456460b-47a3-48ef-a98b-4f67709d5939-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fv5c5\" (UID: \"a456460b-47a3-48ef-a98b-4f67709d5939\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.631860 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6wcg"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.631911 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5ckj\" (UniqueName: \"kubernetes.io/projected/3895be1f-db04-45b3-bd8c-cf2ab8c2aa43-kube-api-access-g5ckj\") pod \"catalog-operator-68c6474976-fvzsf\" (UID: \"3895be1f-db04-45b3-bd8c-cf2ab8c2aa43\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.632196 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db0547ad-edd5-4b70-af0f-9f606793e6a9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-b8s4f\" (UID: \"db0547ad-edd5-4b70-af0f-9f606793e6a9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b8s4f" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.632251 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.632292 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-etcd-serving-ca\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.632312 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/906f2138-1584-4399-8c87-d03f3231ddc7-machine-approver-tls\") pod \"machine-approver-56656f9798-qzzlq\" (UID: \"906f2138-1584-4399-8c87-d03f3231ddc7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qzzlq" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.632352 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjbcs\" (UniqueName: 
\"kubernetes.io/projected/af4eb68c-2986-486a-9b9a-4905bd264322-kube-api-access-vjbcs\") pod \"openshift-apiserver-operator-796bbdcf4f-pg9xx\" (UID: \"af4eb68c-2986-486a-9b9a-4905bd264322\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pg9xx" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.632372 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b38e6ac-e12a-4798-b0e9-6321dc926487-metrics-certs\") pod \"router-default-5444994796-x85q2\" (UID: \"2b38e6ac-e12a-4798-b0e9-6321dc926487\") " pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.632398 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqj7q\" (UniqueName: \"kubernetes.io/projected/c6edf33c-728d-482f-ad5c-ceb85dae3b75-kube-api-access-jqj7q\") pod \"console-operator-58897d9998-nnrsp\" (UID: \"c6edf33c-728d-482f-ad5c-ceb85dae3b75\") " pod="openshift-console-operator/console-operator-58897d9998-nnrsp" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.632427 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.632445 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8e73047-6376-4bd9-8ec8-5966f8786e5d-config\") pod \"machine-api-operator-5694c8668f-gzc7n\" (UID: \"b8e73047-6376-4bd9-8ec8-5966f8786e5d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gzc7n" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.632451 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be3d3f8d-f407-4b54-8e9b-a5b526babb52-client-ca\") pod \"route-controller-manager-6576b87f9c-s62lk\" (UID: \"be3d3f8d-f407-4b54-8e9b-a5b526babb52\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.632463 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37b44be4-8fc9-4b74-9339-8bb658d866fc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-m6tdv\" (UID: \"37b44be4-8fc9-4b74-9339-8bb658d866fc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6tdv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.632498 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37b44be4-8fc9-4b74-9339-8bb658d866fc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-m6tdv\" (UID: \"37b44be4-8fc9-4b74-9339-8bb658d866fc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6tdv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.632515 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01e24add-292c-4a3c-8a32-75ceb16ced89-config\") 
pod \"etcd-operator-b45778765-g8wwh\" (UID: \"01e24add-292c-4a3c-8a32-75ceb16ced89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.632535 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqmz9\" (UniqueName: \"kubernetes.io/projected/1a56343e-6342-4982-9ff1-8bae70d5771a-kube-api-access-dqmz9\") pod \"machine-config-operator-74547568cd-rh69s\" (UID: \"1a56343e-6342-4982-9ff1-8bae70d5771a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rh69s" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.632558 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/01e24add-292c-4a3c-8a32-75ceb16ced89-etcd-ca\") pod \"etcd-operator-b45778765-g8wwh\" (UID: \"01e24add-292c-4a3c-8a32-75ceb16ced89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.632607 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsxnh\" (UniqueName: \"kubernetes.io/projected/2b38e6ac-e12a-4798-b0e9-6321dc926487-kube-api-access-vsxnh\") pod \"router-default-5444994796-x85q2\" (UID: \"2b38e6ac-e12a-4798-b0e9-6321dc926487\") " pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.632624 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxk9n\" (UniqueName: \"kubernetes.io/projected/0bce3a1f-cd2e-41b9-b768-322f3ce72ed9-kube-api-access-wxk9n\") pod \"control-plane-machine-set-operator-78cbb6b69f-zthg4\" (UID: \"0bce3a1f-cd2e-41b9-b768-322f3ce72ed9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zthg4" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.632641 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d5cab780-360c-48fa-9c88-36e15f37da47-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pp87r\" (UID: \"d5cab780-360c-48fa-9c88-36e15f37da47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pp87r" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.633179 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-audit\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.633230 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/01e24add-292c-4a3c-8a32-75ceb16ced89-etcd-ca\") pod \"etcd-operator-b45778765-g8wwh\" (UID: \"01e24add-292c-4a3c-8a32-75ceb16ced89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.633591 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8e73047-6376-4bd9-8ec8-5966f8786e5d-config\") pod \"machine-api-operator-5694c8668f-gzc7n\" (UID: \"b8e73047-6376-4bd9-8ec8-5966f8786e5d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gzc7n" Dec 03 11:05:02 crc 
kubenswrapper[4702]: I1203 11:05:02.634390 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e5c1c8f7-5c07-4735-aa10-a7885f5bec8f-metrics-tls\") pod \"dns-operator-744455d44c-qxpq4\" (UID: \"e5c1c8f7-5c07-4735-aa10-a7885f5bec8f\") " pod="openshift-dns-operator/dns-operator-744455d44c-qxpq4" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.634520 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.634586 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dw786"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.634611 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-njlsm"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.634917 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/b8e73047-6376-4bd9-8ec8-5966f8786e5d-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-gzc7n\" (UID: \"b8e73047-6376-4bd9-8ec8-5966f8786e5d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gzc7n" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.635015 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01e24add-292c-4a3c-8a32-75ceb16ced89-config\") pod \"etcd-operator-b45778765-g8wwh\" (UID: \"01e24add-292c-4a3c-8a32-75ceb16ced89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.635310 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-encryption-config\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.635368 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pg9xx"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.635514 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-etcd-serving-ca\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.635642 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.635652 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-serving-cert\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.636254 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/906f2138-1584-4399-8c87-d03f3231ddc7-machine-approver-tls\") pod \"machine-approver-56656f9798-qzzlq\" (UID: \"906f2138-1584-4399-8c87-d03f3231ddc7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qzzlq" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.636525 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v6p66"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.637628 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b8s4f"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.638713 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pp87r"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.639749 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-bnkx7"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.641123 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-x4pq9"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.641338 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.641488 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37b44be4-8fc9-4b74-9339-8bb658d866fc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-m6tdv\" (UID: \"37b44be4-8fc9-4b74-9339-8bb658d866fc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6tdv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.641510 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.641495 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d7be37fc-7374-46ae-a0e3-1cafab3430ec-etcd-client\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.642231 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.642644 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.643313 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.644451 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rk4kf"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.645743 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rh69s"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.646721 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-r54j5"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.647676 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-54zxt"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.648811 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bpfvn"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.650123 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-f8vb8"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.651295 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.652523 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ql9kq"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.653803 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pscld"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.654840 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s"] Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.675493 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.687088 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.702909 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.722973 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.733637 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2b38e6ac-e12a-4798-b0e9-6321dc926487-default-certificate\") pod \"router-default-5444994796-x85q2\" (UID: \"2b38e6ac-e12a-4798-b0e9-6321dc926487\") " pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.733678 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/761a509d-5cb8-4506-901c-614a7d633d39-console-serving-cert\") pod \"console-f9d7485db-ccdtg\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.733706 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw4kb\" (UniqueName: \"kubernetes.io/projected/db0547ad-edd5-4b70-af0f-9f606793e6a9-kube-api-access-pw4kb\") pod \"openshift-controller-manager-operator-756b6f6bc6-b8s4f\" (UID: \"db0547ad-edd5-4b70-af0f-9f606793e6a9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b8s4f" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.733737 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fflsr\" (UniqueName: \"kubernetes.io/projected/d5cab780-360c-48fa-9c88-36e15f37da47-kube-api-access-fflsr\") pod \"cluster-image-registry-operator-dc59b4c8b-pp87r\" (UID: \"d5cab780-360c-48fa-9c88-36e15f37da47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pp87r" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.733814 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bad766d-e524-4670-b353-56e92df2f744-config\") pod \"authentication-operator-69f744f599-v6p66\" (UID: \"5bad766d-e524-4670-b353-56e92df2f744\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.733856 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2b38e6ac-e12a-4798-b0e9-6321dc926487-stats-auth\") pod \"router-default-5444994796-x85q2\" (UID: \"2b38e6ac-e12a-4798-b0e9-6321dc926487\") " pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.733878 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8lvp\" (UniqueName: \"kubernetes.io/projected/5bad766d-e524-4670-b353-56e92df2f744-kube-api-access-q8lvp\") pod \"authentication-operator-69f744f599-v6p66\" (UID: \"5bad766d-e524-4670-b353-56e92df2f744\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.733921 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6edf33c-728d-482f-ad5c-ceb85dae3b75-serving-cert\") pod \"console-operator-58897d9998-nnrsp\" (UID: \"c6edf33c-728d-482f-ad5c-ceb85dae3b75\") " pod="openshift-console-operator/console-operator-58897d9998-nnrsp" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.733941 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bad766d-e524-4670-b353-56e92df2f744-serving-cert\") pod \"authentication-operator-69f744f599-v6p66\" (UID: \"5bad766d-e524-4670-b353-56e92df2f744\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.733971 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/0bce3a1f-cd2e-41b9-b768-322f3ce72ed9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zthg4\" (UID: \"0bce3a1f-cd2e-41b9-b768-322f3ce72ed9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zthg4" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.733995 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1a56343e-6342-4982-9ff1-8bae70d5771a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rh69s\" (UID: \"1a56343e-6342-4982-9ff1-8bae70d5771a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rh69s" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.734018 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-client-ca\") pod \"controller-manager-879f6c89f-njlsm\" (UID: \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.734041 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af4eb68c-2986-486a-9b9a-4905bd264322-config\") pod \"openshift-apiserver-operator-796bbdcf4f-pg9xx\" (UID: \"af4eb68c-2986-486a-9b9a-4905bd264322\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pg9xx" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.734062 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-console-config\") pod \"console-f9d7485db-ccdtg\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.734083 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db0547ad-edd5-4b70-af0f-9f606793e6a9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-b8s4f\" (UID: \"db0547ad-edd5-4b70-af0f-9f606793e6a9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b8s4f" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.734102 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-serving-cert\") pod \"controller-manager-879f6c89f-njlsm\" (UID: \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.734144 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d5cab780-360c-48fa-9c88-36e15f37da47-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pp87r\" (UID: \"d5cab780-360c-48fa-9c88-36e15f37da47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pp87r" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.734197 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3895be1f-db04-45b3-bd8c-cf2ab8c2aa43-srv-cert\") pod 
\"catalog-operator-68c6474976-fvzsf\" (UID: \"3895be1f-db04-45b3-bd8c-cf2ab8c2aa43\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.734230 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/761a509d-5cb8-4506-901c-614a7d633d39-console-oauth-config\") pod \"console-f9d7485db-ccdtg\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.734253 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-service-ca\") pod \"console-f9d7485db-ccdtg\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.734275 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-njlsm\" (UID: \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.734298 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bad766d-e524-4670-b353-56e92df2f744-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v6p66\" (UID: \"5bad766d-e524-4670-b353-56e92df2f744\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.734324 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-oauth-serving-cert\") pod \"console-f9d7485db-ccdtg\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.734348 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1a56343e-6342-4982-9ff1-8bae70d5771a-images\") pod \"machine-config-operator-74547568cd-rh69s\" (UID: \"1a56343e-6342-4982-9ff1-8bae70d5771a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rh69s" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.734371 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6edf33c-728d-482f-ad5c-ceb85dae3b75-config\") pod \"console-operator-58897d9998-nnrsp\" (UID: \"c6edf33c-728d-482f-ad5c-ceb85dae3b75\") " pod="openshift-console-operator/console-operator-58897d9998-nnrsp" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.734420 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6edf33c-728d-482f-ad5c-ceb85dae3b75-trusted-ca\") pod \"console-operator-58897d9998-nnrsp\" (UID: \"c6edf33c-728d-482f-ad5c-ceb85dae3b75\") " pod="openshift-console-operator/console-operator-58897d9998-nnrsp" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.734550 4702 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5bad766d-e524-4670-b353-56e92df2f744-config\") pod \"authentication-operator-69f744f599-v6p66\" (UID: \"5bad766d-e524-4670-b353-56e92df2f744\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.735888 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bad766d-e524-4670-b353-56e92df2f744-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-v6p66\" (UID: \"5bad766d-e524-4670-b353-56e92df2f744\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.734634 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1a56343e-6342-4982-9ff1-8bae70d5771a-auth-proxy-config\") pod \"machine-config-operator-74547568cd-rh69s\" (UID: \"1a56343e-6342-4982-9ff1-8bae70d5771a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rh69s" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.735495 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af4eb68c-2986-486a-9b9a-4905bd264322-config\") pod \"openshift-apiserver-operator-796bbdcf4f-pg9xx\" (UID: \"af4eb68c-2986-486a-9b9a-4905bd264322\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pg9xx" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.736079 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-service-ca\") pod \"console-f9d7485db-ccdtg\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.736204 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-oauth-serving-cert\") pod \"console-f9d7485db-ccdtg\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.735875 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5cab780-360c-48fa-9c88-36e15f37da47-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pp87r\" (UID: \"d5cab780-360c-48fa-9c88-36e15f37da47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pp87r" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.736274 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsk45\" (UniqueName: \"kubernetes.io/projected/05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8-kube-api-access-rsk45\") pod \"downloads-7954f5f757-hndf6\" (UID: \"05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8\") " pod="openshift-console/downloads-7954f5f757-hndf6" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.736276 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-console-config\") pod \"console-f9d7485db-ccdtg\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " 
pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.736295 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bad766d-e524-4670-b353-56e92df2f744-service-ca-bundle\") pod \"authentication-operator-69f744f599-v6p66\" (UID: \"5bad766d-e524-4670-b353-56e92df2f744\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.736416 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rb9v\" (UniqueName: \"kubernetes.io/projected/761a509d-5cb8-4506-901c-614a7d633d39-kube-api-access-5rb9v\") pod \"console-f9d7485db-ccdtg\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.736454 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5ckj\" (UniqueName: \"kubernetes.io/projected/3895be1f-db04-45b3-bd8c-cf2ab8c2aa43-kube-api-access-g5ckj\") pod \"catalog-operator-68c6474976-fvzsf\" (UID: \"3895be1f-db04-45b3-bd8c-cf2ab8c2aa43\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.736491 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db0547ad-edd5-4b70-af0f-9f606793e6a9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-b8s4f\" (UID: \"db0547ad-edd5-4b70-af0f-9f606793e6a9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b8s4f" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.736526 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjbcs\" (UniqueName: \"kubernetes.io/projected/af4eb68c-2986-486a-9b9a-4905bd264322-kube-api-access-vjbcs\") pod \"openshift-apiserver-operator-796bbdcf4f-pg9xx\" (UID: \"af4eb68c-2986-486a-9b9a-4905bd264322\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pg9xx" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.736557 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b38e6ac-e12a-4798-b0e9-6321dc926487-metrics-certs\") pod \"router-default-5444994796-x85q2\" (UID: \"2b38e6ac-e12a-4798-b0e9-6321dc926487\") " pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.736585 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqj7q\" (UniqueName: \"kubernetes.io/projected/c6edf33c-728d-482f-ad5c-ceb85dae3b75-kube-api-access-jqj7q\") pod \"console-operator-58897d9998-nnrsp\" (UID: \"c6edf33c-728d-482f-ad5c-ceb85dae3b75\") " pod="openshift-console-operator/console-operator-58897d9998-nnrsp" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.736648 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsxnh\" (UniqueName: \"kubernetes.io/projected/2b38e6ac-e12a-4798-b0e9-6321dc926487-kube-api-access-vsxnh\") pod \"router-default-5444994796-x85q2\" (UID: \"2b38e6ac-e12a-4798-b0e9-6321dc926487\") " pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.736673 4702 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqmz9\" (UniqueName: \"kubernetes.io/projected/1a56343e-6342-4982-9ff1-8bae70d5771a-kube-api-access-dqmz9\") pod \"machine-config-operator-74547568cd-rh69s\" (UID: \"1a56343e-6342-4982-9ff1-8bae70d5771a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rh69s" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.736705 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxk9n\" (UniqueName: \"kubernetes.io/projected/0bce3a1f-cd2e-41b9-b768-322f3ce72ed9-kube-api-access-wxk9n\") pod \"control-plane-machine-set-operator-78cbb6b69f-zthg4\" (UID: \"0bce3a1f-cd2e-41b9-b768-322f3ce72ed9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zthg4" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.736729 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d5cab780-360c-48fa-9c88-36e15f37da47-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pp87r\" (UID: \"d5cab780-360c-48fa-9c88-36e15f37da47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pp87r" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.736827 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1a56343e-6342-4982-9ff1-8bae70d5771a-proxy-tls\") pod \"machine-config-operator-74547568cd-rh69s\" (UID: \"1a56343e-6342-4982-9ff1-8bae70d5771a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rh69s" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.736867 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bad766d-e524-4670-b353-56e92df2f744-service-ca-bundle\") pod \"authentication-operator-69f744f599-v6p66\" (UID: \"5bad766d-e524-4670-b353-56e92df2f744\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.736958 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af4eb68c-2986-486a-9b9a-4905bd264322-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-pg9xx\" (UID: \"af4eb68c-2986-486a-9b9a-4905bd264322\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pg9xx" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.736986 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b38e6ac-e12a-4798-b0e9-6321dc926487-service-ca-bundle\") pod \"router-default-5444994796-x85q2\" (UID: \"2b38e6ac-e12a-4798-b0e9-6321dc926487\") " pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.737098 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-config\") pod \"controller-manager-879f6c89f-njlsm\" (UID: \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.737144 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-trusted-ca-bundle\") pod \"console-f9d7485db-ccdtg\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.737169 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3895be1f-db04-45b3-bd8c-cf2ab8c2aa43-profile-collector-cert\") pod \"catalog-operator-68c6474976-fvzsf\" (UID: \"3895be1f-db04-45b3-bd8c-cf2ab8c2aa43\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.737196 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4vqb\" (UniqueName: \"kubernetes.io/projected/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-kube-api-access-x4vqb\") pod \"controller-manager-879f6c89f-njlsm\" (UID: \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.737408 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db0547ad-edd5-4b70-af0f-9f606793e6a9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-b8s4f\" (UID: \"db0547ad-edd5-4b70-af0f-9f606793e6a9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b8s4f" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.738034 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db0547ad-edd5-4b70-af0f-9f606793e6a9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-b8s4f\" (UID: \"db0547ad-edd5-4b70-af0f-9f606793e6a9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b8s4f" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.738580 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-trusted-ca-bundle\") pod \"console-f9d7485db-ccdtg\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.739833 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af4eb68c-2986-486a-9b9a-4905bd264322-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-pg9xx\" (UID: \"af4eb68c-2986-486a-9b9a-4905bd264322\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pg9xx" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.739916 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5bad766d-e524-4670-b353-56e92df2f744-serving-cert\") pod \"authentication-operator-69f744f599-v6p66\" (UID: \"5bad766d-e524-4670-b353-56e92df2f744\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.741217 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/761a509d-5cb8-4506-901c-614a7d633d39-console-serving-cert\") pod \"console-f9d7485db-ccdtg\" (UID: 
\"761a509d-5cb8-4506-901c-614a7d633d39\") " pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.741244 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/761a509d-5cb8-4506-901c-614a7d633d39-console-oauth-config\") pod \"console-f9d7485db-ccdtg\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.743580 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.763795 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.783227 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.804067 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.807901 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-serving-cert\") pod \"controller-manager-879f6c89f-njlsm\" (UID: \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.824107 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.828956 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-config\") pod \"controller-manager-879f6c89f-njlsm\" (UID: \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.842938 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.846093 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-client-ca\") pod \"controller-manager-879f6c89f-njlsm\" (UID: \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.868441 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.877066 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-njlsm\" (UID: \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.888599 4702 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"trusted-ca" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.898727 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d5cab780-360c-48fa-9c88-36e15f37da47-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-pp87r\" (UID: \"d5cab780-360c-48fa-9c88-36e15f37da47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pp87r" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.903056 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.922718 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.927047 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.929534 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5cab780-360c-48fa-9c88-36e15f37da47-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-pp87r\" (UID: \"d5cab780-360c-48fa-9c88-36e15f37da47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pp87r" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.944003 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.949571 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2b38e6ac-e12a-4798-b0e9-6321dc926487-metrics-certs\") pod \"router-default-5444994796-x85q2\" (UID: \"2b38e6ac-e12a-4798-b0e9-6321dc926487\") " pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.963404 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.983159 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 03 11:05:02 crc kubenswrapper[4702]: I1203 11:05:02.988162 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2b38e6ac-e12a-4798-b0e9-6321dc926487-stats-auth\") pod \"router-default-5444994796-x85q2\" (UID: \"2b38e6ac-e12a-4798-b0e9-6321dc926487\") " pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.004024 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.008629 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b38e6ac-e12a-4798-b0e9-6321dc926487-service-ca-bundle\") pod \"router-default-5444994796-x85q2\" (UID: \"2b38e6ac-e12a-4798-b0e9-6321dc926487\") " pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.024157 4702 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-certs-default" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.040261 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2b38e6ac-e12a-4798-b0e9-6321dc926487-default-certificate\") pod \"router-default-5444994796-x85q2\" (UID: \"2b38e6ac-e12a-4798-b0e9-6321dc926487\") " pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.063305 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.083202 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.103693 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.123533 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.128357 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/0bce3a1f-cd2e-41b9-b768-322f3ce72ed9-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-zthg4\" (UID: \"0bce3a1f-cd2e-41b9-b768-322f3ce72ed9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zthg4" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.143091 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.145947 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1a56343e-6342-4982-9ff1-8bae70d5771a-images\") pod \"machine-config-operator-74547568cd-rh69s\" (UID: \"1a56343e-6342-4982-9ff1-8bae70d5771a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rh69s" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.163886 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.183868 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.191220 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1a56343e-6342-4982-9ff1-8bae70d5771a-proxy-tls\") pod \"machine-config-operator-74547568cd-rh69s\" (UID: \"1a56343e-6342-4982-9ff1-8bae70d5771a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rh69s" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.204030 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.206060 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6edf33c-728d-482f-ad5c-ceb85dae3b75-config\") pod 
\"console-operator-58897d9998-nnrsp\" (UID: \"c6edf33c-728d-482f-ad5c-ceb85dae3b75\") " pod="openshift-console-operator/console-operator-58897d9998-nnrsp" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.223162 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.228533 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c6edf33c-728d-482f-ad5c-ceb85dae3b75-serving-cert\") pod \"console-operator-58897d9998-nnrsp\" (UID: \"c6edf33c-728d-482f-ad5c-ceb85dae3b75\") " pod="openshift-console-operator/console-operator-58897d9998-nnrsp" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.244070 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.270497 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.276911 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6edf33c-728d-482f-ad5c-ceb85dae3b75-trusted-ca\") pod \"console-operator-58897d9998-nnrsp\" (UID: \"c6edf33c-728d-482f-ad5c-ceb85dae3b75\") " pod="openshift-console-operator/console-operator-58897d9998-nnrsp" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.282884 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.303804 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.323580 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.352386 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.363043 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.382962 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.404830 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.425646 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.444120 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.463610 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.483353 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 03 
11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.503230 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.523375 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.541253 4702 request.go:700] Waited for 1.009948061s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dolm-operator-serviceaccount-dockercfg-rq7zk&limit=500&resourceVersion=0 Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.543081 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.562787 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.570526 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3895be1f-db04-45b3-bd8c-cf2ab8c2aa43-srv-cert\") pod \"catalog-operator-68c6474976-fvzsf\" (UID: \"3895be1f-db04-45b3-bd8c-cf2ab8c2aa43\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.583737 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.592639 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3895be1f-db04-45b3-bd8c-cf2ab8c2aa43-profile-collector-cert\") pod \"catalog-operator-68c6474976-fvzsf\" (UID: \"3895be1f-db04-45b3-bd8c-cf2ab8c2aa43\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.603315 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.642583 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.664655 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.683589 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.703463 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.723408 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.744192 4702 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.762932 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.782968 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.803380 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.824319 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.843629 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.863686 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.883113 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.903648 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.924127 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.928153 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.928153 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.928518 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.944044 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.964303 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 11:05:03 crc kubenswrapper[4702]: I1203 11:05:03.983360 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.002908 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.023076 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.043242 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.062857 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.083211 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.104407 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.123419 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.151524 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.163666 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.183310 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.203911 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.223715 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.245024 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.263875 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.283080 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.303296 4702 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.323459 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.343319 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.363819 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.382828 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.402279 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.423028 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.459368 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8sbd\" (UniqueName: \"kubernetes.io/projected/d7be37fc-7374-46ae-a0e3-1cafab3430ec-kube-api-access-v8sbd\") pod \"apiserver-7bbb656c7d-nx9wv\" (UID: \"d7be37fc-7374-46ae-a0e3-1cafab3430ec\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.463184 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.483081 4702 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.503831 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.538483 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5qgm\" (UniqueName: \"kubernetes.io/projected/2c99e1fd-b0d0-418c-bb67-f638f06978f2-kube-api-access-r5qgm\") pod \"oauth-openshift-558db77b4-x5ghh\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.541316 4702 request.go:700] Waited for 1.922459746s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/serviceaccounts/openshift-kube-scheduler-operator/token Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.554567 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.557227 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc476661-200c-4feb-8b45-fe203009356a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-nmmt2\" (UID: \"fc476661-200c-4feb-8b45-fe203009356a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nmmt2" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.579882 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj6d5\" (UniqueName: \"kubernetes.io/projected/01e24add-292c-4a3c-8a32-75ceb16ced89-kube-api-access-hj6d5\") pod \"etcd-operator-b45778765-g8wwh\" (UID: \"01e24add-292c-4a3c-8a32-75ceb16ced89\") " pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.599805 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d49febe3-b867-419d-955c-a9a7b0a658c3-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-trxrx\" (UID: \"d49febe3-b867-419d-955c-a9a7b0a658c3\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-trxrx" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.618392 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jstnx\" (UniqueName: \"kubernetes.io/projected/be3d3f8d-f407-4b54-8e9b-a5b526babb52-kube-api-access-jstnx\") pod \"route-controller-manager-6576b87f9c-s62lk\" (UID: \"be3d3f8d-f407-4b54-8e9b-a5b526babb52\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.640792 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkxwh\" (UniqueName: \"kubernetes.io/projected/b8e73047-6376-4bd9-8ec8-5966f8786e5d-kube-api-access-vkxwh\") pod \"machine-api-operator-5694c8668f-gzc7n\" (UID: \"b8e73047-6376-4bd9-8ec8-5966f8786e5d\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gzc7n" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.645718 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.658388 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfm8t\" (UniqueName: \"kubernetes.io/projected/e5c1c8f7-5c07-4735-aa10-a7885f5bec8f-kube-api-access-xfm8t\") pod \"dns-operator-744455d44c-qxpq4\" (UID: \"e5c1c8f7-5c07-4735-aa10-a7885f5bec8f\") " pod="openshift-dns-operator/dns-operator-744455d44c-qxpq4" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.660726 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.685736 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qxpq4" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.692218 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.694347 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfvbr\" (UniqueName: \"kubernetes.io/projected/adc43b14-86cb-4ff5-b7fb-a9ba32cde631-kube-api-access-qfvbr\") pod \"apiserver-76f77b778f-5xk7q\" (UID: \"adc43b14-86cb-4ff5-b7fb-a9ba32cde631\") " pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.696788 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfjp2\" (UniqueName: \"kubernetes.io/projected/a456460b-47a3-48ef-a98b-4f67709d5939-kube-api-access-jfjp2\") pod \"openshift-config-operator-7777fb866f-fv5c5\" (UID: \"a456460b-47a3-48ef-a98b-4f67709d5939\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.712808 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-trxrx" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.717800 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfcgt\" (UniqueName: \"kubernetes.io/projected/afd002b9-3309-4286-a7cb-29c2f4817feb-kube-api-access-tfcgt\") pod \"cluster-samples-operator-665b6dd947-fppws\" (UID: \"afd002b9-3309-4286-a7cb-29c2f4817feb\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fppws" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.724803 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fppws" Dec 03 11:05:04 crc kubenswrapper[4702]: W1203 11:05:04.727751 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd49febe3_b867_419d_955c_a9a7b0a658c3.slice/crio-93b925dac4b4b1e6ceaa6b534b1a7e206c61484f9686b875bdcd8009435a6630 WatchSource:0}: Error finding container 93b925dac4b4b1e6ceaa6b534b1a7e206c61484f9686b875bdcd8009435a6630: Status 404 returned error can't find the container with id 93b925dac4b4b1e6ceaa6b534b1a7e206c61484f9686b875bdcd8009435a6630 Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.739844 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxjp7\" (UniqueName: \"kubernetes.io/projected/906f2138-1584-4399-8c87-d03f3231ddc7-kube-api-access-pxjp7\") pod \"machine-approver-56656f9798-qzzlq\" (UID: \"906f2138-1584-4399-8c87-d03f3231ddc7\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qzzlq" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.772139 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-trxrx" event={"ID":"d49febe3-b867-419d-955c-a9a7b0a658c3","Type":"ContainerStarted","Data":"93b925dac4b4b1e6ceaa6b534b1a7e206c61484f9686b875bdcd8009435a6630"} Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.781330 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nmmt2" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.785248 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37b44be4-8fc9-4b74-9339-8bb658d866fc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-m6tdv\" (UID: \"37b44be4-8fc9-4b74-9339-8bb658d866fc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6tdv" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.785645 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw4kb\" (UniqueName: \"kubernetes.io/projected/db0547ad-edd5-4b70-af0f-9f606793e6a9-kube-api-access-pw4kb\") pod \"openshift-controller-manager-operator-756b6f6bc6-b8s4f\" (UID: \"db0547ad-edd5-4b70-af0f-9f606793e6a9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b8s4f" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.799873 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fflsr\" (UniqueName: \"kubernetes.io/projected/d5cab780-360c-48fa-9c88-36e15f37da47-kube-api-access-fflsr\") pod \"cluster-image-registry-operator-dc59b4c8b-pp87r\" (UID: \"d5cab780-360c-48fa-9c88-36e15f37da47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pp87r" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.802000 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-gzc7n" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.804162 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b8s4f" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.818030 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8lvp\" (UniqueName: \"kubernetes.io/projected/5bad766d-e524-4670-b353-56e92df2f744-kube-api-access-q8lvp\") pod \"authentication-operator-69f744f599-v6p66\" (UID: \"5bad766d-e524-4670-b353-56e92df2f744\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.818314 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.846097 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qzzlq" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.848079 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d5cab780-360c-48fa-9c88-36e15f37da47-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-pp87r\" (UID: \"d5cab780-360c-48fa-9c88-36e15f37da47\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pp87r" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.862591 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsk45\" (UniqueName: \"kubernetes.io/projected/05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8-kube-api-access-rsk45\") pod \"downloads-7954f5f757-hndf6\" (UID: \"05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8\") " pod="openshift-console/downloads-7954f5f757-hndf6" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.884048 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rb9v\" (UniqueName: \"kubernetes.io/projected/761a509d-5cb8-4506-901c-614a7d633d39-kube-api-access-5rb9v\") pod \"console-f9d7485db-ccdtg\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.917577 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5ckj\" (UniqueName: \"kubernetes.io/projected/3895be1f-db04-45b3-bd8c-cf2ab8c2aa43-kube-api-access-g5ckj\") pod \"catalog-operator-68c6474976-fvzsf\" (UID: \"3895be1f-db04-45b3-bd8c-cf2ab8c2aa43\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.919704 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.941117 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.945151 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.950948 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsxnh\" (UniqueName: \"kubernetes.io/projected/2b38e6ac-e12a-4798-b0e9-6321dc926487-kube-api-access-vsxnh\") pod \"router-default-5444994796-x85q2\" (UID: \"2b38e6ac-e12a-4798-b0e9-6321dc926487\") " pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.951533 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjbcs\" (UniqueName: \"kubernetes.io/projected/af4eb68c-2986-486a-9b9a-4905bd264322-kube-api-access-vjbcs\") pod \"openshift-apiserver-operator-796bbdcf4f-pg9xx\" (UID: \"af4eb68c-2986-486a-9b9a-4905bd264322\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pg9xx" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.976253 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6tdv" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.977018 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqmz9\" (UniqueName: \"kubernetes.io/projected/1a56343e-6342-4982-9ff1-8bae70d5771a-kube-api-access-dqmz9\") pod \"machine-config-operator-74547568cd-rh69s\" (UID: \"1a56343e-6342-4982-9ff1-8bae70d5771a\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rh69s" Dec 03 11:05:04 crc kubenswrapper[4702]: I1203 11:05:04.992863 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxk9n\" (UniqueName: \"kubernetes.io/projected/0bce3a1f-cd2e-41b9-b768-322f3ce72ed9-kube-api-access-wxk9n\") pod \"control-plane-machine-set-operator-78cbb6b69f-zthg4\" (UID: \"0bce3a1f-cd2e-41b9-b768-322f3ce72ed9\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zthg4" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.031610 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.032124 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqj7q\" (UniqueName: \"kubernetes.io/projected/c6edf33c-728d-482f-ad5c-ceb85dae3b75-kube-api-access-jqj7q\") pod \"console-operator-58897d9998-nnrsp\" (UID: \"c6edf33c-728d-482f-ad5c-ceb85dae3b75\") " pod="openshift-console-operator/console-operator-58897d9998-nnrsp" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.033015 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4vqb\" (UniqueName: \"kubernetes.io/projected/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-kube-api-access-x4vqb\") pod \"controller-manager-879f6c89f-njlsm\" (UID: \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.043547 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.126172 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.126229 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pg9xx" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.126882 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-hndf6" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.135800 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.135989 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.136905 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.137547 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=4.137533092 podStartE2EDuration="4.137533092s" podCreationTimestamp="2025-12-03 11:05:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:05.136567327 +0000 UTC m=+88.972495791" watchObservedRunningTime="2025-12-03 11:05:05.137533092 +0000 UTC m=+88.973461556" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.145015 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pp87r" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.150572 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.165780 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.174402 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zthg4" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.184477 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rh69s" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.189260 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.228916 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-bound-sa-token\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.228966 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c31210aa-a99c-4f88-a496-9f61835b4445-bound-sa-token\") pod \"ingress-operator-5b745b69d9-x4pq9\" (UID: \"c31210aa-a99c-4f88-a496-9f61835b4445\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x4pq9" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.228992 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.229044 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-registry-tls\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.229063 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c31210aa-a99c-4f88-a496-9f61835b4445-metrics-tls\") pod \"ingress-operator-5b745b69d9-x4pq9\" (UID: \"c31210aa-a99c-4f88-a496-9f61835b4445\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x4pq9" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.229081 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebf7fe4a-d38b-4e14-b395-6b2de24c43d0-config\") pod \"kube-apiserver-operator-766d6c64bb-dw786\" (UID: \"ebf7fe4a-d38b-4e14-b395-6b2de24c43d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dw786" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.229117 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec4c6214-773a-4bcd-ae3e-02a4d74b791a-proxy-tls\") pod \"machine-config-controller-84d6567774-x92bs\" (UID: \"ec4c6214-773a-4bcd-ae3e-02a4d74b791a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-x92bs" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.229135 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebf7fe4a-d38b-4e14-b395-6b2de24c43d0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dw786\" (UID: \"ebf7fe4a-d38b-4e14-b395-6b2de24c43d0\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dw786" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.229157 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-trusted-ca\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.229172 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ec4c6214-773a-4bcd-ae3e-02a4d74b791a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-x92bs\" (UID: \"ec4c6214-773a-4bcd-ae3e-02a4d74b791a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-x92bs" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.229186 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b5lj\" (UniqueName: \"kubernetes.io/projected/ec4c6214-773a-4bcd-ae3e-02a4d74b791a-kube-api-access-4b5lj\") pod \"machine-config-controller-84d6567774-x92bs\" (UID: \"ec4c6214-773a-4bcd-ae3e-02a4d74b791a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-x92bs" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.229203 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86dwp\" (UniqueName: \"kubernetes.io/projected/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-kube-api-access-86dwp\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.229264 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.229293 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ebf7fe4a-d38b-4e14-b395-6b2de24c43d0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dw786\" (UID: \"ebf7fe4a-d38b-4e14-b395-6b2de24c43d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dw786" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.229327 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-registry-certificates\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.229349 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sh87\" (UniqueName: \"kubernetes.io/projected/c31210aa-a99c-4f88-a496-9f61835b4445-kube-api-access-5sh87\") pod 
\"ingress-operator-5b745b69d9-x4pq9\" (UID: \"c31210aa-a99c-4f88-a496-9f61835b4445\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x4pq9" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.229384 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.229441 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c31210aa-a99c-4f88-a496-9f61835b4445-trusted-ca\") pod \"ingress-operator-5b745b69d9-x4pq9\" (UID: \"c31210aa-a99c-4f88-a496-9f61835b4445\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x4pq9" Dec 03 11:05:05 crc kubenswrapper[4702]: E1203 11:05:05.231564 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:05.731545252 +0000 UTC m=+89.567473716 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.292082 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.337741 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.337992 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/042c7f5b-da64-4f42-a2b2-58d04b73c12a-csi-data-dir\") pod \"csi-hostpathplugin-pscld\" (UID: \"042c7f5b-da64-4f42-a2b2-58d04b73c12a\") " pod="hostpath-provisioner/csi-hostpathplugin-pscld" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338053 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0276c6fb-ba7a-459f-9610-34a03593669b-srv-cert\") pod \"olm-operator-6b444d44fb-xxj8s\" (UID: \"0276c6fb-ba7a-459f-9610-34a03593669b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338074 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0276c6fb-ba7a-459f-9610-34a03593669b-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-xxj8s\" (UID: \"0276c6fb-ba7a-459f-9610-34a03593669b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338137 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzkdc\" (UniqueName: \"kubernetes.io/projected/e85fa6d0-61c0-4d62-adeb-e2402e597d87-kube-api-access-fzkdc\") pod \"migrator-59844c95c7-rk4kf\" (UID: \"e85fa6d0-61c0-4d62-adeb-e2402e597d87\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rk4kf" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338152 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/042c7f5b-da64-4f42-a2b2-58d04b73c12a-mountpoint-dir\") pod \"csi-hostpathplugin-pscld\" (UID: \"042c7f5b-da64-4f42-a2b2-58d04b73c12a\") " pod="hostpath-provisioner/csi-hostpathplugin-pscld" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338219 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ql9kq\" (UID: \"5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac\") " pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338310 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/36516401-b0b6-42c4-b444-84ee4e336839-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bnkx7\" (UID: \"36516401-b0b6-42c4-b444-84ee4e336839\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bnkx7" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338331 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ee7cd19b-47c8-464d-9156-69d796be8866-signing-key\") pod \"service-ca-9c57cc56f-54zxt\" (UID: \"ee7cd19b-47c8-464d-9156-69d796be8866\") " pod="openshift-service-ca/service-ca-9c57cc56f-54zxt" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338347 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/042c7f5b-da64-4f42-a2b2-58d04b73c12a-plugins-dir\") pod \"csi-hostpathplugin-pscld\" (UID: \"042c7f5b-da64-4f42-a2b2-58d04b73c12a\") " pod="hostpath-provisioner/csi-hostpathplugin-pscld" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338388 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c31210aa-a99c-4f88-a496-9f61835b4445-trusted-ca\") pod \"ingress-operator-5b745b69d9-x4pq9\" (UID: \"c31210aa-a99c-4f88-a496-9f61835b4445\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x4pq9" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338417 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3de04148-0009-427b-8055-a1c5dadb8274-tmpfs\") pod \"packageserver-d55dfcdfc-6kdsp\" (UID: \"3de04148-0009-427b-8055-a1c5dadb8274\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" 
Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338462 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/042c7f5b-da64-4f42-a2b2-58d04b73c12a-socket-dir\") pod \"csi-hostpathplugin-pscld\" (UID: \"042c7f5b-da64-4f42-a2b2-58d04b73c12a\") " pod="hostpath-provisioner/csi-hostpathplugin-pscld" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338489 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcdjp\" (UniqueName: \"kubernetes.io/projected/77a69d25-2384-466b-b284-e36e979597b4-kube-api-access-zcdjp\") pod \"collect-profiles-29412660-4lj4n\" (UID: \"77a69d25-2384-466b-b284-e36e979597b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338517 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2535e580-a150-4c92-912f-142f212faebd-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-s6wcg\" (UID: \"2535e580-a150-4c92-912f-142f212faebd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6wcg" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338536 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a49f8d97-9fa5-44b6-bd39-e35d4d70b33c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vchd7\" (UID: \"a49f8d97-9fa5-44b6-bd39-e35d4d70b33c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338555 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6t46\" (UniqueName: \"kubernetes.io/projected/a49f8d97-9fa5-44b6-bd39-e35d4d70b33c-kube-api-access-d6t46\") pod \"package-server-manager-789f6589d5-vchd7\" (UID: \"a49f8d97-9fa5-44b6-bd39-e35d4d70b33c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338571 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm685\" (UniqueName: \"kubernetes.io/projected/ee7cd19b-47c8-464d-9156-69d796be8866-kube-api-access-mm685\") pod \"service-ca-9c57cc56f-54zxt\" (UID: \"ee7cd19b-47c8-464d-9156-69d796be8866\") " pod="openshift-service-ca/service-ca-9c57cc56f-54zxt" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338589 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpbbp\" (UniqueName: \"kubernetes.io/projected/3de04148-0009-427b-8055-a1c5dadb8274-kube-api-access-tpbbp\") pod \"packageserver-d55dfcdfc-6kdsp\" (UID: \"3de04148-0009-427b-8055-a1c5dadb8274\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338688 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-bound-sa-token\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338705 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qb4fg\" (UniqueName: \"kubernetes.io/projected/5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac-kube-api-access-qb4fg\") pod \"marketplace-operator-79b997595-ql9kq\" (UID: \"5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac\") " pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338736 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68dr2\" (UniqueName: \"kubernetes.io/projected/042c7f5b-da64-4f42-a2b2-58d04b73c12a-kube-api-access-68dr2\") pod \"csi-hostpathplugin-pscld\" (UID: \"042c7f5b-da64-4f42-a2b2-58d04b73c12a\") " pod="hostpath-provisioner/csi-hostpathplugin-pscld" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338774 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c31210aa-a99c-4f88-a496-9f61835b4445-bound-sa-token\") pod \"ingress-operator-5b745b69d9-x4pq9\" (UID: \"c31210aa-a99c-4f88-a496-9f61835b4445\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x4pq9" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338791 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28d79fc8-1fa6-4a05-aede-e9d0967dfb3b-serving-cert\") pod \"service-ca-operator-777779d784-r54j5\" (UID: \"28d79fc8-1fa6-4a05-aede-e9d0967dfb3b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-r54j5" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338827 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b16b76f2-35bb-482e-a01c-bb873ea433d9-certs\") pod \"machine-config-server-nqtkf\" (UID: \"b16b76f2-35bb-482e-a01c-bb873ea433d9\") " pod="openshift-machine-config-operator/machine-config-server-nqtkf" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338845 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338875 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28d79fc8-1fa6-4a05-aede-e9d0967dfb3b-config\") pod \"service-ca-operator-777779d784-r54j5\" (UID: \"28d79fc8-1fa6-4a05-aede-e9d0967dfb3b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-r54j5" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338931 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-registry-tls\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338947 4702 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c31210aa-a99c-4f88-a496-9f61835b4445-metrics-tls\") pod \"ingress-operator-5b745b69d9-x4pq9\" (UID: \"c31210aa-a99c-4f88-a496-9f61835b4445\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x4pq9" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.338963 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/042c7f5b-da64-4f42-a2b2-58d04b73c12a-registration-dir\") pod \"csi-hostpathplugin-pscld\" (UID: \"042c7f5b-da64-4f42-a2b2-58d04b73c12a\") " pod="hostpath-provisioner/csi-hostpathplugin-pscld" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.339001 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebf7fe4a-d38b-4e14-b395-6b2de24c43d0-config\") pod \"kube-apiserver-operator-766d6c64bb-dw786\" (UID: \"ebf7fe4a-d38b-4e14-b395-6b2de24c43d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dw786" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.339017 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77a69d25-2384-466b-b284-e36e979597b4-secret-volume\") pod \"collect-profiles-29412660-4lj4n\" (UID: \"77a69d25-2384-466b-b284-e36e979597b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.339034 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77a69d25-2384-466b-b284-e36e979597b4-config-volume\") pod \"collect-profiles-29412660-4lj4n\" (UID: \"77a69d25-2384-466b-b284-e36e979597b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.339107 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e9a6a2d-0a6d-42a1-a44b-a82ee572b995-cert\") pod \"ingress-canary-bpfvn\" (UID: \"7e9a6a2d-0a6d-42a1-a44b-a82ee572b995\") " pod="openshift-ingress-canary/ingress-canary-bpfvn" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.340240 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c31210aa-a99c-4f88-a496-9f61835b4445-trusted-ca\") pod \"ingress-operator-5b745b69d9-x4pq9\" (UID: \"c31210aa-a99c-4f88-a496-9f61835b4445\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x4pq9" Dec 03 11:05:05 crc kubenswrapper[4702]: E1203 11:05:05.340530 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:05.840511021 +0000 UTC m=+89.676439485 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.341827 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vzhk\" (UniqueName: \"kubernetes.io/projected/47af44a8-4e95-4edf-81b4-efbfdab9447a-kube-api-access-4vzhk\") pod \"dns-default-f8vb8\" (UID: \"47af44a8-4e95-4edf-81b4-efbfdab9447a\") " pod="openshift-dns/dns-default-f8vb8" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.341913 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b16b76f2-35bb-482e-a01c-bb873ea433d9-node-bootstrap-token\") pod \"machine-config-server-nqtkf\" (UID: \"b16b76f2-35bb-482e-a01c-bb873ea433d9\") " pod="openshift-machine-config-operator/machine-config-server-nqtkf" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.341935 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9dts\" (UniqueName: \"kubernetes.io/projected/2535e580-a150-4c92-912f-142f212faebd-kube-api-access-s9dts\") pod \"kube-storage-version-migrator-operator-b67b599dd-s6wcg\" (UID: \"2535e580-a150-4c92-912f-142f212faebd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6wcg" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.342038 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebf7fe4a-d38b-4e14-b395-6b2de24c43d0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dw786\" (UID: \"ebf7fe4a-d38b-4e14-b395-6b2de24c43d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dw786" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.342084 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tvkh\" (UniqueName: \"kubernetes.io/projected/0276c6fb-ba7a-459f-9610-34a03593669b-kube-api-access-9tvkh\") pod \"olm-operator-6b444d44fb-xxj8s\" (UID: \"0276c6fb-ba7a-459f-9610-34a03593669b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.342135 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec4c6214-773a-4bcd-ae3e-02a4d74b791a-proxy-tls\") pod \"machine-config-controller-84d6567774-x92bs\" (UID: \"ec4c6214-773a-4bcd-ae3e-02a4d74b791a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-x92bs" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.342177 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rpqq\" (UniqueName: \"kubernetes.io/projected/7e9a6a2d-0a6d-42a1-a44b-a82ee572b995-kube-api-access-9rpqq\") pod \"ingress-canary-bpfvn\" (UID: \"7e9a6a2d-0a6d-42a1-a44b-a82ee572b995\") " pod="openshift-ingress-canary/ingress-canary-bpfvn" Dec 03 11:05:05 crc 
kubenswrapper[4702]: I1203 11:05:05.343448 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebf7fe4a-d38b-4e14-b395-6b2de24c43d0-config\") pod \"kube-apiserver-operator-766d6c64bb-dw786\" (UID: \"ebf7fe4a-d38b-4e14-b395-6b2de24c43d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dw786" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.343489 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2535e580-a150-4c92-912f-142f212faebd-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-s6wcg\" (UID: \"2535e580-a150-4c92-912f-142f212faebd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6wcg" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.343508 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.343516 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ee7cd19b-47c8-464d-9156-69d796be8866-signing-cabundle\") pod \"service-ca-9c57cc56f-54zxt\" (UID: \"ee7cd19b-47c8-464d-9156-69d796be8866\") " pod="openshift-service-ca/service-ca-9c57cc56f-54zxt" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.343560 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47af44a8-4e95-4edf-81b4-efbfdab9447a-metrics-tls\") pod \"dns-default-f8vb8\" (UID: \"47af44a8-4e95-4edf-81b4-efbfdab9447a\") " pod="openshift-dns/dns-default-f8vb8" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.343609 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-trusted-ca\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.343663 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ec4c6214-773a-4bcd-ae3e-02a4d74b791a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-x92bs\" (UID: \"ec4c6214-773a-4bcd-ae3e-02a4d74b791a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-x92bs" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.343683 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4b5lj\" (UniqueName: \"kubernetes.io/projected/ec4c6214-773a-4bcd-ae3e-02a4d74b791a-kube-api-access-4b5lj\") pod \"machine-config-controller-84d6567774-x92bs\" (UID: \"ec4c6214-773a-4bcd-ae3e-02a4d74b791a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-x92bs" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.343704 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-pbh9k\" (UniqueName: \"kubernetes.io/projected/36516401-b0b6-42c4-b444-84ee4e336839-kube-api-access-pbh9k\") pod \"multus-admission-controller-857f4d67dd-bnkx7\" (UID: \"36516401-b0b6-42c4-b444-84ee4e336839\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bnkx7" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.344054 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47af44a8-4e95-4edf-81b4-efbfdab9447a-config-volume\") pod \"dns-default-f8vb8\" (UID: \"47af44a8-4e95-4edf-81b4-efbfdab9447a\") " pod="openshift-dns/dns-default-f8vb8" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.344154 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86dwp\" (UniqueName: \"kubernetes.io/projected/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-kube-api-access-86dwp\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.344191 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3de04148-0009-427b-8055-a1c5dadb8274-webhook-cert\") pod \"packageserver-d55dfcdfc-6kdsp\" (UID: \"3de04148-0009-427b-8055-a1c5dadb8274\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.344396 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.344611 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ebf7fe4a-d38b-4e14-b395-6b2de24c43d0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dw786\" (UID: \"ebf7fe4a-d38b-4e14-b395-6b2de24c43d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dw786" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.344644 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3de04148-0009-427b-8055-a1c5dadb8274-apiservice-cert\") pod \"packageserver-d55dfcdfc-6kdsp\" (UID: \"3de04148-0009-427b-8055-a1c5dadb8274\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.344697 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szbbg\" (UniqueName: \"kubernetes.io/projected/28d79fc8-1fa6-4a05-aede-e9d0967dfb3b-kube-api-access-szbbg\") pod \"service-ca-operator-777779d784-r54j5\" (UID: \"28d79fc8-1fa6-4a05-aede-e9d0967dfb3b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-r54j5" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.344748 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-registry-certificates\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.344948 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sf62\" (UniqueName: \"kubernetes.io/projected/b16b76f2-35bb-482e-a01c-bb873ea433d9-kube-api-access-4sf62\") pod \"machine-config-server-nqtkf\" (UID: \"b16b76f2-35bb-482e-a01c-bb873ea433d9\") " pod="openshift-machine-config-operator/machine-config-server-nqtkf" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.344992 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ql9kq\" (UID: \"5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac\") " pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.345021 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sh87\" (UniqueName: \"kubernetes.io/projected/c31210aa-a99c-4f88-a496-9f61835b4445-kube-api-access-5sh87\") pod \"ingress-operator-5b745b69d9-x4pq9\" (UID: \"c31210aa-a99c-4f88-a496-9f61835b4445\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x4pq9" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.346488 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ec4c6214-773a-4bcd-ae3e-02a4d74b791a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-x92bs\" (UID: \"ec4c6214-773a-4bcd-ae3e-02a4d74b791a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-x92bs" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.347524 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-registry-certificates\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.354948 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-trusted-ca\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.361923 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ec4c6214-773a-4bcd-ae3e-02a4d74b791a-proxy-tls\") pod \"machine-config-controller-84d6567774-x92bs\" (UID: \"ec4c6214-773a-4bcd-ae3e-02a4d74b791a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-x92bs" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.362259 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c31210aa-a99c-4f88-a496-9f61835b4445-bound-sa-token\") pod \"ingress-operator-5b745b69d9-x4pq9\" (UID: 
\"c31210aa-a99c-4f88-a496-9f61835b4445\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x4pq9" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.362416 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.363675 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-registry-tls\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.427202 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebf7fe4a-d38b-4e14-b395-6b2de24c43d0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dw786\" (UID: \"ebf7fe4a-d38b-4e14-b395-6b2de24c43d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dw786" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.441508 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ebf7fe4a-d38b-4e14-b395-6b2de24c43d0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dw786\" (UID: \"ebf7fe4a-d38b-4e14-b395-6b2de24c43d0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dw786" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446160 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ql9kq\" (UID: \"5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac\") " pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446252 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/36516401-b0b6-42c4-b444-84ee4e336839-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bnkx7\" (UID: \"36516401-b0b6-42c4-b444-84ee4e336839\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bnkx7" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446287 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/042c7f5b-da64-4f42-a2b2-58d04b73c12a-plugins-dir\") pod \"csi-hostpathplugin-pscld\" (UID: \"042c7f5b-da64-4f42-a2b2-58d04b73c12a\") " pod="hostpath-provisioner/csi-hostpathplugin-pscld" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446321 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ee7cd19b-47c8-464d-9156-69d796be8866-signing-key\") pod \"service-ca-9c57cc56f-54zxt\" (UID: \"ee7cd19b-47c8-464d-9156-69d796be8866\") " pod="openshift-service-ca/service-ca-9c57cc56f-54zxt" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446345 4702 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3de04148-0009-427b-8055-a1c5dadb8274-tmpfs\") pod \"packageserver-d55dfcdfc-6kdsp\" (UID: \"3de04148-0009-427b-8055-a1c5dadb8274\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446371 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/042c7f5b-da64-4f42-a2b2-58d04b73c12a-socket-dir\") pod \"csi-hostpathplugin-pscld\" (UID: \"042c7f5b-da64-4f42-a2b2-58d04b73c12a\") " pod="hostpath-provisioner/csi-hostpathplugin-pscld" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446396 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcdjp\" (UniqueName: \"kubernetes.io/projected/77a69d25-2384-466b-b284-e36e979597b4-kube-api-access-zcdjp\") pod \"collect-profiles-29412660-4lj4n\" (UID: \"77a69d25-2384-466b-b284-e36e979597b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446418 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a49f8d97-9fa5-44b6-bd39-e35d4d70b33c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vchd7\" (UID: \"a49f8d97-9fa5-44b6-bd39-e35d4d70b33c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446443 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2535e580-a150-4c92-912f-142f212faebd-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-s6wcg\" (UID: \"2535e580-a150-4c92-912f-142f212faebd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6wcg" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446463 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6t46\" (UniqueName: \"kubernetes.io/projected/a49f8d97-9fa5-44b6-bd39-e35d4d70b33c-kube-api-access-d6t46\") pod \"package-server-manager-789f6589d5-vchd7\" (UID: \"a49f8d97-9fa5-44b6-bd39-e35d4d70b33c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446490 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm685\" (UniqueName: \"kubernetes.io/projected/ee7cd19b-47c8-464d-9156-69d796be8866-kube-api-access-mm685\") pod \"service-ca-9c57cc56f-54zxt\" (UID: \"ee7cd19b-47c8-464d-9156-69d796be8866\") " pod="openshift-service-ca/service-ca-9c57cc56f-54zxt" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446513 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpbbp\" (UniqueName: \"kubernetes.io/projected/3de04148-0009-427b-8055-a1c5dadb8274-kube-api-access-tpbbp\") pod \"packageserver-d55dfcdfc-6kdsp\" (UID: \"3de04148-0009-427b-8055-a1c5dadb8274\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446549 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qb4fg\" (UniqueName: 
\"kubernetes.io/projected/5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac-kube-api-access-qb4fg\") pod \"marketplace-operator-79b997595-ql9kq\" (UID: \"5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac\") " pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446570 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68dr2\" (UniqueName: \"kubernetes.io/projected/042c7f5b-da64-4f42-a2b2-58d04b73c12a-kube-api-access-68dr2\") pod \"csi-hostpathplugin-pscld\" (UID: \"042c7f5b-da64-4f42-a2b2-58d04b73c12a\") " pod="hostpath-provisioner/csi-hostpathplugin-pscld" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446618 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28d79fc8-1fa6-4a05-aede-e9d0967dfb3b-serving-cert\") pod \"service-ca-operator-777779d784-r54j5\" (UID: \"28d79fc8-1fa6-4a05-aede-e9d0967dfb3b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-r54j5" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446641 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b16b76f2-35bb-482e-a01c-bb873ea433d9-certs\") pod \"machine-config-server-nqtkf\" (UID: \"b16b76f2-35bb-482e-a01c-bb873ea433d9\") " pod="openshift-machine-config-operator/machine-config-server-nqtkf" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446661 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28d79fc8-1fa6-4a05-aede-e9d0967dfb3b-config\") pod \"service-ca-operator-777779d784-r54j5\" (UID: \"28d79fc8-1fa6-4a05-aede-e9d0967dfb3b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-r54j5" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446698 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/042c7f5b-da64-4f42-a2b2-58d04b73c12a-registration-dir\") pod \"csi-hostpathplugin-pscld\" (UID: \"042c7f5b-da64-4f42-a2b2-58d04b73c12a\") " pod="hostpath-provisioner/csi-hostpathplugin-pscld" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446720 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77a69d25-2384-466b-b284-e36e979597b4-secret-volume\") pod \"collect-profiles-29412660-4lj4n\" (UID: \"77a69d25-2384-466b-b284-e36e979597b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446739 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e9a6a2d-0a6d-42a1-a44b-a82ee572b995-cert\") pod \"ingress-canary-bpfvn\" (UID: \"7e9a6a2d-0a6d-42a1-a44b-a82ee572b995\") " pod="openshift-ingress-canary/ingress-canary-bpfvn" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446738 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/042c7f5b-da64-4f42-a2b2-58d04b73c12a-plugins-dir\") pod \"csi-hostpathplugin-pscld\" (UID: \"042c7f5b-da64-4f42-a2b2-58d04b73c12a\") " pod="hostpath-provisioner/csi-hostpathplugin-pscld" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446778 4702 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77a69d25-2384-466b-b284-e36e979597b4-config-volume\") pod \"collect-profiles-29412660-4lj4n\" (UID: \"77a69d25-2384-466b-b284-e36e979597b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446959 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b16b76f2-35bb-482e-a01c-bb873ea433d9-node-bootstrap-token\") pod \"machine-config-server-nqtkf\" (UID: \"b16b76f2-35bb-482e-a01c-bb873ea433d9\") " pod="openshift-machine-config-operator/machine-config-server-nqtkf" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446991 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vzhk\" (UniqueName: \"kubernetes.io/projected/47af44a8-4e95-4edf-81b4-efbfdab9447a-kube-api-access-4vzhk\") pod \"dns-default-f8vb8\" (UID: \"47af44a8-4e95-4edf-81b4-efbfdab9447a\") " pod="openshift-dns/dns-default-f8vb8" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.446992 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/042c7f5b-da64-4f42-a2b2-58d04b73c12a-socket-dir\") pod \"csi-hostpathplugin-pscld\" (UID: \"042c7f5b-da64-4f42-a2b2-58d04b73c12a\") " pod="hostpath-provisioner/csi-hostpathplugin-pscld" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.447022 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9dts\" (UniqueName: \"kubernetes.io/projected/2535e580-a150-4c92-912f-142f212faebd-kube-api-access-s9dts\") pod \"kube-storage-version-migrator-operator-b67b599dd-s6wcg\" (UID: \"2535e580-a150-4c92-912f-142f212faebd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6wcg" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.447082 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tvkh\" (UniqueName: \"kubernetes.io/projected/0276c6fb-ba7a-459f-9610-34a03593669b-kube-api-access-9tvkh\") pod \"olm-operator-6b444d44fb-xxj8s\" (UID: \"0276c6fb-ba7a-459f-9610-34a03593669b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.447128 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rpqq\" (UniqueName: \"kubernetes.io/projected/7e9a6a2d-0a6d-42a1-a44b-a82ee572b995-kube-api-access-9rpqq\") pod \"ingress-canary-bpfvn\" (UID: \"7e9a6a2d-0a6d-42a1-a44b-a82ee572b995\") " pod="openshift-ingress-canary/ingress-canary-bpfvn" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.447156 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2535e580-a150-4c92-912f-142f212faebd-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-s6wcg\" (UID: \"2535e580-a150-4c92-912f-142f212faebd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6wcg" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.447182 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ee7cd19b-47c8-464d-9156-69d796be8866-signing-cabundle\") pod 
\"service-ca-9c57cc56f-54zxt\" (UID: \"ee7cd19b-47c8-464d-9156-69d796be8866\") " pod="openshift-service-ca/service-ca-9c57cc56f-54zxt" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.447218 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47af44a8-4e95-4edf-81b4-efbfdab9447a-metrics-tls\") pod \"dns-default-f8vb8\" (UID: \"47af44a8-4e95-4edf-81b4-efbfdab9447a\") " pod="openshift-dns/dns-default-f8vb8" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.447248 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47af44a8-4e95-4edf-81b4-efbfdab9447a-config-volume\") pod \"dns-default-f8vb8\" (UID: \"47af44a8-4e95-4edf-81b4-efbfdab9447a\") " pod="openshift-dns/dns-default-f8vb8" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.447278 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbh9k\" (UniqueName: \"kubernetes.io/projected/36516401-b0b6-42c4-b444-84ee4e336839-kube-api-access-pbh9k\") pod \"multus-admission-controller-857f4d67dd-bnkx7\" (UID: \"36516401-b0b6-42c4-b444-84ee4e336839\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bnkx7" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.447299 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3de04148-0009-427b-8055-a1c5dadb8274-tmpfs\") pod \"packageserver-d55dfcdfc-6kdsp\" (UID: \"3de04148-0009-427b-8055-a1c5dadb8274\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.447316 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3de04148-0009-427b-8055-a1c5dadb8274-webhook-cert\") pod \"packageserver-d55dfcdfc-6kdsp\" (UID: \"3de04148-0009-427b-8055-a1c5dadb8274\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.447386 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3de04148-0009-427b-8055-a1c5dadb8274-apiservice-cert\") pod \"packageserver-d55dfcdfc-6kdsp\" (UID: \"3de04148-0009-427b-8055-a1c5dadb8274\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.447417 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szbbg\" (UniqueName: \"kubernetes.io/projected/28d79fc8-1fa6-4a05-aede-e9d0967dfb3b-kube-api-access-szbbg\") pod \"service-ca-operator-777779d784-r54j5\" (UID: \"28d79fc8-1fa6-4a05-aede-e9d0967dfb3b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-r54j5" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.447444 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sf62\" (UniqueName: \"kubernetes.io/projected/b16b76f2-35bb-482e-a01c-bb873ea433d9-kube-api-access-4sf62\") pod \"machine-config-server-nqtkf\" (UID: \"b16b76f2-35bb-482e-a01c-bb873ea433d9\") " pod="openshift-machine-config-operator/machine-config-server-nqtkf" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.447468 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ql9kq\" (UID: \"5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac\") " pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.447500 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/042c7f5b-da64-4f42-a2b2-58d04b73c12a-csi-data-dir\") pod \"csi-hostpathplugin-pscld\" (UID: \"042c7f5b-da64-4f42-a2b2-58d04b73c12a\") " pod="hostpath-provisioner/csi-hostpathplugin-pscld" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.447520 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0276c6fb-ba7a-459f-9610-34a03593669b-srv-cert\") pod \"olm-operator-6b444d44fb-xxj8s\" (UID: \"0276c6fb-ba7a-459f-9610-34a03593669b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.447539 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/0276c6fb-ba7a-459f-9610-34a03593669b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-xxj8s\" (UID: \"0276c6fb-ba7a-459f-9610-34a03593669b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.447601 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzkdc\" (UniqueName: \"kubernetes.io/projected/e85fa6d0-61c0-4d62-adeb-e2402e597d87-kube-api-access-fzkdc\") pod \"migrator-59844c95c7-rk4kf\" (UID: \"e85fa6d0-61c0-4d62-adeb-e2402e597d87\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rk4kf" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.447625 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/042c7f5b-da64-4f42-a2b2-58d04b73c12a-mountpoint-dir\") pod \"csi-hostpathplugin-pscld\" (UID: \"042c7f5b-da64-4f42-a2b2-58d04b73c12a\") " pod="hostpath-provisioner/csi-hostpathplugin-pscld" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.447656 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.447806 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/042c7f5b-da64-4f42-a2b2-58d04b73c12a-csi-data-dir\") pod \"csi-hostpathplugin-pscld\" (UID: \"042c7f5b-da64-4f42-a2b2-58d04b73c12a\") " pod="hostpath-provisioner/csi-hostpathplugin-pscld" Dec 03 11:05:05 crc kubenswrapper[4702]: E1203 11:05:05.448047 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:05.948014986 +0000 UTC m=+89.783943450 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.449360 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2535e580-a150-4c92-912f-142f212faebd-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-s6wcg\" (UID: \"2535e580-a150-4c92-912f-142f212faebd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6wcg" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.450350 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ql9kq\" (UID: \"5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac\") " pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.451315 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/042c7f5b-da64-4f42-a2b2-58d04b73c12a-mountpoint-dir\") pod \"csi-hostpathplugin-pscld\" (UID: \"042c7f5b-da64-4f42-a2b2-58d04b73c12a\") " pod="hostpath-provisioner/csi-hostpathplugin-pscld" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.451610 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/47af44a8-4e95-4edf-81b4-efbfdab9447a-config-volume\") pod \"dns-default-f8vb8\" (UID: \"47af44a8-4e95-4edf-81b4-efbfdab9447a\") " pod="openshift-dns/dns-default-f8vb8" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.451674 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/042c7f5b-da64-4f42-a2b2-58d04b73c12a-registration-dir\") pod \"csi-hostpathplugin-pscld\" (UID: \"042c7f5b-da64-4f42-a2b2-58d04b73c12a\") " pod="hostpath-provisioner/csi-hostpathplugin-pscld" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.452772 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28d79fc8-1fa6-4a05-aede-e9d0967dfb3b-config\") pod \"service-ca-operator-777779d784-r54j5\" (UID: \"28d79fc8-1fa6-4a05-aede-e9d0967dfb3b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-r54j5" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.460157 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77a69d25-2384-466b-b284-e36e979597b4-config-volume\") pod \"collect-profiles-29412660-4lj4n\" (UID: \"77a69d25-2384-466b-b284-e36e979597b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.460788 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-bound-sa-token\") pod 
\"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.461675 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c31210aa-a99c-4f88-a496-9f61835b4445-metrics-tls\") pod \"ingress-operator-5b745b69d9-x4pq9\" (UID: \"c31210aa-a99c-4f88-a496-9f61835b4445\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x4pq9" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.461773 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a49f8d97-9fa5-44b6-bd39-e35d4d70b33c-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vchd7\" (UID: \"a49f8d97-9fa5-44b6-bd39-e35d4d70b33c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.461808 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/36516401-b0b6-42c4-b444-84ee4e336839-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bnkx7\" (UID: \"36516401-b0b6-42c4-b444-84ee4e336839\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bnkx7" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.461976 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ee7cd19b-47c8-464d-9156-69d796be8866-signing-cabundle\") pod \"service-ca-9c57cc56f-54zxt\" (UID: \"ee7cd19b-47c8-464d-9156-69d796be8866\") " pod="openshift-service-ca/service-ca-9c57cc56f-54zxt" Dec 03 11:05:05 crc kubenswrapper[4702]: W1203 11:05:05.462499 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b38e6ac_e12a_4798_b0e9_6321dc926487.slice/crio-0779be1a36b7d6effb4777fe8adb15707957166d1d04f1f7ce89e15647554774 WatchSource:0}: Error finding container 0779be1a36b7d6effb4777fe8adb15707957166d1d04f1f7ce89e15647554774: Status 404 returned error can't find the container with id 0779be1a36b7d6effb4777fe8adb15707957166d1d04f1f7ce89e15647554774 Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.462645 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86dwp\" (UniqueName: \"kubernetes.io/projected/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-kube-api-access-86dwp\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.462817 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b5lj\" (UniqueName: \"kubernetes.io/projected/ec4c6214-773a-4bcd-ae3e-02a4d74b791a-kube-api-access-4b5lj\") pod \"machine-config-controller-84d6567774-x92bs\" (UID: \"ec4c6214-773a-4bcd-ae3e-02a4d74b791a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-x92bs" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.463870 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sh87\" (UniqueName: \"kubernetes.io/projected/c31210aa-a99c-4f88-a496-9f61835b4445-kube-api-access-5sh87\") pod \"ingress-operator-5b745b69d9-x4pq9\" (UID: 
\"c31210aa-a99c-4f88-a496-9f61835b4445\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x4pq9" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.466610 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2535e580-a150-4c92-912f-142f212faebd-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-s6wcg\" (UID: \"2535e580-a150-4c92-912f-142f212faebd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6wcg" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.468651 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3de04148-0009-427b-8055-a1c5dadb8274-apiservice-cert\") pod \"packageserver-d55dfcdfc-6kdsp\" (UID: \"3de04148-0009-427b-8055-a1c5dadb8274\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.470444 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28d79fc8-1fa6-4a05-aede-e9d0967dfb3b-serving-cert\") pod \"service-ca-operator-777779d784-r54j5\" (UID: \"28d79fc8-1fa6-4a05-aede-e9d0967dfb3b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-r54j5" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.471622 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3de04148-0009-427b-8055-a1c5dadb8274-webhook-cert\") pod \"packageserver-d55dfcdfc-6kdsp\" (UID: \"3de04148-0009-427b-8055-a1c5dadb8274\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.472082 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/0276c6fb-ba7a-459f-9610-34a03593669b-srv-cert\") pod \"olm-operator-6b444d44fb-xxj8s\" (UID: \"0276c6fb-ba7a-459f-9610-34a03593669b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.472970 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b16b76f2-35bb-482e-a01c-bb873ea433d9-node-bootstrap-token\") pod \"machine-config-server-nqtkf\" (UID: \"b16b76f2-35bb-482e-a01c-bb873ea433d9\") " pod="openshift-machine-config-operator/machine-config-server-nqtkf" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.474686 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ee7cd19b-47c8-464d-9156-69d796be8866-signing-key\") pod \"service-ca-9c57cc56f-54zxt\" (UID: \"ee7cd19b-47c8-464d-9156-69d796be8866\") " pod="openshift-service-ca/service-ca-9c57cc56f-54zxt" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.474943 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e9a6a2d-0a6d-42a1-a44b-a82ee572b995-cert\") pod \"ingress-canary-bpfvn\" (UID: \"7e9a6a2d-0a6d-42a1-a44b-a82ee572b995\") " pod="openshift-ingress-canary/ingress-canary-bpfvn" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.475583 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/0276c6fb-ba7a-459f-9610-34a03593669b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-xxj8s\" (UID: \"0276c6fb-ba7a-459f-9610-34a03593669b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.476076 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47af44a8-4e95-4edf-81b4-efbfdab9447a-metrics-tls\") pod \"dns-default-f8vb8\" (UID: \"47af44a8-4e95-4edf-81b4-efbfdab9447a\") " pod="openshift-dns/dns-default-f8vb8" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.476157 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpbbp\" (UniqueName: \"kubernetes.io/projected/3de04148-0009-427b-8055-a1c5dadb8274-kube-api-access-tpbbp\") pod \"packageserver-d55dfcdfc-6kdsp\" (UID: \"3de04148-0009-427b-8055-a1c5dadb8274\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.476613 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ql9kq\" (UID: \"5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac\") " pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.479209 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77a69d25-2384-466b-b284-e36e979597b4-secret-volume\") pod \"collect-profiles-29412660-4lj4n\" (UID: \"77a69d25-2384-466b-b284-e36e979597b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.480092 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b16b76f2-35bb-482e-a01c-bb873ea433d9-certs\") pod \"machine-config-server-nqtkf\" (UID: \"b16b76f2-35bb-482e-a01c-bb873ea433d9\") " pod="openshift-machine-config-operator/machine-config-server-nqtkf" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.488334 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcdjp\" (UniqueName: \"kubernetes.io/projected/77a69d25-2384-466b-b284-e36e979597b4-kube-api-access-zcdjp\") pod \"collect-profiles-29412660-4lj4n\" (UID: \"77a69d25-2384-466b-b284-e36e979597b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.497294 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x4pq9" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.495298 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6t46\" (UniqueName: \"kubernetes.io/projected/a49f8d97-9fa5-44b6-bd39-e35d4d70b33c-kube-api-access-d6t46\") pod \"package-server-manager-789f6589d5-vchd7\" (UID: \"a49f8d97-9fa5-44b6-bd39-e35d4d70b33c\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.502271 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm685\" (UniqueName: \"kubernetes.io/projected/ee7cd19b-47c8-464d-9156-69d796be8866-kube-api-access-mm685\") pod \"service-ca-9c57cc56f-54zxt\" (UID: \"ee7cd19b-47c8-464d-9156-69d796be8866\") " pod="openshift-service-ca/service-ca-9c57cc56f-54zxt" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.505701 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-x92bs" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.521307 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rpqq\" (UniqueName: \"kubernetes.io/projected/7e9a6a2d-0a6d-42a1-a44b-a82ee572b995-kube-api-access-9rpqq\") pod \"ingress-canary-bpfvn\" (UID: \"7e9a6a2d-0a6d-42a1-a44b-a82ee572b995\") " pod="openshift-ingress-canary/ingress-canary-bpfvn" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.537730 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qxpq4"] Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.540855 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9dts\" (UniqueName: \"kubernetes.io/projected/2535e580-a150-4c92-912f-142f212faebd-kube-api-access-s9dts\") pod \"kube-storage-version-migrator-operator-b67b599dd-s6wcg\" (UID: \"2535e580-a150-4c92-912f-142f212faebd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6wcg" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.551450 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:05 crc kubenswrapper[4702]: E1203 11:05:05.551938 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:06.051902118 +0000 UTC m=+89.887830582 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.559236 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vzhk\" (UniqueName: \"kubernetes.io/projected/47af44a8-4e95-4edf-81b4-efbfdab9447a-kube-api-access-4vzhk\") pod \"dns-default-f8vb8\" (UID: \"47af44a8-4e95-4edf-81b4-efbfdab9447a\") " pod="openshift-dns/dns-default-f8vb8" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.563334 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.563342 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6wcg" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.570553 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.577107 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-54zxt" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.586144 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.609047 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sf62\" (UniqueName: \"kubernetes.io/projected/b16b76f2-35bb-482e-a01c-bb873ea433d9-kube-api-access-4sf62\") pod \"machine-config-server-nqtkf\" (UID: \"b16b76f2-35bb-482e-a01c-bb873ea433d9\") " pod="openshift-machine-config-operator/machine-config-server-nqtkf" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.609735 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szbbg\" (UniqueName: \"kubernetes.io/projected/28d79fc8-1fa6-4a05-aede-e9d0967dfb3b-kube-api-access-szbbg\") pod \"service-ca-operator-777779d784-r54j5\" (UID: \"28d79fc8-1fa6-4a05-aede-e9d0967dfb3b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-r54j5" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.617584 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-g8wwh"] Dec 03 11:05:05 crc kubenswrapper[4702]: W1203 11:05:05.618079 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5c1c8f7_5c07_4735_aa10_a7885f5bec8f.slice/crio-c4819da7c72c6929605e6ae8d5a9dc0c241dc441f6a7d57cf120a2d3c8a7240f WatchSource:0}: Error finding container c4819da7c72c6929605e6ae8d5a9dc0c241dc441f6a7d57cf120a2d3c8a7240f: Status 404 returned error can't find the container with id c4819da7c72c6929605e6ae8d5a9dc0c241dc441f6a7d57cf120a2d3c8a7240f Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.634721 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-f8vb8" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.639845 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tvkh\" (UniqueName: \"kubernetes.io/projected/0276c6fb-ba7a-459f-9610-34a03593669b-kube-api-access-9tvkh\") pod \"olm-operator-6b444d44fb-xxj8s\" (UID: \"0276c6fb-ba7a-459f-9610-34a03593669b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.648384 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bpfvn" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.651294 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-nqtkf" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.652427 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: E1203 11:05:05.652982 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:06.152969975 +0000 UTC m=+89.988898439 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.658345 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbh9k\" (UniqueName: \"kubernetes.io/projected/36516401-b0b6-42c4-b444-84ee4e336839-kube-api-access-pbh9k\") pod \"multus-admission-controller-857f4d67dd-bnkx7\" (UID: \"36516401-b0b6-42c4-b444-84ee4e336839\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bnkx7" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.679339 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzkdc\" (UniqueName: \"kubernetes.io/projected/e85fa6d0-61c0-4d62-adeb-e2402e597d87-kube-api-access-fzkdc\") pod \"migrator-59844c95c7-rk4kf\" (UID: \"e85fa6d0-61c0-4d62-adeb-e2402e597d87\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rk4kf" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.705522 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68dr2\" (UniqueName: \"kubernetes.io/projected/042c7f5b-da64-4f42-a2b2-58d04b73c12a-kube-api-access-68dr2\") pod \"csi-hostpathplugin-pscld\" (UID: \"042c7f5b-da64-4f42-a2b2-58d04b73c12a\") " pod="hostpath-provisioner/csi-hostpathplugin-pscld" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.725852 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x5ghh"] Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.727372 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dw786" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.739134 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qb4fg\" (UniqueName: \"kubernetes.io/projected/5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac-kube-api-access-qb4fg\") pod \"marketplace-operator-79b997595-ql9kq\" (UID: \"5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac\") " pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.755919 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:05 crc kubenswrapper[4702]: E1203 11:05:05.756121 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:06.25608675 +0000 UTC m=+90.092015274 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.756191 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: E1203 11:05:05.756670 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:06.256660191 +0000 UTC m=+90.092588655 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.860447 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-trxrx" event={"ID":"d49febe3-b867-419d-955c-a9a7b0a658c3","Type":"ContainerStarted","Data":"e72010909b51b5fd123e47ea7cca7f18b423a6c2046a1d002a9dcdea1eb37415"} Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.861707 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:05 crc kubenswrapper[4702]: E1203 11:05:05.862855 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:06.362407341 +0000 UTC m=+90.198335805 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.864610 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rk4kf" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.868793 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-x85q2" event={"ID":"2b38e6ac-e12a-4798-b0e9-6321dc926487","Type":"ContainerStarted","Data":"a0c95ae1cc071401df4dbae2a9b4cb0862a47a5d82d5152774e1cd848052af53"} Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.868851 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-x85q2" event={"ID":"2b38e6ac-e12a-4798-b0e9-6321dc926487","Type":"ContainerStarted","Data":"0779be1a36b7d6effb4777fe8adb15707957166d1d04f1f7ce89e15647554774"} Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.870572 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qzzlq" event={"ID":"906f2138-1584-4399-8c87-d03f3231ddc7","Type":"ContainerStarted","Data":"b12b8d31540ea9d9ee06623845ea9b89649754657e0e87fa551c4241c05ecc4a"} Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.870698 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qzzlq" event={"ID":"906f2138-1584-4399-8c87-d03f3231ddc7","Type":"ContainerStarted","Data":"dcf07da8e612defa003808b9633ab1e1f592d7d1d526fcbe5f45ce884a59891d"} Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.871913 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qxpq4" event={"ID":"e5c1c8f7-5c07-4735-aa10-a7885f5bec8f","Type":"ContainerStarted","Data":"c4819da7c72c6929605e6ae8d5a9dc0c241dc441f6a7d57cf120a2d3c8a7240f"} Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.892517 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-r54j5" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.903699 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.911820 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-bnkx7" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.925924 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.963952 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:05 crc kubenswrapper[4702]: E1203 11:05:05.964493 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:06.464467226 +0000 UTC m=+90.300395740 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:05 crc kubenswrapper[4702]: I1203 11:05:05.976852 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-pscld" Dec 03 11:05:06 crc kubenswrapper[4702]: I1203 11:05:06.065498 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:06 crc kubenswrapper[4702]: E1203 11:05:06.065700 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:06.56566963 +0000 UTC m=+90.401598104 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:06 crc kubenswrapper[4702]: I1203 11:05:06.066044 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:06 crc kubenswrapper[4702]: E1203 11:05:06.066743 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:06.566729889 +0000 UTC m=+90.402658423 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:06 crc kubenswrapper[4702]: I1203 11:05:06.167948 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:06 crc kubenswrapper[4702]: E1203 11:05:06.168686 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:06.66866724 +0000 UTC m=+90.504595704 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:06 crc kubenswrapper[4702]: I1203 11:05:06.170916 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 11:05:06 crc kubenswrapper[4702]: I1203 11:05:06.265306 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fppws"] Dec 03 11:05:06 crc kubenswrapper[4702]: I1203 11:05:06.269942 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv"] Dec 03 11:05:06 crc kubenswrapper[4702]: I1203 11:05:06.271818 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:06 crc kubenswrapper[4702]: E1203 11:05:06.272286 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:06.772270292 +0000 UTC m=+90.608198756 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:06 crc kubenswrapper[4702]: I1203 11:05:06.274954 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk"] Dec 03 11:05:06 crc kubenswrapper[4702]: I1203 11:05:06.372679 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:06 crc kubenswrapper[4702]: E1203 11:05:06.372881 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:06.872849082 +0000 UTC m=+90.708777546 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:06 crc kubenswrapper[4702]: I1203 11:05:06.373071 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:06 crc kubenswrapper[4702]: E1203 11:05:06.373515 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:06.873506516 +0000 UTC m=+90.709434980 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:06 crc kubenswrapper[4702]: I1203 11:05:06.474164 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:06 crc kubenswrapper[4702]: E1203 11:05:06.474445 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:06.97443014 +0000 UTC m=+90.810358604 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:06 crc kubenswrapper[4702]: I1203 11:05:06.575712 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:06 crc kubenswrapper[4702]: E1203 11:05:06.576696 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:07.076677042 +0000 UTC m=+90.912605506 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:06 crc kubenswrapper[4702]: I1203 11:05:06.666985 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 11:05:06 crc kubenswrapper[4702]: [-]has-synced failed: reason withheld Dec 03 11:05:06 crc kubenswrapper[4702]: [+]process-running ok Dec 03 11:05:06 crc kubenswrapper[4702]: healthz check failed Dec 03 11:05:06 crc kubenswrapper[4702]: I1203 11:05:06.667056 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 11:05:06 crc kubenswrapper[4702]: I1203 11:05:06.677458 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:06 crc kubenswrapper[4702]: E1203 11:05:06.677825 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:07.177809943 +0000 UTC m=+91.013738407 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:06 crc kubenswrapper[4702]: I1203 11:05:06.789816 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:06 crc kubenswrapper[4702]: E1203 11:05:06.790397 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:07.290381444 +0000 UTC m=+91.126309908 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:06 crc kubenswrapper[4702]: I1203 11:05:06.899686 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:06 crc kubenswrapper[4702]: E1203 11:05:06.900179 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:07.400161333 +0000 UTC m=+91.236089797 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.003732 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:07 crc kubenswrapper[4702]: E1203 11:05:07.005224 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:07.505198177 +0000 UTC m=+91.341126641 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.100193 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qzzlq" event={"ID":"906f2138-1584-4399-8c87-d03f3231ddc7","Type":"ContainerStarted","Data":"576efa62dc44a147a1b481a1d2b720ce32303c0f996c59b68600bfbf07842978"} Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.100259 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" event={"ID":"d7be37fc-7374-46ae-a0e3-1cafab3430ec","Type":"ContainerStarted","Data":"d8922a7bc6a9dcdaedb2069f193fe8774fee4bfdfdbef53aec1ea143ec30abf7"} Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.100274 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.100291 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" event={"ID":"2c99e1fd-b0d0-418c-bb67-f638f06978f2","Type":"ContainerStarted","Data":"47242111bf043739cd815bd126838db9befac36e379f9cbcf882cd2ddd32d21f"} Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.100303 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" event={"ID":"2c99e1fd-b0d0-418c-bb67-f638f06978f2","Type":"ContainerStarted","Data":"78a64ebe5201bc8463e35c3756c37f28d62fba9956894c5dbb7fbeeee9158870"} Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.100313 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-nqtkf" event={"ID":"b16b76f2-35bb-482e-a01c-bb873ea433d9","Type":"ContainerStarted","Data":"5a640b106db51a4dbbe7dc184c2e7f76977bd866268129d82fdeb91233aa82f6"} Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.100323 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-nqtkf" event={"ID":"b16b76f2-35bb-482e-a01c-bb873ea433d9","Type":"ContainerStarted","Data":"05dc252a6aee60b6aa7042efff5f613a82bbb4b744c51364183f29724ea8f5c4"} Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.100336 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qxpq4" event={"ID":"e5c1c8f7-5c07-4735-aa10-a7885f5bec8f","Type":"ContainerStarted","Data":"c9376ea01f4afc44ee2a89a7ecde144accb88d30dcdf781c3fe23f07bde1899b"} Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.100350 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh" event={"ID":"01e24add-292c-4a3c-8a32-75ceb16ced89","Type":"ContainerStarted","Data":"3c736beb0a94c850d81f9e03417faca41e73b0a0594aa5044fb92ee81e9ac902"} Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.100360 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh" 
event={"ID":"01e24add-292c-4a3c-8a32-75ceb16ced89","Type":"ContainerStarted","Data":"b9826ef68ee0d35bd968d5472abea614e38a01804c8f07a3553788bebcb63b31"} Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.101407 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk" event={"ID":"be3d3f8d-f407-4b54-8e9b-a5b526babb52","Type":"ContainerStarted","Data":"9235ef0f9c65cc4eb3dc8d9645aeec798b539946d44c9abfcff3b1e85a226017"} Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.109411 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:07 crc kubenswrapper[4702]: E1203 11:05:07.109495 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:07.609477154 +0000 UTC m=+91.445405618 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.110424 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:07 crc kubenswrapper[4702]: E1203 11:05:07.113148 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:07.613127978 +0000 UTC m=+91.449056442 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.163933 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-gzc7n"] Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.211848 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:07 crc kubenswrapper[4702]: E1203 11:05:07.212303 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:07.712267556 +0000 UTC m=+91.548196020 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.212699 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:07 crc kubenswrapper[4702]: E1203 11:05:07.213210 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:07.71320208 +0000 UTC m=+91.549130544 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.290918 4702 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-x5ghh container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.6:6443/healthz\": dial tcp 10.217.0.6:6443: connect: connection refused" start-of-body= Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.291148 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" podUID="2c99e1fd-b0d0-418c-bb67-f638f06978f2" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.6:6443/healthz\": dial tcp 10.217.0.6:6443: connect: connection refused" Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.306516 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-g8wwh" podStartSLOduration=68.306491573 podStartE2EDuration="1m8.306491573s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:07.305126183 +0000 UTC m=+91.141054647" watchObservedRunningTime="2025-12-03 11:05:07.306491573 +0000 UTC m=+91.142420037" Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.314342 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:07 crc kubenswrapper[4702]: E1203 11:05:07.314972 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:07.814950884 +0000 UTC m=+91.650879348 (durationBeforeRetry 500ms). 
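The oauth-openshift readiness failure here is the transient kind: "connect: connection refused" means nothing is listening on 10.217.0.6:6443 yet, so the pod simply stays out of service endpoints until the probe passes (it flips to ready at 11:05:08 below). Roughly what such an HTTP probe does, sketched with a made-up helper name (probeHTTP) and the URL from the log:

    package main

    import (
        "crypto/tls"
        "fmt"
        "net/http"
        "time"
    )

    // probeHTTP approximates a kubelet-style HTTPS GET probe: transport
    // errors (e.g. connection refused) and 4xx/5xx statuses are failures.
    func probeHTTP(url string) error {
        client := &http.Client{
            Timeout: time.Second,
            // Kubelet HTTPS probes do not verify the serving certificate.
            Transport: &http.Transport{TLSClientConfig: &tls.Config{InsecureSkipVerify: true}},
        }
        resp, err := client.Get(url)
        if err != nil {
            return err // e.g. "dial tcp 10.217.0.6:6443: connect: connection refused"
        }
        defer resp.Body.Close()
        if resp.StatusCode >= 400 {
            return fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
        }
        return nil
    }

    func main() {
        if err := probeHTTP("https://10.217.0.6:6443/healthz"); err != nil {
            fmt.Println("Probe failed:", err)
        }
    }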
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.416626 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:07 crc kubenswrapper[4702]: E1203 11:05:07.418027 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:07.918011926 +0000 UTC m=+91.753940390 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.422803 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-qzzlq" podStartSLOduration=69.422749789 podStartE2EDuration="1m9.422749789s" podCreationTimestamp="2025-12-03 11:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:07.416456738 +0000 UTC m=+91.252385212" watchObservedRunningTime="2025-12-03 11:05:07.422749789 +0000 UTC m=+91.258678263" Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.500791 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" podStartSLOduration=69.500753022 podStartE2EDuration="1m9.500753022s" podCreationTimestamp="2025-12-03 11:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:07.497270994 +0000 UTC m=+91.333199458" watchObservedRunningTime="2025-12-03 11:05:07.500753022 +0000 UTC m=+91.336681496" Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.518283 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:07 crc kubenswrapper[4702]: E1203 11:05:07.518675 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:08.018654379 +0000 UTC m=+91.854582853 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.537082 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-nqtkf" podStartSLOduration=5.537063314 podStartE2EDuration="5.537063314s" podCreationTimestamp="2025-12-03 11:05:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:07.536854537 +0000 UTC m=+91.372783001" watchObservedRunningTime="2025-12-03 11:05:07.537063314 +0000 UTC m=+91.372991778" Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.618072 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-x85q2" podStartSLOduration=68.618055306 podStartE2EDuration="1m8.618055306s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:07.615254193 +0000 UTC m=+91.451182667" watchObservedRunningTime="2025-12-03 11:05:07.618055306 +0000 UTC m=+91.453983770" Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.619516 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:07 crc kubenswrapper[4702]: E1203 11:05:07.620099 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:08.120084141 +0000 UTC m=+91.956012605 (durationBeforeRetry 500ms). 
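The pod_startup_latency_tracker entries are easier to read once you notice both pull timestamps are the zero value ("0001-01-01 00:00:00 +0000 UTC"): no image pull contributed, so podStartSLOduration equals podStartE2EDuration, which is just observedRunningTime minus podCreationTimestamp. Checking the etcd-operator line above against its logged timestamps:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        created, _ := time.Parse(time.RFC3339, "2025-12-03T11:03:59Z")
        running, _ := time.Parse(time.RFC3339Nano, "2025-12-03T11:05:07.306491573Z")
        // With zero-valued pull timestamps, SLO duration == E2E duration.
        fmt.Println(running.Sub(created)) // 1m8.306491573s, matching the log
    }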
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.720583 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:07 crc kubenswrapper[4702]: E1203 11:05:07.721035 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:08.221018365 +0000 UTC m=+92.056946829 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.740256 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 11:05:07 crc kubenswrapper[4702]: [-]has-synced failed: reason withheld Dec 03 11:05:07 crc kubenswrapper[4702]: [+]process-running ok Dec 03 11:05:07 crc kubenswrapper[4702]: healthz check failed Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.740323 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.827955 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:07 crc kubenswrapper[4702]: E1203 11:05:07.828400 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:08.328380004 +0000 UTC m=+92.164308488 (durationBeforeRetry 500ms). 
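The router startup-probe output above is the conventional healthz multi-check format: each named check renders as "[+]name ok" or "[-]name failed: reason withheld" (failure reasons are typically hidden from unauthenticated probes), and any single failing check turns the whole endpoint into a 500, hence "HTTP probe failed with statuscode: 500". A sketch of that aggregation, reusing the check names from the log (the handler is illustrative, not the router's code):

    package main

    import (
        "fmt"
        "net/http"
    )

    // healthzHandler aggregates named checks; one failure makes the whole
    // response a 500, mirroring the probe output seen in this log.
    func healthzHandler(checks map[string]func() error) http.HandlerFunc {
        return func(w http.ResponseWriter, r *http.Request) {
            body, failed := "", false
            for name, check := range checks {
                if err := check(); err != nil {
                    failed = true
                    body += fmt.Sprintf("[-]%s failed: reason withheld\n", name)
                } else {
                    body += fmt.Sprintf("[+]%s ok\n", name)
                }
            }
            if failed {
                w.WriteHeader(http.StatusInternalServerError)
                fmt.Fprint(w, body+"healthz check failed\n")
                return
            }
            fmt.Fprint(w, body+"ok\n")
        }
    }

    func main() {
        http.Handle("/healthz", healthzHandler(map[string]func() error{
            "backend-http":    func() error { return fmt.Errorf("no backends synced") },
            "has-synced":      func() error { return fmt.Errorf("not synced") },
            "process-running": func() error { return nil },
        }))
        _ = http.ListenAndServe(":8080", nil)
    }

(Go map iteration order is unspecified, so a real implementation would keep its checks in a slice to get stable output ordering.)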
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:07 crc kubenswrapper[4702]: I1203 11:05:07.931181 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:07 crc kubenswrapper[4702]: E1203 11:05:07.931694 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:08.431676935 +0000 UTC m=+92.267605399 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.035787 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:08 crc kubenswrapper[4702]: E1203 11:05:08.037861 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:08.537839691 +0000 UTC m=+92.373768155 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.120285 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qxpq4" event={"ID":"e5c1c8f7-5c07-4735-aa10-a7885f5bec8f","Type":"ContainerStarted","Data":"9dd742feff708c1703e30a24e8b9eef434d48bdaac85deca96fd1d730860b5dd"} Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.131103 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk" event={"ID":"be3d3f8d-f407-4b54-8e9b-a5b526babb52","Type":"ContainerStarted","Data":"ea1ff2051c0064c65a742815bd90820931416cbb630d35ba60434803d3bae824"} Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.131851 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk" Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.139557 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-gzc7n" event={"ID":"b8e73047-6376-4bd9-8ec8-5966f8786e5d","Type":"ContainerStarted","Data":"4a617f2477373e1396bd98e1a373d38f385e333bbb9b2a9afc28cd37cf533c13"} Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.140367 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:08 crc kubenswrapper[4702]: E1203 11:05:08.140518 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:08.640497498 +0000 UTC m=+92.476425962 (durationBeforeRetry 500ms). 
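The "SyncLoop (PLEG)" bursts are the pod lifecycle event generator relisting container state and emitting one event per observed transition: Type names the transition and Data carries the container (or sandbox) ID, which is why a pod starting several containers produces several ContainerStarted lines. An illustrative decoding of the logged triple (the struct mirrors the printed fields; it is not the kubelet's actual type):

    package main

    import "fmt"

    // podLifecycleEvent mirrors the {ID, Type, Data} shape printed in the log.
    type podLifecycleEvent struct {
        ID   string // pod UID
        Type string // "ContainerStarted", "ContainerDied", ...
        Data string // container or sandbox ID the event refers to
    }

    func handle(ev podLifecycleEvent) {
        switch ev.Type {
        case "ContainerStarted":
            fmt.Printf("pod %s: container %s is running\n", ev.ID, ev.Data)
        case "ContainerDied":
            // A die with exitCode=0, like the init-container finishes logged
            // nearby, is a normal completion rather than a crash.
            fmt.Printf("pod %s: container %s exited\n", ev.ID, ev.Data)
        }
    }

    func main() {
        handle(podLifecycleEvent{
            ID:   "d7be37fc-7374-46ae-a0e3-1cafab3430ec",
            Type: "ContainerDied",
            Data: "79dc5aa08bbd876ddebf99064f8e9f280fe653bb8a73230bd8366d7ebd13e567",
        })
    }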
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.140780 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:08 crc kubenswrapper[4702]: E1203 11:05:08.141088 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:08.641081219 +0000 UTC m=+92.477009683 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.144482 4702 generic.go:334] "Generic (PLEG): container finished" podID="d7be37fc-7374-46ae-a0e3-1cafab3430ec" containerID="79dc5aa08bbd876ddebf99064f8e9f280fe653bb8a73230bd8366d7ebd13e567" exitCode=0 Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.144547 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" event={"ID":"d7be37fc-7374-46ae-a0e3-1cafab3430ec","Type":"ContainerDied","Data":"79dc5aa08bbd876ddebf99064f8e9f280fe653bb8a73230bd8366d7ebd13e567"} Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.149908 4702 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-s62lk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.149960 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk" podUID="be3d3f8d-f407-4b54-8e9b-a5b526babb52" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.153278 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fppws" event={"ID":"afd002b9-3309-4286-a7cb-29c2f4817feb","Type":"ContainerStarted","Data":"996e883e67868e791c28b8424ceff0de8bb5d64a61bc83aa791025f821b72e3c"} Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.156904 4702 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fppws" event={"ID":"afd002b9-3309-4286-a7cb-29c2f4817feb","Type":"ContainerStarted","Data":"cfce9c87f35fef9624c2438a22c5a9fba1806fc9663d169d38c0671da8ff1303"} Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.165077 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-trxrx" podStartSLOduration=70.165052789 podStartE2EDuration="1m10.165052789s" podCreationTimestamp="2025-12-03 11:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:07.667098346 +0000 UTC m=+91.503026810" watchObservedRunningTime="2025-12-03 11:05:08.165052789 +0000 UTC m=+92.000981253" Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.171574 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 11:05:08 crc kubenswrapper[4702]: [-]has-synced failed: reason withheld Dec 03 11:05:08 crc kubenswrapper[4702]: [+]process-running ok Dec 03 11:05:08 crc kubenswrapper[4702]: healthz check failed Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.171657 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.185257 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.203028 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-qxpq4" podStartSLOduration=69.203002121 podStartE2EDuration="1m9.203002121s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:08.164215638 +0000 UTC m=+92.000144102" watchObservedRunningTime="2025-12-03 11:05:08.203002121 +0000 UTC m=+92.038930605" Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.242495 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:08 crc kubenswrapper[4702]: E1203 11:05:08.243731 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:08.743706895 +0000 UTC m=+92.579635409 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.293314 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fppws" podStartSLOduration=70.293287935 podStartE2EDuration="1m10.293287935s" podCreationTimestamp="2025-12-03 11:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:08.292190574 +0000 UTC m=+92.128119038" watchObservedRunningTime="2025-12-03 11:05:08.293287935 +0000 UTC m=+92.129216399" Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.293587 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk" podStartSLOduration=69.293578445 podStartE2EDuration="1m9.293578445s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:08.250185913 +0000 UTC m=+92.086114387" watchObservedRunningTime="2025-12-03 11:05:08.293578445 +0000 UTC m=+92.129506909" Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.346596 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:08 crc kubenswrapper[4702]: E1203 11:05:08.347348 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:08.847331478 +0000 UTC m=+92.683259952 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.434072 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nmmt2"] Dec 03 11:05:08 crc kubenswrapper[4702]: W1203 11:05:08.446927 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc476661_200c_4feb_8b45_fe203009356a.slice/crio-677c65da17956af5a528e4c158485745d148f49a104fd894ce05babf7b01512b WatchSource:0}: Error finding container 677c65da17956af5a528e4c158485745d148f49a104fd894ce05babf7b01512b: Status 404 returned error can't find the container with id 677c65da17956af5a528e4c158485745d148f49a104fd894ce05babf7b01512b Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.447022 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6tdv"] Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.447572 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:08 crc kubenswrapper[4702]: E1203 11:05:08.447737 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:08.947701801 +0000 UTC m=+92.783630265 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.447875 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:08 crc kubenswrapper[4702]: E1203 11:05:08.448857 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:08.948847273 +0000 UTC m=+92.784775737 (durationBeforeRetry 500ms). 
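The W-level "Failed to process watch event ... Status 404" entries are a benign startup race: the cgroup watcher fires as soon as CRI-O creates a container's cgroup, and the container lookup can run before the runtime exposes the new container, so the first event for it may find nothing. A sketch of the tolerant handling such a watcher needs (lookupContainer and the sentinel error are illustrative names):

    package main

    import (
        "errors"
        "fmt"
    )

    var errNotFound = errors.New("can't find the container") // illustrative sentinel

    // lookupContainer stands in for the runtime query that 404s while the
    // container is still being created.
    func lookupContainer(id string, ready bool) error {
        if !ready {
            return errNotFound
        }
        return nil
    }

    // processWatchEvent drops a not-found instead of treating it as fatal:
    // the next cgroup event for the same container retries the lookup.
    func processWatchEvent(id string, ready bool) {
        if err := lookupContainer(id, ready); errors.Is(err, errNotFound) {
            fmt.Printf("W: watch event for %s: %v (a later event will retry)\n", id, err)
            return
        }
        fmt.Printf("I: container %s tracked\n", id)
    }

    func main() {
        processWatchEvent("677c65da1795", false) // race: cgroup exists, runtime not ready yet
        processWatchEvent("677c65da1795", true)  // a subsequent event succeeds
    }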
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.463892 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5xk7q"] Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.482794 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5"] Dec 03 11:05:08 crc kubenswrapper[4702]: W1203 11:05:08.513228 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda456460b_47a3_48ef_a98b_4f67709d5939.slice/crio-03b2338581b8411dabf00e25adfe2ea5bc9167179c5c6b638da9cd71a0a81386 WatchSource:0}: Error finding container 03b2338581b8411dabf00e25adfe2ea5bc9167179c5c6b638da9cd71a0a81386: Status 404 returned error can't find the container with id 03b2338581b8411dabf00e25adfe2ea5bc9167179c5c6b638da9cd71a0a81386 Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.549775 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:08 crc kubenswrapper[4702]: E1203 11:05:08.550171 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:09.050114209 +0000 UTC m=+92.886042663 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.607775 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-v6p66"] Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.651317 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:08 crc kubenswrapper[4702]: E1203 11:05:08.651689 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-03 11:05:09.151673756 +0000 UTC m=+92.987602220 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.661009 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b8s4f"] Dec 03 11:05:08 crc kubenswrapper[4702]: W1203 11:05:08.683362 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bad766d_e524_4670_b353_56e92df2f744.slice/crio-42a0ef0202c96a6ae339ea252b41cc19180378ad202e48bfab5bf6e892bbff40 WatchSource:0}: Error finding container 42a0ef0202c96a6ae339ea252b41cc19180378ad202e48bfab5bf6e892bbff40: Status 404 returned error can't find the container with id 42a0ef0202c96a6ae339ea252b41cc19180378ad202e48bfab5bf6e892bbff40 Dec 03 11:05:08 crc kubenswrapper[4702]: W1203 11:05:08.685356 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb0547ad_edd5_4b70_af0f_9f606793e6a9.slice/crio-cea172ce63fbfa4948d9224bc24ad47471ea40a9c9614b4bfddde242cf4e911c WatchSource:0}: Error finding container cea172ce63fbfa4948d9224bc24ad47471ea40a9c9614b4bfddde242cf4e911c: Status 404 returned error can't find the container with id cea172ce63fbfa4948d9224bc24ad47471ea40a9c9614b4bfddde242cf4e911c Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.757511 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:08 crc kubenswrapper[4702]: E1203 11:05:08.758037 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:09.258013408 +0000 UTC m=+93.093941872 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.831328 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zthg4"] Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.839473 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-njlsm"] Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.842826 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-x92bs"] Dec 03 11:05:08 crc kubenswrapper[4702]: W1203 11:05:08.849371 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bce3a1f_cd2e_41b9_b768_322f3ce72ed9.slice/crio-8a33cb29b6141890267bbf11029a24ece4a0ced40fd5203d72ab41585b1cff2b WatchSource:0}: Error finding container 8a33cb29b6141890267bbf11029a24ece4a0ced40fd5203d72ab41585b1cff2b: Status 404 returned error can't find the container with id 8a33cb29b6141890267bbf11029a24ece4a0ced40fd5203d72ab41585b1cff2b Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.859099 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:08 crc kubenswrapper[4702]: E1203 11:05:08.859521 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:09.359506913 +0000 UTC m=+93.195435377 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.864200 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-nnrsp"] Dec 03 11:05:08 crc kubenswrapper[4702]: W1203 11:05:08.882500 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec4c6214_773a_4bcd_ae3e_02a4d74b791a.slice/crio-25322ca89e1f88def2ba7372ad65f9fa4345045ae693798c98ec590865b2e939 WatchSource:0}: Error finding container 25322ca89e1f88def2ba7372ad65f9fa4345045ae693798c98ec590865b2e939: Status 404 returned error can't find the container with id 25322ca89e1f88def2ba7372ad65f9fa4345045ae693798c98ec590865b2e939 Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.914395 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pp87r"] Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.924338 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf"] Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.954482 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ql9kq"] Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.959707 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:08 crc kubenswrapper[4702]: E1203 11:05:08.960240 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:09.460221438 +0000 UTC m=+93.296149902 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.960394 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dw786"] Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.971245 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp"] Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.977173 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n"] Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.979892 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-x4pq9"] Dec 03 11:05:08 crc kubenswrapper[4702]: W1203 11:05:08.986484 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3895be1f_db04_45b3_bd8c_cf2ab8c2aa43.slice/crio-596407f77e02d67341bc1bdba0bbdcd55d5f7ce34d51e51ea436882662af044b WatchSource:0}: Error finding container 596407f77e02d67341bc1bdba0bbdcd55d5f7ce34d51e51ea436882662af044b: Status 404 returned error can't find the container with id 596407f77e02d67341bc1bdba0bbdcd55d5f7ce34d51e51ea436882662af044b Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.988740 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-r54j5"] Dec 03 11:05:08 crc kubenswrapper[4702]: I1203 11:05:08.993219 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s"] Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:08.999968 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7"] Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.003129 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pg9xx"] Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.004978 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-rk4kf"] Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.008831 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ccdtg"] Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.008905 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-54zxt"] Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.010933 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-f8vb8"] Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.012506 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-hndf6"] Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.019282 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-bnkx7"] Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.070476 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:09 crc kubenswrapper[4702]: E1203 11:05:09.080884 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:09.580862555 +0000 UTC m=+93.416791019 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:09 crc kubenswrapper[4702]: W1203 11:05:09.121341 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28d79fc8_1fa6_4a05_aede_e9d0967dfb3b.slice/crio-d33095936c1c9913c8309fa607f7c02f20f8d3b92c69d2e61bea82040a530ac2 WatchSource:0}: Error finding container d33095936c1c9913c8309fa607f7c02f20f8d3b92c69d2e61bea82040a530ac2: Status 404 returned error can't find the container with id d33095936c1c9913c8309fa607f7c02f20f8d3b92c69d2e61bea82040a530ac2 Dec 03 11:05:09 crc kubenswrapper[4702]: W1203 11:05:09.134656 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77a69d25_2384_466b_b284_e36e979597b4.slice/crio-9f313546a49233d1908e3f6bfa418f5cb8d5cf67b481edfec850145c77e9bd9f WatchSource:0}: Error finding container 9f313546a49233d1908e3f6bfa418f5cb8d5cf67b481edfec850145c77e9bd9f: Status 404 returned error can't find the container with id 9f313546a49233d1908e3f6bfa418f5cb8d5cf67b481edfec850145c77e9bd9f Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.160555 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-rh69s"] Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.162196 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ccdtg" event={"ID":"761a509d-5cb8-4506-901c-614a7d633d39","Type":"ContainerStarted","Data":"81b49c145d916548d1b7a2037d78495ddc381bc2e200abb5a89d121d0f50caf2"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.178196 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 11:05:09 crc kubenswrapper[4702]: [-]has-synced failed: reason withheld Dec 03 11:05:09 crc kubenswrapper[4702]: [+]process-running ok Dec 03 11:05:09 crc kubenswrapper[4702]: healthz check failed Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.178291 4702 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.182811 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:09 crc kubenswrapper[4702]: E1203 11:05:09.183090 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:09.683046175 +0000 UTC m=+93.518974639 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.188606 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.184708 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bpfvn"] Dec 03 11:05:09 crc kubenswrapper[4702]: E1203 11:05:09.189484 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:09.689466801 +0000 UTC m=+93.525395265 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.194001 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fppws" event={"ID":"afd002b9-3309-4286-a7cb-29c2f4817feb","Type":"ContainerStarted","Data":"ef724ca692e5414c7b4685abb6398a5e67a246a6399842736231cbaf1b94432a"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.196680 4702 generic.go:334] "Generic (PLEG): container finished" podID="a456460b-47a3-48ef-a98b-4f67709d5939" containerID="561d8b2212d7e8374195bf05be87ee72ab6b57a15ab904bc5bd857b6048d980d" exitCode=0 Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.196783 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" event={"ID":"a456460b-47a3-48ef-a98b-4f67709d5939","Type":"ContainerDied","Data":"561d8b2212d7e8374195bf05be87ee72ab6b57a15ab904bc5bd857b6048d980d"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.196822 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" event={"ID":"a456460b-47a3-48ef-a98b-4f67709d5939","Type":"ContainerStarted","Data":"03b2338581b8411dabf00e25adfe2ea5bc9167179c5c6b638da9cd71a0a81386"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.199048 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6wcg"] Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.201665 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x4pq9" event={"ID":"c31210aa-a99c-4f88-a496-9f61835b4445","Type":"ContainerStarted","Data":"92a6bcf373355dd67a92f654c97f022396cc85ebc862027f7f1fa88b830388ed"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.218117 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" event={"ID":"0276c6fb-ba7a-459f-9610-34a03593669b","Type":"ContainerStarted","Data":"26637f1a5e04fce917256c54e533082bf43369dab619f695bbf0507ce2c01dac"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.229777 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-pscld"] Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.235773 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" event={"ID":"3de04148-0009-427b-8055-a1c5dadb8274","Type":"ContainerStarted","Data":"35d0d9f0f157e4d997633a9b26e71f857f64df6cb52d44799650ed805c82fc27"} Dec 03 11:05:09 crc kubenswrapper[4702]: W1203 11:05:09.235943 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a56343e_6342_4982_9ff1_8bae70d5771a.slice/crio-1125ca3acfaf2b220d43b42c7611043d76ee4c0f254ac157b1e317d5e08fedd9 WatchSource:0}: Error finding container 
1125ca3acfaf2b220d43b42c7611043d76ee4c0f254ac157b1e317d5e08fedd9: Status 404 returned error can't find the container with id 1125ca3acfaf2b220d43b42c7611043d76ee4c0f254ac157b1e317d5e08fedd9 Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.245826 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b8s4f" event={"ID":"db0547ad-edd5-4b70-af0f-9f606793e6a9","Type":"ContainerStarted","Data":"cea172ce63fbfa4948d9224bc24ad47471ea40a9c9614b4bfddde242cf4e911c"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.261151 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nmmt2" event={"ID":"fc476661-200c-4feb-8b45-fe203009356a","Type":"ContainerStarted","Data":"05dac3be1909f96752ccef90025b20d538d80b0e011a15b70367e2a8463769f2"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.261340 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nmmt2" event={"ID":"fc476661-200c-4feb-8b45-fe203009356a","Type":"ContainerStarted","Data":"677c65da17956af5a528e4c158485745d148f49a104fd894ce05babf7b01512b"} Dec 03 11:05:09 crc kubenswrapper[4702]: W1203 11:05:09.266254 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod042c7f5b_da64_4f42_a2b2_58d04b73c12a.slice/crio-e50717ddf588bd2baeed7c35e068f9c0b4d6e7262410e2ba54118c00a5597abe WatchSource:0}: Error finding container e50717ddf588bd2baeed7c35e068f9c0b4d6e7262410e2ba54118c00a5597abe: Status 404 returned error can't find the container with id e50717ddf588bd2baeed7c35e068f9c0b4d6e7262410e2ba54118c00a5597abe Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.266289 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rk4kf" event={"ID":"e85fa6d0-61c0-4d62-adeb-e2402e597d87","Type":"ContainerStarted","Data":"d33cc835f313db31d97a33803a17238a6645d5d5f10e73863c2afb394a1877ee"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.274379 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" event={"ID":"5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac","Type":"ContainerStarted","Data":"92a4c34d8167b72f696e7e0c2457a87c45e8b2b8bd8e011546f92f1f3489ec3b"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.276128 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b8s4f" podStartSLOduration=70.276114869 podStartE2EDuration="1m10.276114869s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:09.274421367 +0000 UTC m=+93.110349831" watchObservedRunningTime="2025-12-03 11:05:09.276114869 +0000 UTC m=+93.112043333" Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.279389 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dw786" event={"ID":"ebf7fe4a-d38b-4e14-b395-6b2de24c43d0","Type":"ContainerStarted","Data":"8dcc4d14150891a12b243222ebc2cec05314a2b8104a70e594ac7fdd5d461952"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.290330 4702 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:09 crc kubenswrapper[4702]: E1203 11:05:09.290500 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:09.790475496 +0000 UTC m=+93.626403960 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.309244 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.292725 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pp87r" event={"ID":"d5cab780-360c-48fa-9c88-36e15f37da47","Type":"ContainerStarted","Data":"9624cb4f04fc49b4395ea8a510d4e3d3a643a58f3d56d20595f6c54db675b037"} Dec 03 11:05:09 crc kubenswrapper[4702]: E1203 11:05:09.311805 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:09.811788198 +0000 UTC m=+93.647716652 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.306103 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-nmmt2" podStartSLOduration=70.306062558 podStartE2EDuration="1m10.306062558s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:09.302483127 +0000 UTC m=+93.138411601" watchObservedRunningTime="2025-12-03 11:05:09.306062558 +0000 UTC m=+93.141991042" Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.340088 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-gzc7n" event={"ID":"b8e73047-6376-4bd9-8ec8-5966f8786e5d","Type":"ContainerStarted","Data":"4b006412570ab1e3a3b4c3d92e5a2e12b074217d3aad6f3b8b1db3a7c7ec3e00"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.340162 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-gzc7n" event={"ID":"b8e73047-6376-4bd9-8ec8-5966f8786e5d","Type":"ContainerStarted","Data":"6f7e78273b849778c55736ae1f61c737668756b7dffc8eb8ea52b2758424d836"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.367182 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-gzc7n" podStartSLOduration=70.36716183 podStartE2EDuration="1m10.36716183s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:09.366319899 +0000 UTC m=+93.202248363" watchObservedRunningTime="2025-12-03 11:05:09.36716183 +0000 UTC m=+93.203090284" Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.370989 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" event={"ID":"d7be37fc-7374-46ae-a0e3-1cafab3430ec","Type":"ContainerStarted","Data":"4afb73329ce0afd0794963caf8e56847938fe1c7935991bc244f18e748665d5a"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.393214 4702 generic.go:334] "Generic (PLEG): container finished" podID="adc43b14-86cb-4ff5-b7fb-a9ba32cde631" containerID="711a5e7d4b614a3e48df8c4a32b06534c995fd317db949ab081773d3626af4e0" exitCode=0 Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.393322 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" event={"ID":"adc43b14-86cb-4ff5-b7fb-a9ba32cde631","Type":"ContainerDied","Data":"711a5e7d4b614a3e48df8c4a32b06534c995fd317db949ab081773d3626af4e0"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.393355 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" 
event={"ID":"adc43b14-86cb-4ff5-b7fb-a9ba32cde631","Type":"ContainerStarted","Data":"10068ad191cc786a53d0baa4af2d170547c35f9693d907a093af66fce2da6593"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.396698 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" event={"ID":"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1","Type":"ContainerStarted","Data":"7fbd20a467293a2fb302b138dbcecaaf3433a9d09dffe8ad04200aaaf07bfd86"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.410600 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.410813 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hndf6" event={"ID":"05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8","Type":"ContainerStarted","Data":"63104b8f3b3fdf59d8382f4b591c8ee932c380c031d096b5ce062393376195e8"} Dec 03 11:05:09 crc kubenswrapper[4702]: E1203 11:05:09.414797 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:09.914739316 +0000 UTC m=+93.750667780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.424127 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-x92bs" event={"ID":"ec4c6214-773a-4bcd-ae3e-02a4d74b791a","Type":"ContainerStarted","Data":"25322ca89e1f88def2ba7372ad65f9fa4345045ae693798c98ec590865b2e939"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.441294 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" podStartSLOduration=70.441261419 podStartE2EDuration="1m10.441261419s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:09.39549276 +0000 UTC m=+93.231421244" watchObservedRunningTime="2025-12-03 11:05:09.441261419 +0000 UTC m=+93.277189883" Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.442339 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-54zxt" event={"ID":"ee7cd19b-47c8-464d-9156-69d796be8866","Type":"ContainerStarted","Data":"0d83a9d7e34b3476297d01c751b5526e3bd94b67132321293fd4bdb4c7f8ff29"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.472164 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" 
event={"ID":"5bad766d-e524-4670-b353-56e92df2f744","Type":"ContainerStarted","Data":"d3538527ec1c0eff1cf367faea77ee3efdefc14633dab42c7a824a6b5e81c792"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.472848 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" event={"ID":"5bad766d-e524-4670-b353-56e92df2f744","Type":"ContainerStarted","Data":"42a0ef0202c96a6ae339ea252b41cc19180378ad202e48bfab5bf6e892bbff40"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.488418 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6tdv" event={"ID":"37b44be4-8fc9-4b74-9339-8bb658d866fc","Type":"ContainerStarted","Data":"ea850e4c9365c0e0a5885f3ae8fcc659b19c6604073e75adf45d577da415b07c"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.504861 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-r54j5" event={"ID":"28d79fc8-1fa6-4a05-aede-e9d0967dfb3b","Type":"ContainerStarted","Data":"d33095936c1c9913c8309fa607f7c02f20f8d3b92c69d2e61bea82040a530ac2"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.513058 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.513631 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" event={"ID":"3895be1f-db04-45b3-bd8c-cf2ab8c2aa43","Type":"ContainerStarted","Data":"596407f77e02d67341bc1bdba0bbdcd55d5f7ce34d51e51ea436882662af044b"} Dec 03 11:05:09 crc kubenswrapper[4702]: E1203 11:05:09.516290 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:10.016258701 +0000 UTC m=+93.852187265 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.529801 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" event={"ID":"a49f8d97-9fa5-44b6-bd39-e35d4d70b33c","Type":"ContainerStarted","Data":"5e255119d9def47deecef0e5c4ab0cae07e92a62abde61bb9f6f13d376a89f78"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.554560 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" podStartSLOduration=71.554537666 podStartE2EDuration="1m11.554537666s" podCreationTimestamp="2025-12-03 11:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:09.516189649 +0000 UTC m=+93.352118143" watchObservedRunningTime="2025-12-03 11:05:09.554537666 +0000 UTC m=+93.390466130" Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.555240 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6tdv" podStartSLOduration=70.555231052 podStartE2EDuration="1m10.555231052s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:09.551439452 +0000 UTC m=+93.387367926" watchObservedRunningTime="2025-12-03 11:05:09.555231052 +0000 UTC m=+93.391159526" Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.593949 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zthg4" event={"ID":"0bce3a1f-cd2e-41b9-b768-322f3ce72ed9","Type":"ContainerStarted","Data":"8a33cb29b6141890267bbf11029a24ece4a0ced40fd5203d72ab41585b1cff2b"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.600132 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" event={"ID":"c6edf33c-728d-482f-ad5c-ceb85dae3b75","Type":"ContainerStarted","Data":"96bdcca6f5f34b391eac072501822d22b21f28fe9c3e9ceec102327a350c5f80"} Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.600639 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.617597 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk" Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.628527 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zthg4" podStartSLOduration=70.62850366 podStartE2EDuration="1m10.62850366s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:09.617508257 +0000 UTC m=+93.453436721" watchObservedRunningTime="2025-12-03 11:05:09.62850366 +0000 UTC m=+93.464432124" Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.628689 4702 patch_prober.go:28] interesting pod/console-operator-58897d9998-nnrsp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.628817 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" podUID="c6edf33c-728d-482f-ad5c-ceb85dae3b75" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.642137 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:09 crc kubenswrapper[4702]: E1203 11:05:09.647310 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:10.147287559 +0000 UTC m=+93.983216023 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.660935 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.661267 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.747627 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:09 crc kubenswrapper[4702]: E1203 11:05:09.748136 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:10.24812486 +0000 UTC m=+94.084053324 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.808738 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" podStartSLOduration=71.808710403 podStartE2EDuration="1m11.808710403s" podCreationTimestamp="2025-12-03 11:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:09.778056468 +0000 UTC m=+93.613984932" watchObservedRunningTime="2025-12-03 11:05:09.808710403 +0000 UTC m=+93.644638867" Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.850139 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:09 crc kubenswrapper[4702]: E1203 11:05:09.851411 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:10.351387959 +0000 UTC m=+94.187316433 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:09 crc kubenswrapper[4702]: I1203 11:05:09.988680 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:09 crc kubenswrapper[4702]: E1203 11:05:09.989140 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:10.489127054 +0000 UTC m=+94.325055518 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.092994 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:10 crc kubenswrapper[4702]: E1203 11:05:10.093399 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:10.593377999 +0000 UTC m=+94.429306463 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.168300 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.177482 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 11:05:10 crc kubenswrapper[4702]: [-]has-synced failed: reason withheld Dec 03 11:05:10 crc kubenswrapper[4702]: [+]process-running ok Dec 03 11:05:10 crc kubenswrapper[4702]: healthz check failed Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.177749 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.194861 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:10 crc kubenswrapper[4702]: E1203 11:05:10.195267 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:10.695251048 +0000 UTC m=+94.531179502 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.295729 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:10 crc kubenswrapper[4702]: E1203 11:05:10.296150 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:10.796129799 +0000 UTC m=+94.632058263 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.397524 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:10 crc kubenswrapper[4702]: E1203 11:05:10.398111 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:10.89807053 +0000 UTC m=+94.733999004 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.500079 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:10 crc kubenswrapper[4702]: E1203 11:05:10.500373 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:11.000342053 +0000 UTC m=+94.836270507 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.500441 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:10 crc kubenswrapper[4702]: E1203 11:05:10.500810 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:11.00079702 +0000 UTC m=+94.836725484 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.604964 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:10 crc kubenswrapper[4702]: E1203 11:05:10.605815 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:11.105797983 +0000 UTC m=+94.941726447 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.678177 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-b8s4f" event={"ID":"db0547ad-edd5-4b70-af0f-9f606793e6a9","Type":"ContainerStarted","Data":"aa37ba99b96940e19ebfb0f8ee2d24d82c21c3d2659c6e15f8bf65acd23cf282"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.680651 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-r54j5" event={"ID":"28d79fc8-1fa6-4a05-aede-e9d0967dfb3b","Type":"ContainerStarted","Data":"3ec7533121fc22992aff0403fee1fce0b726be54ca249e23dc22cc133cc1301c"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.685018 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-zthg4" event={"ID":"0bce3a1f-cd2e-41b9-b768-322f3ce72ed9","Type":"ContainerStarted","Data":"e4922b568d6888cd4ec08ea3301e4be6f187be5231706e3cce8b7a9046cf308a"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.686740 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" event={"ID":"5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac","Type":"ContainerStarted","Data":"8f3dc2c05062b2a0158389a759a07a63299d91fc5551494ccf9902ede78aa436"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.687451 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.689009 4702 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ql9kq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get 
\"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.689044 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" podUID="5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.694923 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-r54j5" podStartSLOduration=71.694907423 podStartE2EDuration="1m11.694907423s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:10.692903199 +0000 UTC m=+94.528831663" watchObservedRunningTime="2025-12-03 11:05:10.694907423 +0000 UTC m=+94.530835887" Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.697395 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pscld" event={"ID":"042c7f5b-da64-4f42-a2b2-58d04b73c12a","Type":"ContainerStarted","Data":"e50717ddf588bd2baeed7c35e068f9c0b4d6e7262410e2ba54118c00a5597abe"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.707514 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:10 crc kubenswrapper[4702]: E1203 11:05:10.708024 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:11.208002663 +0000 UTC m=+95.043931187 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.718262 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ccdtg" event={"ID":"761a509d-5cb8-4506-901c-614a7d633d39","Type":"ContainerStarted","Data":"f6c837c7d8947d2404a52599d5b716ddf7436fae795ff90907d36691abed95d2"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.720687 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pg9xx" event={"ID":"af4eb68c-2986-486a-9b9a-4905bd264322","Type":"ContainerStarted","Data":"70e084c53972e9e457cbc3d1c73e2611e9ca50bc6f527afea1f56fcafa1dcca9"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.720713 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pg9xx" event={"ID":"af4eb68c-2986-486a-9b9a-4905bd264322","Type":"ContainerStarted","Data":"7746d9d20ea8c06102b1b867e7f5ad704af9961a6ba7c1af37c99e6065a2ccfb"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.723554 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" podStartSLOduration=71.723539403 podStartE2EDuration="1m11.723539403s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:10.721774689 +0000 UTC m=+94.557703153" watchObservedRunningTime="2025-12-03 11:05:10.723539403 +0000 UTC m=+94.559467867" Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.728836 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6wcg" event={"ID":"2535e580-a150-4c92-912f-142f212faebd","Type":"ContainerStarted","Data":"2f88c320c628832afc70e53d350f9ea4b8072506d6bf789d98d4bdbd5e548db1"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.728870 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6wcg" event={"ID":"2535e580-a150-4c92-912f-142f212faebd","Type":"ContainerStarted","Data":"64ea77e07f7785e7ac1bce6fbadd645ad847161a2818cedab6c63449afad2cd5"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.732155 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" event={"ID":"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1","Type":"ContainerStarted","Data":"1709b7faa112102cc0f55f767d64a49310f085082b2191380a8df5c39b01996a"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.732869 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.733941 4702 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-njlsm container/controller-manager 
namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.733974 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" podUID="77ca3da7-003a-49b5-9a2d-21ee53e6b5f1" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.734103 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" event={"ID":"0276c6fb-ba7a-459f-9610-34a03593669b","Type":"ContainerStarted","Data":"9a3f99c658871b2d5b7276c4079ae2bd142c2137caf9aba99bf227ed580bf8ea"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.734553 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.735171 4702 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-xxj8s container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" start-of-body= Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.735190 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" podUID="0276c6fb-ba7a-459f-9610-34a03593669b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.735631 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pp87r" event={"ID":"d5cab780-360c-48fa-9c88-36e15f37da47","Type":"ContainerStarted","Data":"37c3c7bca255a1423e527ba075b3c59a68527baebe544db3d8655d8a47e3d600"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.737566 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" event={"ID":"a456460b-47a3-48ef-a98b-4f67709d5939","Type":"ContainerStarted","Data":"56666e8a78a82b8a80949b132ba3edf357fbbc19b68d6b6c216a6290e3608589"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.738032 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.739358 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" event={"ID":"3895be1f-db04-45b3-bd8c-cf2ab8c2aa43","Type":"ContainerStarted","Data":"b13e93cc06e972149bc9f92319fb6d6d474c7a18829e3e1ce0afbfd02122c0c2"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.740038 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.741047 4702 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fvzsf container/catalog-operator 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.741077 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" podUID="3895be1f-db04-45b3-bd8c-cf2ab8c2aa43" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.741498 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bpfvn" event={"ID":"7e9a6a2d-0a6d-42a1-a44b-a82ee572b995","Type":"ContainerStarted","Data":"c038e3e270c06a9011fca99fc989e6019eb5b897d3de8539e42281783c8cbfbc"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.741529 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bpfvn" event={"ID":"7e9a6a2d-0a6d-42a1-a44b-a82ee572b995","Type":"ContainerStarted","Data":"289b3e0f9f9b64e313ab4221ce06af63d1e2476e1c075b47f9fd59f270180dbf"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.746540 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f8vb8" event={"ID":"47af44a8-4e95-4edf-81b4-efbfdab9447a","Type":"ContainerStarted","Data":"66a706c2abb3d8ddf31ca869fe9c0adff5fcf79ef70b2488ee13a1adc11f7456"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.772858 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" event={"ID":"c6edf33c-728d-482f-ad5c-ceb85dae3b75","Type":"ContainerStarted","Data":"a7bc8e70a7c69f53a453f3a91be933b6d9c561768bed1ec433370f217d06d90f"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.773166 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-ccdtg" podStartSLOduration=72.773143264 podStartE2EDuration="1m12.773143264s" podCreationTimestamp="2025-12-03 11:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:10.770544378 +0000 UTC m=+94.606472842" watchObservedRunningTime="2025-12-03 11:05:10.773143264 +0000 UTC m=+94.609071728" Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.782200 4702 patch_prober.go:28] interesting pod/console-operator-58897d9998-nnrsp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.782474 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" podUID="c6edf33c-728d-482f-ad5c-ceb85dae3b75" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.792879 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pg9xx" podStartSLOduration=72.792856107 podStartE2EDuration="1m12.792856107s" podCreationTimestamp="2025-12-03 11:03:58 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:10.79129374 +0000 UTC m=+94.627222214" watchObservedRunningTime="2025-12-03 11:05:10.792856107 +0000 UTC m=+94.628784571" Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.797152 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m6tdv" event={"ID":"37b44be4-8fc9-4b74-9339-8bb658d866fc","Type":"ContainerStarted","Data":"772521470a3971158b6066c79b7506cc25526ab4a2bc747a7046c50253502d94"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.805400 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bnkx7" event={"ID":"36516401-b0b6-42c4-b444-84ee4e336839","Type":"ContainerStarted","Data":"c69851ed40ad6a11592f5d9fb7f293d97135089eba1e859e0be74fcfddc5d604"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.808338 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:10 crc kubenswrapper[4702]: E1203 11:05:10.810502 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:11.310481844 +0000 UTC m=+95.146410308 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.818647 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n" event={"ID":"77a69d25-2384-466b-b284-e36e979597b4","Type":"ContainerStarted","Data":"b1abdb7d1dde37410aaaab35ec091adbaa25c05e540afd9716f5f7c13ddcd219"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.818696 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n" event={"ID":"77a69d25-2384-466b-b284-e36e979597b4","Type":"ContainerStarted","Data":"9f313546a49233d1908e3f6bfa418f5cb8d5cf67b481edfec850145c77e9bd9f"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.840798 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-pp87r" podStartSLOduration=71.840780536 podStartE2EDuration="1m11.840780536s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:10.840027758 +0000 UTC m=+94.675956232" watchObservedRunningTime="2025-12-03 11:05:10.840780536 +0000 UTC m=+94.676709000" Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 
11:05:10.875293 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-x92bs" event={"ID":"ec4c6214-773a-4bcd-ae3e-02a4d74b791a","Type":"ContainerStarted","Data":"c3726e9362366f614b112705c374e7989549546621612dbd33a3f27315809dcc"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.883162 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-s6wcg" podStartSLOduration=71.88314104 podStartE2EDuration="1m11.88314104s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:10.880660269 +0000 UTC m=+94.716588743" watchObservedRunningTime="2025-12-03 11:05:10.88314104 +0000 UTC m=+94.719069504" Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.892927 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rk4kf" event={"ID":"e85fa6d0-61c0-4d62-adeb-e2402e597d87","Type":"ContainerStarted","Data":"bfea375719cac405f9987ad4a3206ad06bab3caaf9425ec81115c11d2a6cc107"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.895466 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rh69s" event={"ID":"1a56343e-6342-4982-9ff1-8bae70d5771a","Type":"ContainerStarted","Data":"1125ca3acfaf2b220d43b42c7611043d76ee4c0f254ac157b1e317d5e08fedd9"} Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.918770 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.919797 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:10 crc kubenswrapper[4702]: E1203 11:05:10.923106 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:11.423089206 +0000 UTC m=+95.259017730 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:10 crc kubenswrapper[4702]: I1203 11:05:10.969773 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" podStartSLOduration=71.969739938 podStartE2EDuration="1m11.969739938s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:10.953040175 +0000 UTC m=+94.788968639" watchObservedRunningTime="2025-12-03 11:05:10.969739938 +0000 UTC m=+94.805668402" Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.026694 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" podStartSLOduration=72.026666597 podStartE2EDuration="1m12.026666597s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:11.01448706 +0000 UTC m=+94.850415524" watchObservedRunningTime="2025-12-03 11:05:11.026666597 +0000 UTC m=+94.862595061" Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.035533 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:11 crc kubenswrapper[4702]: E1203 11:05:11.037012 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:11.536979986 +0000 UTC m=+95.372908540 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.080375 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" podStartSLOduration=72.080358897 podStartE2EDuration="1m12.080358897s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:11.080009105 +0000 UTC m=+94.915937569" watchObservedRunningTime="2025-12-03 11:05:11.080358897 +0000 UTC m=+94.916287361" Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.124541 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-bpfvn" podStartSLOduration=9.124505607 podStartE2EDuration="9.124505607s" podCreationTimestamp="2025-12-03 11:05:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:11.124251298 +0000 UTC m=+94.960179762" watchObservedRunningTime="2025-12-03 11:05:11.124505607 +0000 UTC m=+94.960434071" Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.138419 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:11 crc kubenswrapper[4702]: E1203 11:05:11.138880 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:11.638863674 +0000 UTC m=+95.474792138 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.228094 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 11:05:11 crc kubenswrapper[4702]: [-]has-synced failed: reason withheld Dec 03 11:05:11 crc kubenswrapper[4702]: [+]process-running ok Dec 03 11:05:11 crc kubenswrapper[4702]: healthz check failed Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.228174 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.245668 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:11 crc kubenswrapper[4702]: E1203 11:05:11.246259 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:11.746067218 +0000 UTC m=+95.581995682 (durationBeforeRetry 500ms). 
Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.258782 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" podStartSLOduration=73.258740593 podStartE2EDuration="1m13.258740593s" podCreationTimestamp="2025-12-03 11:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:11.210169571 +0000 UTC m=+95.046098035" watchObservedRunningTime="2025-12-03 11:05:11.258740593 +0000 UTC m=+95.094669057"
Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.327252 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-x92bs" podStartSLOduration=72.327227576 podStartE2EDuration="1m12.327227576s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:11.260700035 +0000 UTC m=+95.096628519" watchObservedRunningTime="2025-12-03 11:05:11.327227576 +0000 UTC m=+95.163156040"
Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.349737 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr"
Dec 03 11:05:11 crc kubenswrapper[4702]: E1203 11:05:11.350153 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:11.850138657 +0000 UTC m=+95.686067131 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.365552 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n" podStartSLOduration=73.365532652 podStartE2EDuration="1m13.365532652s" podCreationTimestamp="2025-12-03 11:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:11.364287236 +0000 UTC m=+95.200215700" watchObservedRunningTime="2025-12-03 11:05:11.365532652 +0000 UTC m=+95.201461116"
Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.456535 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 11:05:11 crc kubenswrapper[4702]: E1203 11:05:11.456827 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:11.956730799 +0000 UTC m=+95.792659263 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.457078 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr"
Dec 03 11:05:11 crc kubenswrapper[4702]: E1203 11:05:11.457449 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:11.957438925 +0000 UTC m=+95.793367389 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.558389 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 11:05:11 crc kubenswrapper[4702]: E1203 11:05:11.558608 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:12.058570786 +0000 UTC m=+95.894499260 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.558971 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr"
Dec 03 11:05:11 crc kubenswrapper[4702]: E1203 11:05:11.559363 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:12.059352214 +0000 UTC m=+95.895280688 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.660317 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 11:05:11 crc kubenswrapper[4702]: E1203 11:05:11.661198 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:12.161176411 +0000 UTC m=+95.997104875 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.762077 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr"
Dec 03 11:05:11 crc kubenswrapper[4702]: E1203 11:05:11.762380 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:12.262366354 +0000 UTC m=+96.098294818 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.883520 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 11:05:11 crc kubenswrapper[4702]: E1203 11:05:11.884158 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:12.384129262 +0000 UTC m=+96.220057726 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.919707 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x4pq9" event={"ID":"c31210aa-a99c-4f88-a496-9f61835b4445","Type":"ContainerStarted","Data":"967ca6c7898d1edf68c5db96f9afda0f70d8b6ce374fbe8cd7625c714c6a8061"}
Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.919826 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x4pq9" event={"ID":"c31210aa-a99c-4f88-a496-9f61835b4445","Type":"ContainerStarted","Data":"8de409b596e9af8e12e035873e5e93251f7a208dcd148265eaa0c81d8fa1dbb5"}
Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.932431 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bnkx7" event={"ID":"36516401-b0b6-42c4-b444-84ee4e336839","Type":"ContainerStarted","Data":"55213341b63cc94ebd24c43c0c71453265e757eb5b631fd8a0849e3967f71798"}
Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.932482 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bnkx7" event={"ID":"36516401-b0b6-42c4-b444-84ee4e336839","Type":"ContainerStarted","Data":"7d001593614fff2d5aaaf72d1809579508924ff465403916f6208908338f3853"}
Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.941896 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-54zxt" event={"ID":"ee7cd19b-47c8-464d-9156-69d796be8866","Type":"ContainerStarted","Data":"4381f60c3c68bdd68f2d106a97347967b1ab15646ecb4cd6c2101fa8d159592e"}
Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.953347 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-x92bs" event={"ID":"ec4c6214-773a-4bcd-ae3e-02a4d74b791a","Type":"ContainerStarted","Data":"76c6ed03193acfcc914baa06f51b554c9869c6ce2d5a664fa6ce469765e6bc1d"}
Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.960845 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x4pq9" podStartSLOduration=72.960820636 podStartE2EDuration="1m12.960820636s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:11.957796215 +0000 UTC m=+95.793724689" watchObservedRunningTime="2025-12-03 11:05:11.960820636 +0000 UTC m=+95.796749121"
Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.985830 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr"
Dec 03 11:05:11 crc kubenswrapper[4702]: E1203 11:05:11.986330 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:12.486309542 +0000 UTC m=+96.322238076 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 11:05:11 crc kubenswrapper[4702]: I1203 11:05:11.987162 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rk4kf" event={"ID":"e85fa6d0-61c0-4d62-adeb-e2402e597d87","Type":"ContainerStarted","Data":"745aa1257fa8f7ea302629570071a5bc934ef9f604b4b438f7a397bbad52956c"}
Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.010260 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-54zxt" podStartSLOduration=73.010226609 podStartE2EDuration="1m13.010226609s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:12.007685696 +0000 UTC m=+95.843614160" watchObservedRunningTime="2025-12-03 11:05:12.010226609 +0000 UTC m=+95.846155073"
Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.015981 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pscld" event={"ID":"042c7f5b-da64-4f42-a2b2-58d04b73c12a","Type":"ContainerStarted","Data":"7d68b0986741379ab8deb19834da932ca36898d987fd53602fe7466e084f46aa"}
Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.043099 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rh69s" event={"ID":"1a56343e-6342-4982-9ff1-8bae70d5771a","Type":"ContainerStarted","Data":"5aff0471df96d9fc04f8d6f0aac88ef1667325ac26d3690a166b71175e5da7f0"}
event={"ID":"1a56343e-6342-4982-9ff1-8bae70d5771a","Type":"ContainerStarted","Data":"5aff0471df96d9fc04f8d6f0aac88ef1667325ac26d3690a166b71175e5da7f0"} Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.043167 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rh69s" event={"ID":"1a56343e-6342-4982-9ff1-8bae70d5771a","Type":"ContainerStarted","Data":"7ecdfb6d3e988e406fe6306e95f6a1e82b5300a207bbb5012a8390497bb7d77b"} Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.050253 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-bnkx7" podStartSLOduration=73.050215527 podStartE2EDuration="1m13.050215527s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:12.048431431 +0000 UTC m=+95.884359895" watchObservedRunningTime="2025-12-03 11:05:12.050215527 +0000 UTC m=+95.886143991" Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.061681 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dw786" event={"ID":"ebf7fe4a-d38b-4e14-b395-6b2de24c43d0","Type":"ContainerStarted","Data":"ba708ec1362046b3c922a549dc290f14b37f1145a3b5475af60379d67cc31646"} Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.079529 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" event={"ID":"3de04148-0009-427b-8055-a1c5dadb8274","Type":"ContainerStarted","Data":"1fa17a686806b34438984f5b5ca80dc2a3bcdaf1f0c62c12a6ee5d914c661e83"} Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.079592 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.089651 4702 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6kdsp container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body= Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.089718 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" podUID="3de04148-0009-427b-8055-a1c5dadb8274" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.090834 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:12 crc kubenswrapper[4702]: E1203 11:05:12.091915 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:12.591900506 +0000 UTC m=+96.427828970 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.102269 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-rh69s" podStartSLOduration=73.102246996 podStartE2EDuration="1m13.102246996s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:12.102229565 +0000 UTC m=+95.938158039" watchObservedRunningTime="2025-12-03 11:05:12.102246996 +0000 UTC m=+95.938175460" Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.104516 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f8vb8" event={"ID":"47af44a8-4e95-4edf-81b4-efbfdab9447a","Type":"ContainerStarted","Data":"aa911b88d358148d47e7d427d123c10e29ccfc8e666d9ad93759150ac3604031"} Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.104628 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-f8vb8" event={"ID":"47af44a8-4e95-4edf-81b4-efbfdab9447a","Type":"ContainerStarted","Data":"61bfbe765b33bf70896a48a3361cc515eec4484123e372d77dd53dc0a1a1e2a3"} Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.105869 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-f8vb8" Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.138707 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" event={"ID":"adc43b14-86cb-4ff5-b7fb-a9ba32cde631","Type":"ContainerStarted","Data":"57ebfde44a753e575373feff9d847707a871dff6eaba2404edb2afb2868a749e"} Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.139066 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" event={"ID":"adc43b14-86cb-4ff5-b7fb-a9ba32cde631","Type":"ContainerStarted","Data":"afec8a49166b0d4ce163609679cea58b3ffb867bcf73b1153d88cf7642eef228"} Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.152833 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-rk4kf" podStartSLOduration=73.152813202 podStartE2EDuration="1m13.152813202s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:12.149445968 +0000 UTC m=+95.985374472" watchObservedRunningTime="2025-12-03 11:05:12.152813202 +0000 UTC m=+95.988741666" Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.161161 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hndf6" event={"ID":"05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8","Type":"ContainerStarted","Data":"b19709c46ee9bdb2df0ae0064f1db24583ec1036f7312805bc2da46e7287368d"} Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.162408 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/downloads-7954f5f757-hndf6" Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.183098 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.183174 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.190912 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 11:05:12 crc kubenswrapper[4702]: [-]has-synced failed: reason withheld Dec 03 11:05:12 crc kubenswrapper[4702]: [+]process-running ok Dec 03 11:05:12 crc kubenswrapper[4702]: healthz check failed Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.190988 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.196826 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:12 crc kubenswrapper[4702]: E1203 11:05:12.199204 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:12.699188233 +0000 UTC m=+96.535116697 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.206867 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" event={"ID":"a49f8d97-9fa5-44b6-bd39-e35d4d70b33c","Type":"ContainerStarted","Data":"55ba24bdc5dfda6338044e5b6537e2daf909640038933dcfe6fc3cda39daae97"} Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.206960 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" event={"ID":"a49f8d97-9fa5-44b6-bd39-e35d4d70b33c","Type":"ContainerStarted","Data":"1626abe3ed15a198a4e566a115de6e6d881278876d92ba929f1591bfe5f92455"} Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.207797 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.208266 4702 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ql9kq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" start-of-body= Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.208318 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" podUID="5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.41:8080/healthz\": dial tcp 10.217.0.41:8080: connect: connection refused" Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.233176 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.233349 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.250805 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" podStartSLOduration=73.250789797 podStartE2EDuration="1m13.250789797s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:12.25034216 +0000 UTC m=+96.086270624" watchObservedRunningTime="2025-12-03 11:05:12.250789797 +0000 UTC m=+96.086718261" Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.251709 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" podStartSLOduration=74.25170329 podStartE2EDuration="1m14.25170329s" podCreationTimestamp="2025-12-03 11:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-12-03 11:05:12.205168783 +0000 UTC m=+96.041097247" watchObservedRunningTime="2025-12-03 11:05:12.25170329 +0000 UTC m=+96.087631754" Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.265688 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.298838 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:12 crc kubenswrapper[4702]: E1203 11:05:12.300954 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:12.800929077 +0000 UTC m=+96.636857541 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.321909 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dw786" podStartSLOduration=73.321885896 podStartE2EDuration="1m13.321885896s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:12.320522276 +0000 UTC m=+96.156450740" watchObservedRunningTime="2025-12-03 11:05:12.321885896 +0000 UTC m=+96.157814360" Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.466573 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:12 crc kubenswrapper[4702]: E1203 11:05:12.467086 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:12.967072323 +0000 UTC m=+96.803000787 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.498755 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-f8vb8" podStartSLOduration=10.498733515 podStartE2EDuration="10.498733515s" podCreationTimestamp="2025-12-03 11:05:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:12.389060241 +0000 UTC m=+96.224988695" watchObservedRunningTime="2025-12-03 11:05:12.498733515 +0000 UTC m=+96.334661979" Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.602157 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:12 crc kubenswrapper[4702]: E1203 11:05:12.602654 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:13.102633378 +0000 UTC m=+96.938561842 (durationBeforeRetry 500ms). 
Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.610329 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-hndf6" podStartSLOduration=74.610306479 podStartE2EDuration="1m14.610306479s" podCreationTimestamp="2025-12-03 11:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:12.499249964 +0000 UTC m=+96.335178428" watchObservedRunningTime="2025-12-03 11:05:12.610306479 +0000 UTC m=+96.446234943"
Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.643392 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" podStartSLOduration=73.643371403 podStartE2EDuration="1m13.643371403s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:12.614540285 +0000 UTC m=+96.450468749" watchObservedRunningTime="2025-12-03 11:05:12.643371403 +0000 UTC m=+96.479299867"
Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.662596 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-nnrsp"
Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.704553 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr"
Dec 03 11:05:12 crc kubenswrapper[4702]: E1203 11:05:12.705085 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:13.205071417 +0000 UTC m=+97.040999881 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.805789 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 11:05:12 crc kubenswrapper[4702]: E1203 11:05:12.805922 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:13.305892486 +0000 UTC m=+97.141820950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.806223 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr"
Dec 03 11:05:12 crc kubenswrapper[4702]: E1203 11:05:12.806591 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:13.306583191 +0000 UTC m=+97.142511655 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.890993 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jsndz"]
Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.892638 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jsndz"
Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.894932 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.907376 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 11:05:12 crc kubenswrapper[4702]: E1203 11:05:12.907574 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:13.407545126 +0000 UTC m=+97.243473590 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.907657 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbhmz\" (UniqueName: \"kubernetes.io/projected/aefd671c-4583-4057-aebd-1c8c7931771f-kube-api-access-hbhmz\") pod \"community-operators-jsndz\" (UID: \"aefd671c-4583-4057-aebd-1c8c7931771f\") " pod="openshift-marketplace/community-operators-jsndz"
Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.907716 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr"
Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.907825 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aefd671c-4583-4057-aebd-1c8c7931771f-catalog-content\") pod \"community-operators-jsndz\" (UID: \"aefd671c-4583-4057-aebd-1c8c7931771f\") " pod="openshift-marketplace/community-operators-jsndz"
Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.907864 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aefd671c-4583-4057-aebd-1c8c7931771f-utilities\") pod \"community-operators-jsndz\" (UID: \"aefd671c-4583-4057-aebd-1c8c7931771f\") " pod="openshift-marketplace/community-operators-jsndz"
Dec 03 11:05:12 crc kubenswrapper[4702]: E1203 11:05:12.908150 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:13.408140188 +0000 UTC m=+97.244068662 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 11:05:12 crc kubenswrapper[4702]: I1203 11:05:12.913424 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jsndz"]
Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.009284 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 11:05:13 crc kubenswrapper[4702]: E1203 11:05:13.009482 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:13.509452486 +0000 UTC m=+97.345380950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.009555 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbhmz\" (UniqueName: \"kubernetes.io/projected/aefd671c-4583-4057-aebd-1c8c7931771f-kube-api-access-hbhmz\") pod \"community-operators-jsndz\" (UID: \"aefd671c-4583-4057-aebd-1c8c7931771f\") " pod="openshift-marketplace/community-operators-jsndz"
Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.009601 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr"
Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.009670 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aefd671c-4583-4057-aebd-1c8c7931771f-catalog-content\") pod \"community-operators-jsndz\" (UID: \"aefd671c-4583-4057-aebd-1c8c7931771f\") " pod="openshift-marketplace/community-operators-jsndz"
Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.009704 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aefd671c-4583-4057-aebd-1c8c7931771f-utilities\") pod \"community-operators-jsndz\" (UID: \"aefd671c-4583-4057-aebd-1c8c7931771f\") " pod="openshift-marketplace/community-operators-jsndz"
Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.010215 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aefd671c-4583-4057-aebd-1c8c7931771f-utilities\") pod \"community-operators-jsndz\" (UID: \"aefd671c-4583-4057-aebd-1c8c7931771f\") " pod="openshift-marketplace/community-operators-jsndz"
Dec 03 11:05:13 crc kubenswrapper[4702]: E1203 11:05:13.010316 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:13.510294207 +0000 UTC m=+97.346222731 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.010957 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aefd671c-4583-4057-aebd-1c8c7931771f-catalog-content\") pod \"community-operators-jsndz\" (UID: \"aefd671c-4583-4057-aebd-1c8c7931771f\") " pod="openshift-marketplace/community-operators-jsndz"
Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.073254 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbhmz\" (UniqueName: \"kubernetes.io/projected/aefd671c-4583-4057-aebd-1c8c7931771f-kube-api-access-hbhmz\") pod \"community-operators-jsndz\" (UID: \"aefd671c-4583-4057-aebd-1c8c7931771f\") " pod="openshift-marketplace/community-operators-jsndz"
Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.110718 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 11:05:13 crc kubenswrapper[4702]: E1203 11:05:13.111335 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:13.611296693 +0000 UTC m=+97.447225157 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.123345 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wv7dh"]
Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.124712 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wv7dh"
Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.129451 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.175947 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 03 11:05:13 crc kubenswrapper[4702]: [-]has-synced failed: reason withheld
Dec 03 11:05:13 crc kubenswrapper[4702]: [+]process-running ok
Dec 03 11:05:13 crc kubenswrapper[4702]: healthz check failed
Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.176022 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.205501 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wv7dh"]
Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.213961 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr"
Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.214025 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/085dd40d-8d1f-40ce-903b-fbed55010a29-utilities\") pod \"certified-operators-wv7dh\" (UID: \"085dd40d-8d1f-40ce-903b-fbed55010a29\") " pod="openshift-marketplace/certified-operators-wv7dh"
Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.214088 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g8s5\" (UniqueName: \"kubernetes.io/projected/085dd40d-8d1f-40ce-903b-fbed55010a29-kube-api-access-9g8s5\") pod \"certified-operators-wv7dh\" (UID: \"085dd40d-8d1f-40ce-903b-fbed55010a29\") " pod="openshift-marketplace/certified-operators-wv7dh"
Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.214120 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName:
\"kubernetes.io/empty-dir/085dd40d-8d1f-40ce-903b-fbed55010a29-catalog-content\") pod \"certified-operators-wv7dh\" (UID: \"085dd40d-8d1f-40ce-903b-fbed55010a29\") " pod="openshift-marketplace/certified-operators-wv7dh" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.214270 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jsndz" Dec 03 11:05:13 crc kubenswrapper[4702]: E1203 11:05:13.215000 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:13.714984628 +0000 UTC m=+97.550913092 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.245720 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pscld" event={"ID":"042c7f5b-da64-4f42-a2b2-58d04b73c12a","Type":"ContainerStarted","Data":"bf05d3103458340d9ace8aba15fb747ff198897131e9f826b0af71740fcf4612"} Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.249280 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.249353 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.298538 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.316670 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.317373 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/085dd40d-8d1f-40ce-903b-fbed55010a29-catalog-content\") pod \"certified-operators-wv7dh\" (UID: \"085dd40d-8d1f-40ce-903b-fbed55010a29\") " pod="openshift-marketplace/certified-operators-wv7dh" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.318101 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/085dd40d-8d1f-40ce-903b-fbed55010a29-utilities\") pod \"certified-operators-wv7dh\" (UID: 
\"085dd40d-8d1f-40ce-903b-fbed55010a29\") " pod="openshift-marketplace/certified-operators-wv7dh" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.318341 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g8s5\" (UniqueName: \"kubernetes.io/projected/085dd40d-8d1f-40ce-903b-fbed55010a29-kube-api-access-9g8s5\") pod \"certified-operators-wv7dh\" (UID: \"085dd40d-8d1f-40ce-903b-fbed55010a29\") " pod="openshift-marketplace/certified-operators-wv7dh" Dec 03 11:05:13 crc kubenswrapper[4702]: E1203 11:05:13.318699 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:13.818671483 +0000 UTC m=+97.654599947 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.320190 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/085dd40d-8d1f-40ce-903b-fbed55010a29-catalog-content\") pod \"certified-operators-wv7dh\" (UID: \"085dd40d-8d1f-40ce-903b-fbed55010a29\") " pod="openshift-marketplace/certified-operators-wv7dh" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.322235 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/085dd40d-8d1f-40ce-903b-fbed55010a29-utilities\") pod \"certified-operators-wv7dh\" (UID: \"085dd40d-8d1f-40ce-903b-fbed55010a29\") " pod="openshift-marketplace/certified-operators-wv7dh" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.345668 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mc4kd"] Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.346697 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mc4kd" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.414032 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g8s5\" (UniqueName: \"kubernetes.io/projected/085dd40d-8d1f-40ce-903b-fbed55010a29-kube-api-access-9g8s5\") pod \"certified-operators-wv7dh\" (UID: \"085dd40d-8d1f-40ce-903b-fbed55010a29\") " pod="openshift-marketplace/certified-operators-wv7dh" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.422509 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68547a12-2568-4a7a-a3e2-fde07134e6ee-catalog-content\") pod \"community-operators-mc4kd\" (UID: \"68547a12-2568-4a7a-a3e2-fde07134e6ee\") " pod="openshift-marketplace/community-operators-mc4kd" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.422564 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68547a12-2568-4a7a-a3e2-fde07134e6ee-utilities\") pod \"community-operators-mc4kd\" (UID: \"68547a12-2568-4a7a-a3e2-fde07134e6ee\") " pod="openshift-marketplace/community-operators-mc4kd" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.422614 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.422685 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2swn\" (UniqueName: \"kubernetes.io/projected/68547a12-2568-4a7a-a3e2-fde07134e6ee-kube-api-access-m2swn\") pod \"community-operators-mc4kd\" (UID: \"68547a12-2568-4a7a-a3e2-fde07134e6ee\") " pod="openshift-marketplace/community-operators-mc4kd" Dec 03 11:05:13 crc kubenswrapper[4702]: E1203 11:05:13.423111 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:13.923094505 +0000 UTC m=+97.759022969 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.458984 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mc4kd"] Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.474127 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wv7dh" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.524392 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.524695 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2swn\" (UniqueName: \"kubernetes.io/projected/68547a12-2568-4a7a-a3e2-fde07134e6ee-kube-api-access-m2swn\") pod \"community-operators-mc4kd\" (UID: \"68547a12-2568-4a7a-a3e2-fde07134e6ee\") " pod="openshift-marketplace/community-operators-mc4kd" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.524773 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68547a12-2568-4a7a-a3e2-fde07134e6ee-catalog-content\") pod \"community-operators-mc4kd\" (UID: \"68547a12-2568-4a7a-a3e2-fde07134e6ee\") " pod="openshift-marketplace/community-operators-mc4kd" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.524811 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68547a12-2568-4a7a-a3e2-fde07134e6ee-utilities\") pod \"community-operators-mc4kd\" (UID: \"68547a12-2568-4a7a-a3e2-fde07134e6ee\") " pod="openshift-marketplace/community-operators-mc4kd" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.525367 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68547a12-2568-4a7a-a3e2-fde07134e6ee-utilities\") pod \"community-operators-mc4kd\" (UID: \"68547a12-2568-4a7a-a3e2-fde07134e6ee\") " pod="openshift-marketplace/community-operators-mc4kd" Dec 03 11:05:13 crc kubenswrapper[4702]: E1203 11:05:13.525470 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:14.025449771 +0000 UTC m=+97.861378235 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.526121 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68547a12-2568-4a7a-a3e2-fde07134e6ee-catalog-content\") pod \"community-operators-mc4kd\" (UID: \"68547a12-2568-4a7a-a3e2-fde07134e6ee\") " pod="openshift-marketplace/community-operators-mc4kd" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.558405 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rvlz6"] Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.559428 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rvlz6" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.589390 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2swn\" (UniqueName: \"kubernetes.io/projected/68547a12-2568-4a7a-a3e2-fde07134e6ee-kube-api-access-m2swn\") pod \"community-operators-mc4kd\" (UID: \"68547a12-2568-4a7a-a3e2-fde07134e6ee\") " pod="openshift-marketplace/community-operators-mc4kd" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.604649 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rvlz6"] Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.626711 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0447b485-fa0f-470b-abb6-da0f406d0f7f-utilities\") pod \"certified-operators-rvlz6\" (UID: \"0447b485-fa0f-470b-abb6-da0f406d0f7f\") " pod="openshift-marketplace/certified-operators-rvlz6" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.626791 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.626822 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0447b485-fa0f-470b-abb6-da0f406d0f7f-catalog-content\") pod \"certified-operators-rvlz6\" (UID: \"0447b485-fa0f-470b-abb6-da0f406d0f7f\") " pod="openshift-marketplace/certified-operators-rvlz6" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.626867 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvxt9\" (UniqueName: \"kubernetes.io/projected/0447b485-fa0f-470b-abb6-da0f406d0f7f-kube-api-access-fvxt9\") pod \"certified-operators-rvlz6\" (UID: \"0447b485-fa0f-470b-abb6-da0f406d0f7f\") " pod="openshift-marketplace/certified-operators-rvlz6" Dec 03 11:05:13 crc kubenswrapper[4702]: E1203 11:05:13.627320 4702 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:14.127306188 +0000 UTC m=+97.963234652 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.704727 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mc4kd" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.739418 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.739730 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0447b485-fa0f-470b-abb6-da0f406d0f7f-utilities\") pod \"certified-operators-rvlz6\" (UID: \"0447b485-fa0f-470b-abb6-da0f406d0f7f\") " pod="openshift-marketplace/certified-operators-rvlz6" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.739779 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0447b485-fa0f-470b-abb6-da0f406d0f7f-catalog-content\") pod \"certified-operators-rvlz6\" (UID: \"0447b485-fa0f-470b-abb6-da0f406d0f7f\") " pod="openshift-marketplace/certified-operators-rvlz6" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.739830 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvxt9\" (UniqueName: \"kubernetes.io/projected/0447b485-fa0f-470b-abb6-da0f406d0f7f-kube-api-access-fvxt9\") pod \"certified-operators-rvlz6\" (UID: \"0447b485-fa0f-470b-abb6-da0f406d0f7f\") " pod="openshift-marketplace/certified-operators-rvlz6" Dec 03 11:05:13 crc kubenswrapper[4702]: E1203 11:05:13.740221 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:14.240201451 +0000 UTC m=+98.076129915 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.740684 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0447b485-fa0f-470b-abb6-da0f406d0f7f-utilities\") pod \"certified-operators-rvlz6\" (UID: \"0447b485-fa0f-470b-abb6-da0f406d0f7f\") " pod="openshift-marketplace/certified-operators-rvlz6" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.740976 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0447b485-fa0f-470b-abb6-da0f406d0f7f-catalog-content\") pod \"certified-operators-rvlz6\" (UID: \"0447b485-fa0f-470b-abb6-da0f406d0f7f\") " pod="openshift-marketplace/certified-operators-rvlz6" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.802870 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvxt9\" (UniqueName: \"kubernetes.io/projected/0447b485-fa0f-470b-abb6-da0f406d0f7f-kube-api-access-fvxt9\") pod \"certified-operators-rvlz6\" (UID: \"0447b485-fa0f-470b-abb6-da0f406d0f7f\") " pod="openshift-marketplace/certified-operators-rvlz6" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.841129 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:13 crc kubenswrapper[4702]: E1203 11:05:13.841496 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:14.341480077 +0000 UTC m=+98.177408541 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.881602 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rvlz6" Dec 03 11:05:13 crc kubenswrapper[4702]: I1203 11:05:13.959440 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:13 crc kubenswrapper[4702]: E1203 11:05:13.959884 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:14.459863572 +0000 UTC m=+98.295792036 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.072950 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:14 crc kubenswrapper[4702]: E1203 11:05:14.073337 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:14.573324255 +0000 UTC m=+98.409252719 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.172572 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 11:05:14 crc kubenswrapper[4702]: [-]has-synced failed: reason withheld Dec 03 11:05:14 crc kubenswrapper[4702]: [+]process-running ok Dec 03 11:05:14 crc kubenswrapper[4702]: healthz check failed Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.172657 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.174950 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:14 crc kubenswrapper[4702]: E1203 11:05:14.175411 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:14.6753906 +0000 UTC m=+98.511319064 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.250876 4702 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fv5c5 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.251309 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" podUID="a456460b-47a3-48ef-a98b-4f67709d5939" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.253220 4702 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6kdsp container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.253296 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" podUID="3de04148-0009-427b-8055-a1c5dadb8274" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.276525 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jsndz"] Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.277071 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:14 crc kubenswrapper[4702]: E1203 11:05:14.277631 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:14.777589461 +0000 UTC m=+98.613517935 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.311606 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pscld" event={"ID":"042c7f5b-da64-4f42-a2b2-58d04b73c12a","Type":"ContainerStarted","Data":"0ee4639a5e33a1042b34100e0021f52e2601e1fbb8df4d355713d383dec09ec9"} Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.313974 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.314010 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.322035 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.385350 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:14 crc kubenswrapper[4702]: E1203 11:05:14.386798 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:14.886772577 +0000 UTC m=+98.722701091 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.487091 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:14 crc kubenswrapper[4702]: E1203 11:05:14.487427 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:14.98741128 +0000 UTC m=+98.823339744 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.556411 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wv7dh"] Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.587479 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:14 crc kubenswrapper[4702]: E1203 11:05:14.588709 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:15.088671636 +0000 UTC m=+98.924600100 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.618871 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:14 crc kubenswrapper[4702]: E1203 11:05:14.619363 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:15.119349982 +0000 UTC m=+98.955278446 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.619480 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.641287 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.641419 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.650870 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.651167 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.654923 4702 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.666087 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mc4kd"] Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.722202 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.722599 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff0e7d31-e532-4cb8-8eed-e72b38258e9d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ff0e7d31-e532-4cb8-8eed-e72b38258e9d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.722676 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff0e7d31-e532-4cb8-8eed-e72b38258e9d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ff0e7d31-e532-4cb8-8eed-e72b38258e9d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 11:05:14 crc kubenswrapper[4702]: E1203 11:05:14.722827 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:15.222809668 +0000 UTC m=+99.058738132 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.828331 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff0e7d31-e532-4cb8-8eed-e72b38258e9d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ff0e7d31-e532-4cb8-8eed-e72b38258e9d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.828461 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.828513 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff0e7d31-e532-4cb8-8eed-e72b38258e9d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ff0e7d31-e532-4cb8-8eed-e72b38258e9d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.828580 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff0e7d31-e532-4cb8-8eed-e72b38258e9d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ff0e7d31-e532-4cb8-8eed-e72b38258e9d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 11:05:14 crc kubenswrapper[4702]: E1203 11:05:14.829312 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 11:05:15.329283745 +0000 UTC m=+99.165212209 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.912436 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff0e7d31-e532-4cb8-8eed-e72b38258e9d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ff0e7d31-e532-4cb8-8eed-e72b38258e9d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.939148 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:14 crc kubenswrapper[4702]: E1203 11:05:14.940011 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 11:05:15.439988418 +0000 UTC m=+99.275916882 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.964597 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:14 crc kubenswrapper[4702]: I1203 11:05:14.964748 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.046937 4702 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-03T11:05:14.654951278Z","Handler":null,"Name":""} Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.049636 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:15 crc kubenswrapper[4702]: E1203 11:05:15.050111 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-03 11:05:15.550097468 +0000 UTC m=+99.386025932 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-lnnbr" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.062054 4702 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.062124 4702 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.074909 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.130400 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.130466 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.130582 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.130626 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.130822 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.130840 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.148648 4702 patch_prober.go:28] interesting pod/console-f9d7485db-ccdtg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.148722 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ccdtg" 
podUID="761a509d-5cb8-4506-901c-614a7d633d39" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.151859 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.154532 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tgvqz"] Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.155661 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tgvqz" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.167885 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.169918 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.175468 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 11:05:15 crc kubenswrapper[4702]: [-]has-synced failed: reason withheld Dec 03 11:05:15 crc kubenswrapper[4702]: [+]process-running ok Dec 03 11:05:15 crc kubenswrapper[4702]: healthz check failed Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.175529 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.213952 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.256020 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llwlk\" (UniqueName: \"kubernetes.io/projected/ab09df47-f81e-4b91-aee1-89e919c149ee-kube-api-access-llwlk\") pod \"redhat-marketplace-tgvqz\" (UID: \"ab09df47-f81e-4b91-aee1-89e919c149ee\") " pod="openshift-marketplace/redhat-marketplace-tgvqz" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.256208 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab09df47-f81e-4b91-aee1-89e919c149ee-utilities\") pod \"redhat-marketplace-tgvqz\" (UID: \"ab09df47-f81e-4b91-aee1-89e919c149ee\") " pod="openshift-marketplace/redhat-marketplace-tgvqz" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.256273 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab09df47-f81e-4b91-aee1-89e919c149ee-catalog-content\") pod \"redhat-marketplace-tgvqz\" (UID: \"ab09df47-f81e-4b91-aee1-89e919c149ee\") " pod="openshift-marketplace/redhat-marketplace-tgvqz" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.256314 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.263999 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rvlz6"] Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.289631 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tgvqz"] Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.298988 4702 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.299033 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.364478 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab09df47-f81e-4b91-aee1-89e919c149ee-utilities\") pod \"redhat-marketplace-tgvqz\" (UID: \"ab09df47-f81e-4b91-aee1-89e919c149ee\") " pod="openshift-marketplace/redhat-marketplace-tgvqz" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.364838 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab09df47-f81e-4b91-aee1-89e919c149ee-catalog-content\") pod \"redhat-marketplace-tgvqz\" (UID: \"ab09df47-f81e-4b91-aee1-89e919c149ee\") " pod="openshift-marketplace/redhat-marketplace-tgvqz" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.364935 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llwlk\" (UniqueName: \"kubernetes.io/projected/ab09df47-f81e-4b91-aee1-89e919c149ee-kube-api-access-llwlk\") pod \"redhat-marketplace-tgvqz\" (UID: \"ab09df47-f81e-4b91-aee1-89e919c149ee\") " pod="openshift-marketplace/redhat-marketplace-tgvqz" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.365711 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab09df47-f81e-4b91-aee1-89e919c149ee-utilities\") pod \"redhat-marketplace-tgvqz\" (UID: \"ab09df47-f81e-4b91-aee1-89e919c149ee\") " pod="openshift-marketplace/redhat-marketplace-tgvqz" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.365951 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab09df47-f81e-4b91-aee1-89e919c149ee-catalog-content\") pod \"redhat-marketplace-tgvqz\" (UID: \"ab09df47-f81e-4b91-aee1-89e919c149ee\") " pod="openshift-marketplace/redhat-marketplace-tgvqz" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.366285 4702 generic.go:334] "Generic (PLEG): container finished" podID="aefd671c-4583-4057-aebd-1c8c7931771f" containerID="2c53fd0a05b13f0849fdc4ed2ea82be2481b51eba2842c0bf986f27120eba5f9" exitCode=0 Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.367182 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jsndz" event={"ID":"aefd671c-4583-4057-aebd-1c8c7931771f","Type":"ContainerDied","Data":"2c53fd0a05b13f0849fdc4ed2ea82be2481b51eba2842c0bf986f27120eba5f9"} Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.367218 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jsndz" event={"ID":"aefd671c-4583-4057-aebd-1c8c7931771f","Type":"ContainerStarted","Data":"0b3aff508753457253d53bff688c1c73660e4a63ca6006b9a8f80e7b516510df"} Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.376790 4702 
provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.379522 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvlz6" event={"ID":"0447b485-fa0f-470b-abb6-da0f406d0f7f","Type":"ContainerStarted","Data":"3334c1d9605018f202e9fe8c8560a079532b11ecd1985583373ed5c969f7dc37"} Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.414771 4702 generic.go:334] "Generic (PLEG): container finished" podID="68547a12-2568-4a7a-a3e2-fde07134e6ee" containerID="de0b4fa7b6ad4afd5a84bedbba55e8a802f45ec9a27be098af7c6b8f685cf8b8" exitCode=0 Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.415629 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc4kd" event={"ID":"68547a12-2568-4a7a-a3e2-fde07134e6ee","Type":"ContainerDied","Data":"de0b4fa7b6ad4afd5a84bedbba55e8a802f45ec9a27be098af7c6b8f685cf8b8"} Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.415667 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc4kd" event={"ID":"68547a12-2568-4a7a-a3e2-fde07134e6ee","Type":"ContainerStarted","Data":"111d5eb2d539cee87cb63c27856ef9b1ecc90069a76fd4ab8fcb17a8055494ac"} Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.429049 4702 generic.go:334] "Generic (PLEG): container finished" podID="085dd40d-8d1f-40ce-903b-fbed55010a29" containerID="b2fe2b4a7ffdd3847233ca0f238e40c4fdc3d26b490a36eeb334a60b03668fc4" exitCode=0 Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.429162 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wv7dh" event={"ID":"085dd40d-8d1f-40ce-903b-fbed55010a29","Type":"ContainerDied","Data":"b2fe2b4a7ffdd3847233ca0f238e40c4fdc3d26b490a36eeb334a60b03668fc4"} Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.429197 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wv7dh" event={"ID":"085dd40d-8d1f-40ce-903b-fbed55010a29","Type":"ContainerStarted","Data":"d663ec0afac96b2dd413b6c838142fdc40f4f356a9475c53e4d546ef148669d2"} Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.439592 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llwlk\" (UniqueName: \"kubernetes.io/projected/ab09df47-f81e-4b91-aee1-89e919c149ee-kube-api-access-llwlk\") pod \"redhat-marketplace-tgvqz\" (UID: \"ab09df47-f81e-4b91-aee1-89e919c149ee\") " pod="openshift-marketplace/redhat-marketplace-tgvqz" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.493072 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pscld" event={"ID":"042c7f5b-da64-4f42-a2b2-58d04b73c12a","Type":"ContainerStarted","Data":"6c914a6b412546c50067dc6351057f63f959bc01730befb56cae49a0c30ca747"} Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.496663 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hfx4c"] Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.514084 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hfx4c" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.523713 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tgvqz" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.524345 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfx4c"] Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.588250 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.608945 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-pscld" podStartSLOduration=13.608920705 podStartE2EDuration="13.608920705s" podCreationTimestamp="2025-12-03 11:05:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:15.558563397 +0000 UTC m=+99.394491861" watchObservedRunningTime="2025-12-03 11:05:15.608920705 +0000 UTC m=+99.444849169" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.673528 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74352ae3-f7cb-4e98-9f31-d0df7f9a9baf-catalog-content\") pod \"redhat-marketplace-hfx4c\" (UID: \"74352ae3-f7cb-4e98-9f31-d0df7f9a9baf\") " pod="openshift-marketplace/redhat-marketplace-hfx4c" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.673610 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpgjk\" (UniqueName: \"kubernetes.io/projected/74352ae3-f7cb-4e98-9f31-d0df7f9a9baf-kube-api-access-dpgjk\") pod \"redhat-marketplace-hfx4c\" (UID: \"74352ae3-f7cb-4e98-9f31-d0df7f9a9baf\") " pod="openshift-marketplace/redhat-marketplace-hfx4c" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.673674 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74352ae3-f7cb-4e98-9f31-d0df7f9a9baf-utilities\") pod \"redhat-marketplace-hfx4c\" (UID: \"74352ae3-f7cb-4e98-9f31-d0df7f9a9baf\") " pod="openshift-marketplace/redhat-marketplace-hfx4c" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.715863 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-lnnbr\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.743232 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.775728 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74352ae3-f7cb-4e98-9f31-d0df7f9a9baf-catalog-content\") pod \"redhat-marketplace-hfx4c\" (UID: \"74352ae3-f7cb-4e98-9f31-d0df7f9a9baf\") " pod="openshift-marketplace/redhat-marketplace-hfx4c" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.775850 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpgjk\" (UniqueName: \"kubernetes.io/projected/74352ae3-f7cb-4e98-9f31-d0df7f9a9baf-kube-api-access-dpgjk\") pod \"redhat-marketplace-hfx4c\" (UID: \"74352ae3-f7cb-4e98-9f31-d0df7f9a9baf\") " pod="openshift-marketplace/redhat-marketplace-hfx4c" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.775971 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74352ae3-f7cb-4e98-9f31-d0df7f9a9baf-utilities\") pod \"redhat-marketplace-hfx4c\" (UID: \"74352ae3-f7cb-4e98-9f31-d0df7f9a9baf\") " pod="openshift-marketplace/redhat-marketplace-hfx4c" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.776611 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74352ae3-f7cb-4e98-9f31-d0df7f9a9baf-utilities\") pod \"redhat-marketplace-hfx4c\" (UID: \"74352ae3-f7cb-4e98-9f31-d0df7f9a9baf\") " pod="openshift-marketplace/redhat-marketplace-hfx4c" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.777520 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74352ae3-f7cb-4e98-9f31-d0df7f9a9baf-catalog-content\") pod \"redhat-marketplace-hfx4c\" (UID: \"74352ae3-f7cb-4e98-9f31-d0df7f9a9baf\") " pod="openshift-marketplace/redhat-marketplace-hfx4c" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.800026 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpgjk\" (UniqueName: \"kubernetes.io/projected/74352ae3-f7cb-4e98-9f31-d0df7f9a9baf-kube-api-access-dpgjk\") pod \"redhat-marketplace-hfx4c\" (UID: \"74352ae3-f7cb-4e98-9f31-d0df7f9a9baf\") " pod="openshift-marketplace/redhat-marketplace-hfx4c" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.897805 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hfx4c" Dec 03 11:05:15 crc kubenswrapper[4702]: I1203 11:05:15.984530 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.102323 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sn9wb"] Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.107537 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sn9wb" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.110709 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.178821 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 11:05:16 crc kubenswrapper[4702]: [-]has-synced failed: reason withheld Dec 03 11:05:16 crc kubenswrapper[4702]: [+]process-running ok Dec 03 11:05:16 crc kubenswrapper[4702]: healthz check failed Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.178896 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.193531 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sn9wb"] Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.205536 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b806ad42-5c69-4ea6-8d36-fb54595132bf-catalog-content\") pod \"redhat-operators-sn9wb\" (UID: \"b806ad42-5c69-4ea6-8d36-fb54595132bf\") " pod="openshift-marketplace/redhat-operators-sn9wb" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.205641 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnt5j\" (UniqueName: \"kubernetes.io/projected/b806ad42-5c69-4ea6-8d36-fb54595132bf-kube-api-access-qnt5j\") pod \"redhat-operators-sn9wb\" (UID: \"b806ad42-5c69-4ea6-8d36-fb54595132bf\") " pod="openshift-marketplace/redhat-operators-sn9wb" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.205687 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b806ad42-5c69-4ea6-8d36-fb54595132bf-utilities\") pod \"redhat-operators-sn9wb\" (UID: \"b806ad42-5c69-4ea6-8d36-fb54595132bf\") " pod="openshift-marketplace/redhat-operators-sn9wb" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.313105 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnt5j\" (UniqueName: \"kubernetes.io/projected/b806ad42-5c69-4ea6-8d36-fb54595132bf-kube-api-access-qnt5j\") pod \"redhat-operators-sn9wb\" (UID: \"b806ad42-5c69-4ea6-8d36-fb54595132bf\") " pod="openshift-marketplace/redhat-operators-sn9wb" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.313668 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b806ad42-5c69-4ea6-8d36-fb54595132bf-utilities\") pod \"redhat-operators-sn9wb\" (UID: \"b806ad42-5c69-4ea6-8d36-fb54595132bf\") " pod="openshift-marketplace/redhat-operators-sn9wb" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.313812 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b806ad42-5c69-4ea6-8d36-fb54595132bf-catalog-content\") pod 
\"redhat-operators-sn9wb\" (UID: \"b806ad42-5c69-4ea6-8d36-fb54595132bf\") " pod="openshift-marketplace/redhat-operators-sn9wb" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.314613 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b806ad42-5c69-4ea6-8d36-fb54595132bf-catalog-content\") pod \"redhat-operators-sn9wb\" (UID: \"b806ad42-5c69-4ea6-8d36-fb54595132bf\") " pod="openshift-marketplace/redhat-operators-sn9wb" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.314846 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b806ad42-5c69-4ea6-8d36-fb54595132bf-utilities\") pod \"redhat-operators-sn9wb\" (UID: \"b806ad42-5c69-4ea6-8d36-fb54595132bf\") " pod="openshift-marketplace/redhat-operators-sn9wb" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.343674 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnt5j\" (UniqueName: \"kubernetes.io/projected/b806ad42-5c69-4ea6-8d36-fb54595132bf-kube-api-access-qnt5j\") pod \"redhat-operators-sn9wb\" (UID: \"b806ad42-5c69-4ea6-8d36-fb54595132bf\") " pod="openshift-marketplace/redhat-operators-sn9wb" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.359663 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tgvqz"] Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.478623 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sn9wb" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.481825 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fpqqw"] Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.488703 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fpqqw" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.497718 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fpqqw"] Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.528506 4702 generic.go:334] "Generic (PLEG): container finished" podID="0447b485-fa0f-470b-abb6-da0f406d0f7f" containerID="fe93cc5fb352da55f03be118476e73f7002395876016d2234a843826b2409df2" exitCode=0 Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.528623 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvlz6" event={"ID":"0447b485-fa0f-470b-abb6-da0f406d0f7f","Type":"ContainerDied","Data":"fe93cc5fb352da55f03be118476e73f7002395876016d2234a843826b2409df2"} Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.537947 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgvqz" event={"ID":"ab09df47-f81e-4b91-aee1-89e919c149ee","Type":"ContainerStarted","Data":"b3b69ab3891875aa94288bf2ef24962cefab96af624035b19356bc287a6018cc"} Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.551826 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ff0e7d31-e532-4cb8-8eed-e72b38258e9d","Type":"ContainerStarted","Data":"5ed27981b9e7ccb75cfe3da86150fc250c669b3263d5554dcf2897108e285e59"} Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.555345 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lnnbr"] Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.625271 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfx4c"] Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.626492 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb339a22-530c-412c-8d5a-8f9c56ab096b-catalog-content\") pod \"redhat-operators-fpqqw\" (UID: \"eb339a22-530c-412c-8d5a-8f9c56ab096b\") " pod="openshift-marketplace/redhat-operators-fpqqw" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.626568 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqs27\" (UniqueName: \"kubernetes.io/projected/eb339a22-530c-412c-8d5a-8f9c56ab096b-kube-api-access-fqs27\") pod \"redhat-operators-fpqqw\" (UID: \"eb339a22-530c-412c-8d5a-8f9c56ab096b\") " pod="openshift-marketplace/redhat-operators-fpqqw" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.626605 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb339a22-530c-412c-8d5a-8f9c56ab096b-utilities\") pod \"redhat-operators-fpqqw\" (UID: \"eb339a22-530c-412c-8d5a-8f9c56ab096b\") " pod="openshift-marketplace/redhat-operators-fpqqw" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.729476 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqs27\" (UniqueName: \"kubernetes.io/projected/eb339a22-530c-412c-8d5a-8f9c56ab096b-kube-api-access-fqs27\") pod \"redhat-operators-fpqqw\" (UID: \"eb339a22-530c-412c-8d5a-8f9c56ab096b\") " pod="openshift-marketplace/redhat-operators-fpqqw" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.729538 4702 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb339a22-530c-412c-8d5a-8f9c56ab096b-utilities\") pod \"redhat-operators-fpqqw\" (UID: \"eb339a22-530c-412c-8d5a-8f9c56ab096b\") " pod="openshift-marketplace/redhat-operators-fpqqw" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.729621 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb339a22-530c-412c-8d5a-8f9c56ab096b-catalog-content\") pod \"redhat-operators-fpqqw\" (UID: \"eb339a22-530c-412c-8d5a-8f9c56ab096b\") " pod="openshift-marketplace/redhat-operators-fpqqw" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.730078 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb339a22-530c-412c-8d5a-8f9c56ab096b-catalog-content\") pod \"redhat-operators-fpqqw\" (UID: \"eb339a22-530c-412c-8d5a-8f9c56ab096b\") " pod="openshift-marketplace/redhat-operators-fpqqw" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.730243 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb339a22-530c-412c-8d5a-8f9c56ab096b-utilities\") pod \"redhat-operators-fpqqw\" (UID: \"eb339a22-530c-412c-8d5a-8f9c56ab096b\") " pod="openshift-marketplace/redhat-operators-fpqqw" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.761430 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqs27\" (UniqueName: \"kubernetes.io/projected/eb339a22-530c-412c-8d5a-8f9c56ab096b-kube-api-access-fqs27\") pod \"redhat-operators-fpqqw\" (UID: \"eb339a22-530c-412c-8d5a-8f9c56ab096b\") " pod="openshift-marketplace/redhat-operators-fpqqw" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.842770 4702 patch_prober.go:28] interesting pod/apiserver-76f77b778f-5xk7q container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 03 11:05:16 crc kubenswrapper[4702]: [+]log ok Dec 03 11:05:16 crc kubenswrapper[4702]: [+]etcd ok Dec 03 11:05:16 crc kubenswrapper[4702]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 03 11:05:16 crc kubenswrapper[4702]: [+]poststarthook/generic-apiserver-start-informers ok Dec 03 11:05:16 crc kubenswrapper[4702]: [+]poststarthook/max-in-flight-filter ok Dec 03 11:05:16 crc kubenswrapper[4702]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 03 11:05:16 crc kubenswrapper[4702]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 03 11:05:16 crc kubenswrapper[4702]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 03 11:05:16 crc kubenswrapper[4702]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 03 11:05:16 crc kubenswrapper[4702]: [+]poststarthook/project.openshift.io-projectcache ok Dec 03 11:05:16 crc kubenswrapper[4702]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 03 11:05:16 crc kubenswrapper[4702]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Dec 03 11:05:16 crc kubenswrapper[4702]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 03 11:05:16 crc kubenswrapper[4702]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 03 11:05:16 crc kubenswrapper[4702]: livez check failed Dec 03 
11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.842845 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" podUID="adc43b14-86cb-4ff5-b7fb-a9ba32cde631" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.861379 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fpqqw" Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.912162 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sn9wb"] Dec 03 11:05:16 crc kubenswrapper[4702]: I1203 11:05:16.955880 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 03 11:05:17 crc kubenswrapper[4702]: I1203 11:05:17.175167 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 11:05:17 crc kubenswrapper[4702]: [-]has-synced failed: reason withheld Dec 03 11:05:17 crc kubenswrapper[4702]: [+]process-running ok Dec 03 11:05:17 crc kubenswrapper[4702]: healthz check failed Dec 03 11:05:17 crc kubenswrapper[4702]: I1203 11:05:17.175527 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 11:05:17 crc kubenswrapper[4702]: I1203 11:05:17.266549 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fpqqw"] Dec 03 11:05:17 crc kubenswrapper[4702]: W1203 11:05:17.298710 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb339a22_530c_412c_8d5a_8f9c56ab096b.slice/crio-691efbc93d85d643d5ec1eef619c643f7ea59e6fb3e7fa439d39062a35c148d6 WatchSource:0}: Error finding container 691efbc93d85d643d5ec1eef619c643f7ea59e6fb3e7fa439d39062a35c148d6: Status 404 returned error can't find the container with id 691efbc93d85d643d5ec1eef619c643f7ea59e6fb3e7fa439d39062a35c148d6 Dec 03 11:05:17 crc kubenswrapper[4702]: I1203 11:05:17.586145 4702 generic.go:334] "Generic (PLEG): container finished" podID="74352ae3-f7cb-4e98-9f31-d0df7f9a9baf" containerID="446777d93e94054be046cbfd4589a5862c471836c0610ae4c70f2543dbfce409" exitCode=0 Dec 03 11:05:17 crc kubenswrapper[4702]: I1203 11:05:17.586737 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfx4c" event={"ID":"74352ae3-f7cb-4e98-9f31-d0df7f9a9baf","Type":"ContainerDied","Data":"446777d93e94054be046cbfd4589a5862c471836c0610ae4c70f2543dbfce409"} Dec 03 11:05:17 crc kubenswrapper[4702]: I1203 11:05:17.586796 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfx4c" event={"ID":"74352ae3-f7cb-4e98-9f31-d0df7f9a9baf","Type":"ContainerStarted","Data":"53ce064e8b06f0a56b5a91f99c97fced35170aeb9056905a681968deedf43cfd"} Dec 03 11:05:17 crc kubenswrapper[4702]: I1203 11:05:17.591286 4702 generic.go:334] "Generic (PLEG): container finished" podID="b806ad42-5c69-4ea6-8d36-fb54595132bf" 
containerID="0ea81f1d0dda428556c666b77bc3c1ce1487d4ff4f3243de8b1884792b835695" exitCode=0 Dec 03 11:05:17 crc kubenswrapper[4702]: I1203 11:05:17.591887 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn9wb" event={"ID":"b806ad42-5c69-4ea6-8d36-fb54595132bf","Type":"ContainerDied","Data":"0ea81f1d0dda428556c666b77bc3c1ce1487d4ff4f3243de8b1884792b835695"} Dec 03 11:05:17 crc kubenswrapper[4702]: I1203 11:05:17.591968 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn9wb" event={"ID":"b806ad42-5c69-4ea6-8d36-fb54595132bf","Type":"ContainerStarted","Data":"997daa925690d08c1123ea85d5ec4a5586d0ac8810f0e88fbdd1a3adc5aaa02a"} Dec 03 11:05:17 crc kubenswrapper[4702]: I1203 11:05:17.602161 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpqqw" event={"ID":"eb339a22-530c-412c-8d5a-8f9c56ab096b","Type":"ContainerStarted","Data":"691efbc93d85d643d5ec1eef619c643f7ea59e6fb3e7fa439d39062a35c148d6"} Dec 03 11:05:17 crc kubenswrapper[4702]: I1203 11:05:17.612027 4702 generic.go:334] "Generic (PLEG): container finished" podID="ff0e7d31-e532-4cb8-8eed-e72b38258e9d" containerID="a055814e680cd5f37a746fcca075536d6d5a0a2aba969783d000a4c84ebb253b" exitCode=0 Dec 03 11:05:17 crc kubenswrapper[4702]: I1203 11:05:17.612117 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ff0e7d31-e532-4cb8-8eed-e72b38258e9d","Type":"ContainerDied","Data":"a055814e680cd5f37a746fcca075536d6d5a0a2aba969783d000a4c84ebb253b"} Dec 03 11:05:17 crc kubenswrapper[4702]: I1203 11:05:17.628499 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n" event={"ID":"77a69d25-2384-466b-b284-e36e979597b4","Type":"ContainerDied","Data":"b1abdb7d1dde37410aaaab35ec091adbaa25c05e540afd9716f5f7c13ddcd219"} Dec 03 11:05:17 crc kubenswrapper[4702]: I1203 11:05:17.628649 4702 generic.go:334] "Generic (PLEG): container finished" podID="77a69d25-2384-466b-b284-e36e979597b4" containerID="b1abdb7d1dde37410aaaab35ec091adbaa25c05e540afd9716f5f7c13ddcd219" exitCode=0 Dec 03 11:05:17 crc kubenswrapper[4702]: I1203 11:05:17.633371 4702 generic.go:334] "Generic (PLEG): container finished" podID="ab09df47-f81e-4b91-aee1-89e919c149ee" containerID="5ccb3d123d5182bfc407c4e3bf17f905283c03620d33c2cb4cdc811a9adcdbf7" exitCode=0 Dec 03 11:05:17 crc kubenswrapper[4702]: I1203 11:05:17.633495 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgvqz" event={"ID":"ab09df47-f81e-4b91-aee1-89e919c149ee","Type":"ContainerDied","Data":"5ccb3d123d5182bfc407c4e3bf17f905283c03620d33c2cb4cdc811a9adcdbf7"} Dec 03 11:05:17 crc kubenswrapper[4702]: I1203 11:05:17.687281 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" event={"ID":"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b","Type":"ContainerStarted","Data":"9cc460beecb02e23233dafa6af01c5deecb7be8f23e85f7b31f51207d3014214"} Dec 03 11:05:17 crc kubenswrapper[4702]: I1203 11:05:17.687343 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" event={"ID":"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b","Type":"ContainerStarted","Data":"e99a811080cdd3677dc89f6831b1136fca6f5214a9099b8babc6b9b78d63a369"} Dec 03 11:05:17 crc kubenswrapper[4702]: I1203 11:05:17.688262 4702 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:05:17 crc kubenswrapper[4702]: I1203 11:05:17.768869 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" podStartSLOduration=78.768842904 podStartE2EDuration="1m18.768842904s" podCreationTimestamp="2025-12-03 11:03:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:17.722312796 +0000 UTC m=+101.558241250" watchObservedRunningTime="2025-12-03 11:05:17.768842904 +0000 UTC m=+101.604771368" Dec 03 11:05:18 crc kubenswrapper[4702]: I1203 11:05:18.094010 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs\") pod \"network-metrics-daemon-6jzjr\" (UID: \"11bb1bad-4b90-4366-9187-8d27480f670b\") " pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:05:18 crc kubenswrapper[4702]: I1203 11:05:18.103585 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/11bb1bad-4b90-4366-9187-8d27480f670b-metrics-certs\") pod \"network-metrics-daemon-6jzjr\" (UID: \"11bb1bad-4b90-4366-9187-8d27480f670b\") " pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:05:18 crc kubenswrapper[4702]: I1203 11:05:18.176162 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 11:05:18 crc kubenswrapper[4702]: [-]has-synced failed: reason withheld Dec 03 11:05:18 crc kubenswrapper[4702]: [+]process-running ok Dec 03 11:05:18 crc kubenswrapper[4702]: healthz check failed Dec 03 11:05:18 crc kubenswrapper[4702]: I1203 11:05:18.176253 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 11:05:18 crc kubenswrapper[4702]: I1203 11:05:18.377289 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6jzjr" Dec 03 11:05:18 crc kubenswrapper[4702]: I1203 11:05:18.717430 4702 generic.go:334] "Generic (PLEG): container finished" podID="eb339a22-530c-412c-8d5a-8f9c56ab096b" containerID="a129c7b8d2bd8cdedb556a73ca49a884a02c0d63b5e8661d280b9f9444e9e2b2" exitCode=0 Dec 03 11:05:18 crc kubenswrapper[4702]: I1203 11:05:18.719134 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpqqw" event={"ID":"eb339a22-530c-412c-8d5a-8f9c56ab096b","Type":"ContainerDied","Data":"a129c7b8d2bd8cdedb556a73ca49a884a02c0d63b5e8661d280b9f9444e9e2b2"} Dec 03 11:05:18 crc kubenswrapper[4702]: I1203 11:05:18.843983 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6jzjr"] Dec 03 11:05:18 crc kubenswrapper[4702]: W1203 11:05:18.871677 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11bb1bad_4b90_4366_9187_8d27480f670b.slice/crio-8a93dfe898eac8db51f9eb39c49bc107cef771e231cd631010bbe1902fdf5f81 WatchSource:0}: Error finding container 8a93dfe898eac8db51f9eb39c49bc107cef771e231cd631010bbe1902fdf5f81: Status 404 returned error can't find the container with id 8a93dfe898eac8db51f9eb39c49bc107cef771e231cd631010bbe1902fdf5f81 Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.174037 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 11:05:19 crc kubenswrapper[4702]: [-]has-synced failed: reason withheld Dec 03 11:05:19 crc kubenswrapper[4702]: [+]process-running ok Dec 03 11:05:19 crc kubenswrapper[4702]: healthz check failed Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.174108 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.231796 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n" Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.304795 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.319502 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77a69d25-2384-466b-b284-e36e979597b4-secret-volume\") pod \"77a69d25-2384-466b-b284-e36e979597b4\" (UID: \"77a69d25-2384-466b-b284-e36e979597b4\") " Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.319598 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77a69d25-2384-466b-b284-e36e979597b4-config-volume\") pod \"77a69d25-2384-466b-b284-e36e979597b4\" (UID: \"77a69d25-2384-466b-b284-e36e979597b4\") " Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.319637 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcdjp\" (UniqueName: \"kubernetes.io/projected/77a69d25-2384-466b-b284-e36e979597b4-kube-api-access-zcdjp\") pod \"77a69d25-2384-466b-b284-e36e979597b4\" (UID: \"77a69d25-2384-466b-b284-e36e979597b4\") " Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.321440 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77a69d25-2384-466b-b284-e36e979597b4-config-volume" (OuterVolumeSpecName: "config-volume") pod "77a69d25-2384-466b-b284-e36e979597b4" (UID: "77a69d25-2384-466b-b284-e36e979597b4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.326852 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77a69d25-2384-466b-b284-e36e979597b4-kube-api-access-zcdjp" (OuterVolumeSpecName: "kube-api-access-zcdjp") pod "77a69d25-2384-466b-b284-e36e979597b4" (UID: "77a69d25-2384-466b-b284-e36e979597b4"). InnerVolumeSpecName "kube-api-access-zcdjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.327351 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77a69d25-2384-466b-b284-e36e979597b4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "77a69d25-2384-466b-b284-e36e979597b4" (UID: "77a69d25-2384-466b-b284-e36e979597b4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.421246 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff0e7d31-e532-4cb8-8eed-e72b38258e9d-kubelet-dir\") pod \"ff0e7d31-e532-4cb8-8eed-e72b38258e9d\" (UID: \"ff0e7d31-e532-4cb8-8eed-e72b38258e9d\") " Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.421323 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff0e7d31-e532-4cb8-8eed-e72b38258e9d-kube-api-access\") pod \"ff0e7d31-e532-4cb8-8eed-e72b38258e9d\" (UID: \"ff0e7d31-e532-4cb8-8eed-e72b38258e9d\") " Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.421348 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ff0e7d31-e532-4cb8-8eed-e72b38258e9d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ff0e7d31-e532-4cb8-8eed-e72b38258e9d" (UID: "ff0e7d31-e532-4cb8-8eed-e72b38258e9d"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.421689 4702 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/77a69d25-2384-466b-b284-e36e979597b4-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.421705 4702 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ff0e7d31-e532-4cb8-8eed-e72b38258e9d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.421715 4702 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77a69d25-2384-466b-b284-e36e979597b4-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.421725 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcdjp\" (UniqueName: \"kubernetes.io/projected/77a69d25-2384-466b-b284-e36e979597b4-kube-api-access-zcdjp\") on node \"crc\" DevicePath \"\"" Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.430004 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff0e7d31-e532-4cb8-8eed-e72b38258e9d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ff0e7d31-e532-4cb8-8eed-e72b38258e9d" (UID: "ff0e7d31-e532-4cb8-8eed-e72b38258e9d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.529567 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ff0e7d31-e532-4cb8-8eed-e72b38258e9d-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.734087 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6jzjr" event={"ID":"11bb1bad-4b90-4366-9187-8d27480f670b","Type":"ContainerStarted","Data":"8a93dfe898eac8db51f9eb39c49bc107cef771e231cd631010bbe1902fdf5f81"} Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.752880 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ff0e7d31-e532-4cb8-8eed-e72b38258e9d","Type":"ContainerDied","Data":"5ed27981b9e7ccb75cfe3da86150fc250c669b3263d5554dcf2897108e285e59"} Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.752941 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ed27981b9e7ccb75cfe3da86150fc250c669b3263d5554dcf2897108e285e59" Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.753036 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.777167 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n" Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.777321 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n" event={"ID":"77a69d25-2384-466b-b284-e36e979597b4","Type":"ContainerDied","Data":"9f313546a49233d1908e3f6bfa418f5cb8d5cf67b481edfec850145c77e9bd9f"} Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.777354 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f313546a49233d1908e3f6bfa418f5cb8d5cf67b481edfec850145c77e9bd9f" Dec 03 11:05:19 crc kubenswrapper[4702]: I1203 11:05:19.925546 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:20 crc kubenswrapper[4702]: I1203 11:05:20.026766 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" Dec 03 11:05:20 crc kubenswrapper[4702]: I1203 11:05:20.176448 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 11:05:20 crc kubenswrapper[4702]: [-]has-synced failed: reason withheld Dec 03 11:05:20 crc kubenswrapper[4702]: [+]process-running ok Dec 03 11:05:20 crc kubenswrapper[4702]: healthz check failed Dec 03 11:05:20 crc kubenswrapper[4702]: I1203 11:05:20.176546 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 11:05:20 crc kubenswrapper[4702]: I1203 11:05:20.644015 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-f8vb8" Dec 03 11:05:20 crc kubenswrapper[4702]: I1203 11:05:20.815336 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 11:05:20 crc kubenswrapper[4702]: E1203 11:05:20.815695 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77a69d25-2384-466b-b284-e36e979597b4" containerName="collect-profiles" Dec 03 11:05:20 crc kubenswrapper[4702]: I1203 11:05:20.815709 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="77a69d25-2384-466b-b284-e36e979597b4" containerName="collect-profiles" Dec 03 11:05:20 crc kubenswrapper[4702]: E1203 11:05:20.817199 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0e7d31-e532-4cb8-8eed-e72b38258e9d" containerName="pruner" Dec 03 11:05:20 crc kubenswrapper[4702]: I1203 11:05:20.817220 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0e7d31-e532-4cb8-8eed-e72b38258e9d" containerName="pruner" Dec 03 11:05:20 crc kubenswrapper[4702]: I1203 11:05:20.817526 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff0e7d31-e532-4cb8-8eed-e72b38258e9d" containerName="pruner" Dec 03 11:05:20 crc kubenswrapper[4702]: I1203 11:05:20.817553 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="77a69d25-2384-466b-b284-e36e979597b4" containerName="collect-profiles" Dec 03 11:05:20 crc kubenswrapper[4702]: I1203 11:05:20.818032 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 11:05:20 crc kubenswrapper[4702]: I1203 11:05:20.823938 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 11:05:20 crc kubenswrapper[4702]: I1203 11:05:20.824335 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 11:05:20 crc kubenswrapper[4702]: I1203 11:05:20.829412 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 11:05:20 crc kubenswrapper[4702]: I1203 11:05:20.914725 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6jzjr" event={"ID":"11bb1bad-4b90-4366-9187-8d27480f670b","Type":"ContainerStarted","Data":"0476dadf1714ad2ca8a84afe66216eaaca9d270765ad02ab113fd0d7e4fabccf"} Dec 03 11:05:20 crc kubenswrapper[4702]: I1203 11:05:20.990821 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4ca7269-2a74-4362-ab12-3064f5f9d7a2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f4ca7269-2a74-4362-ab12-3064f5f9d7a2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 11:05:20 crc kubenswrapper[4702]: I1203 11:05:20.991477 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4ca7269-2a74-4362-ab12-3064f5f9d7a2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f4ca7269-2a74-4362-ab12-3064f5f9d7a2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 11:05:21 crc kubenswrapper[4702]: I1203 11:05:21.093179 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4ca7269-2a74-4362-ab12-3064f5f9d7a2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f4ca7269-2a74-4362-ab12-3064f5f9d7a2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 11:05:21 crc kubenswrapper[4702]: I1203 11:05:21.093679 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4ca7269-2a74-4362-ab12-3064f5f9d7a2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f4ca7269-2a74-4362-ab12-3064f5f9d7a2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 11:05:21 crc kubenswrapper[4702]: I1203 11:05:21.094555 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4ca7269-2a74-4362-ab12-3064f5f9d7a2-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"f4ca7269-2a74-4362-ab12-3064f5f9d7a2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 11:05:21 crc kubenswrapper[4702]: I1203 11:05:21.118780 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4ca7269-2a74-4362-ab12-3064f5f9d7a2-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"f4ca7269-2a74-4362-ab12-3064f5f9d7a2\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 11:05:21 crc kubenswrapper[4702]: I1203 11:05:21.149018 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 11:05:21 crc kubenswrapper[4702]: I1203 11:05:21.170771 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 11:05:21 crc kubenswrapper[4702]: [-]has-synced failed: reason withheld Dec 03 11:05:21 crc kubenswrapper[4702]: [+]process-running ok Dec 03 11:05:21 crc kubenswrapper[4702]: healthz check failed Dec 03 11:05:21 crc kubenswrapper[4702]: I1203 11:05:21.170858 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 11:05:21 crc kubenswrapper[4702]: I1203 11:05:21.639098 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 11:05:21 crc kubenswrapper[4702]: W1203 11:05:21.661942 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf4ca7269_2a74_4362_ab12_3064f5f9d7a2.slice/crio-8b727eb96ad41e52b443465fa6cf6207496238c57e96e86145de64ec25d8ad42 WatchSource:0}: Error finding container 8b727eb96ad41e52b443465fa6cf6207496238c57e96e86145de64ec25d8ad42: Status 404 returned error can't find the container with id 8b727eb96ad41e52b443465fa6cf6207496238c57e96e86145de64ec25d8ad42 Dec 03 11:05:21 crc kubenswrapper[4702]: I1203 11:05:21.939769 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f4ca7269-2a74-4362-ab12-3064f5f9d7a2","Type":"ContainerStarted","Data":"8b727eb96ad41e52b443465fa6cf6207496238c57e96e86145de64ec25d8ad42"} Dec 03 11:05:21 crc kubenswrapper[4702]: I1203 11:05:21.956635 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6jzjr" event={"ID":"11bb1bad-4b90-4366-9187-8d27480f670b","Type":"ContainerStarted","Data":"88761c1b1ff4ab811bb60110af7bcc020d90a70bc3085d3d3daffeb7f7bfacc0"} Dec 03 11:05:22 crc kubenswrapper[4702]: I1203 11:05:22.038488 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6jzjr" podStartSLOduration=84.03846992 podStartE2EDuration="1m24.03846992s" podCreationTimestamp="2025-12-03 11:03:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:22.029055014 +0000 UTC m=+105.864983478" watchObservedRunningTime="2025-12-03 11:05:22.03846992 +0000 UTC m=+105.874398384" Dec 03 11:05:22 crc kubenswrapper[4702]: I1203 11:05:22.187145 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 11:05:22 crc kubenswrapper[4702]: [-]has-synced failed: reason withheld Dec 03 11:05:22 crc kubenswrapper[4702]: [+]process-running ok Dec 03 11:05:22 crc kubenswrapper[4702]: healthz check failed Dec 03 11:05:22 crc kubenswrapper[4702]: I1203 11:05:22.187213 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 11:05:23 crc kubenswrapper[4702]: I1203 11:05:23.170181 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 11:05:23 crc kubenswrapper[4702]: [-]has-synced failed: reason withheld Dec 03 11:05:23 crc kubenswrapper[4702]: [+]process-running ok Dec 03 11:05:23 crc kubenswrapper[4702]: healthz check failed Dec 03 11:05:23 crc kubenswrapper[4702]: I1203 11:05:23.170263 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 11:05:24 crc kubenswrapper[4702]: I1203 11:05:24.178849 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 11:05:24 crc kubenswrapper[4702]: [-]has-synced failed: reason withheld Dec 03 11:05:24 crc kubenswrapper[4702]: [+]process-running ok Dec 03 11:05:24 crc kubenswrapper[4702]: healthz check failed Dec 03 11:05:24 crc kubenswrapper[4702]: I1203 11:05:24.179288 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 11:05:24 crc kubenswrapper[4702]: I1203 11:05:24.900572 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f4ca7269-2a74-4362-ab12-3064f5f9d7a2","Type":"ContainerStarted","Data":"e2cdd91bfd1ea708c40feada5edfa666d7b7fd421eea423c03eda2e71807f135"} Dec 03 11:05:24 crc kubenswrapper[4702]: I1203 11:05:24.947608 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=4.947583561 podStartE2EDuration="4.947583561s" podCreationTimestamp="2025-12-03 11:05:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:05:24.944743717 +0000 UTC m=+108.780672181" watchObservedRunningTime="2025-12-03 11:05:24.947583561 +0000 UTC m=+108.783512025" Dec 03 11:05:25 crc kubenswrapper[4702]: I1203 11:05:25.128315 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 11:05:25 crc kubenswrapper[4702]: I1203 11:05:25.128348 4702 patch_prober.go:28] interesting pod/console-f9d7485db-ccdtg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Dec 03 11:05:25 crc kubenswrapper[4702]: I1203 11:05:25.128383 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 11:05:25 crc kubenswrapper[4702]: I1203 11:05:25.128427 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ccdtg" podUID="761a509d-5cb8-4506-901c-614a7d633d39" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Dec 03 11:05:25 crc kubenswrapper[4702]: I1203 11:05:25.128461 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 11:05:25 crc kubenswrapper[4702]: I1203 11:05:25.128484 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 11:05:25 crc kubenswrapper[4702]: I1203 11:05:25.209652 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 11:05:25 crc kubenswrapper[4702]: [-]has-synced failed: reason withheld Dec 03 11:05:25 crc kubenswrapper[4702]: [+]process-running ok Dec 03 11:05:25 crc kubenswrapper[4702]: healthz check failed Dec 03 11:05:25 crc kubenswrapper[4702]: I1203 11:05:25.209817 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 11:05:26 crc kubenswrapper[4702]: I1203 11:05:26.325233 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 11:05:26 crc kubenswrapper[4702]: [-]has-synced failed: reason withheld Dec 03 11:05:26 crc kubenswrapper[4702]: [+]process-running ok Dec 03 11:05:26 crc kubenswrapper[4702]: healthz check failed Dec 03 11:05:26 crc kubenswrapper[4702]: I1203 11:05:26.325292 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 11:05:26 crc kubenswrapper[4702]: I1203 11:05:26.339575 4702 generic.go:334] "Generic (PLEG): container finished" podID="f4ca7269-2a74-4362-ab12-3064f5f9d7a2" containerID="e2cdd91bfd1ea708c40feada5edfa666d7b7fd421eea423c03eda2e71807f135" exitCode=0 Dec 03 11:05:26 crc kubenswrapper[4702]: I1203 11:05:26.339623 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f4ca7269-2a74-4362-ab12-3064f5f9d7a2","Type":"ContainerDied","Data":"e2cdd91bfd1ea708c40feada5edfa666d7b7fd421eea423c03eda2e71807f135"} Dec 03 11:05:27 crc kubenswrapper[4702]: I1203 11:05:27.172150 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 11:05:27 crc kubenswrapper[4702]: [-]has-synced failed: reason withheld Dec 03 11:05:27 crc kubenswrapper[4702]: [+]process-running ok Dec 03 11:05:27 crc kubenswrapper[4702]: healthz check failed Dec 03 11:05:27 crc kubenswrapper[4702]: I1203 11:05:27.172512 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 11:05:28 crc kubenswrapper[4702]: I1203 11:05:28.171690 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 11:05:28 crc kubenswrapper[4702]: I1203 11:05:28.176821 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 11:05:29 crc kubenswrapper[4702]: I1203 11:05:29.599086 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" Dec 03 11:05:35 crc kubenswrapper[4702]: I1203 11:05:35.146291 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 11:05:35 crc kubenswrapper[4702]: I1203 11:05:35.146816 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 11:05:35 crc kubenswrapper[4702]: I1203 11:05:35.147154 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 11:05:35 crc kubenswrapper[4702]: I1203 11:05:35.147189 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 11:05:35 crc kubenswrapper[4702]: I1203 11:05:35.147244 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-hndf6" Dec 03 11:05:35 crc kubenswrapper[4702]: I1203 11:05:35.148103 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 11:05:35 crc kubenswrapper[4702]: I1203 11:05:35.148208 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" 
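The records above are the kubelet's HTTP prober at work: patch_prober.go logs the start of the probe response body (including healthz detail lines such as [-]backend-http failed), and prober.go:107 records the per-container verdict. A minimal sketch of a probe spec consistent with the download-server records, assuming k8s.io/api and k8s.io/apimachinery are on the module path; the port and path come from the logged probe URL, while periodSeconds and failureThreshold are illustrative assumptions (the log only shows the roughly 10-second failure cadence):

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
	// Probe shape consistent with the logged URL "http://10.217.0.25:8080/"
	// for container download-server. Thresholds below are assumed values,
	// not taken from the log.
	readiness := &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Path: "/",
				Port: intstr.FromInt(8080),
			},
		},
		PeriodSeconds:    10, // assumed; matches the ~10 s spacing of the failure records
		FailureThreshold: 3,  // assumed
	}
	fmt.Printf("%+v\n", readiness)
}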
Dec 03 11:05:35 crc kubenswrapper[4702]: I1203 11:05:35.148216 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"b19709c46ee9bdb2df0ae0064f1db24583ec1036f7312805bc2da46e7287368d"} pod="openshift-console/downloads-7954f5f757-hndf6" containerMessage="Container download-server failed liveness probe, will be restarted"
Dec 03 11:05:35 crc kubenswrapper[4702]: I1203 11:05:35.148400 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" containerID="cri-o://b19709c46ee9bdb2df0ae0064f1db24583ec1036f7312805bc2da46e7287368d" gracePeriod=2
Dec 03 11:05:35 crc kubenswrapper[4702]: I1203 11:05:35.333691 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-ccdtg"
Dec 03 11:05:35 crc kubenswrapper[4702]: I1203 11:05:35.339611 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-ccdtg"
Dec 03 11:05:35 crc kubenswrapper[4702]: I1203 11:05:35.753573 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr"
Dec 03 11:05:37 crc kubenswrapper[4702]: I1203 11:05:37.617351 4702 generic.go:334] "Generic (PLEG): container finished" podID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerID="b19709c46ee9bdb2df0ae0064f1db24583ec1036f7312805bc2da46e7287368d" exitCode=0
Dec 03 11:05:37 crc kubenswrapper[4702]: I1203 11:05:37.617434 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hndf6" event={"ID":"05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8","Type":"ContainerDied","Data":"b19709c46ee9bdb2df0ae0064f1db24583ec1036f7312805bc2da46e7287368d"}
Dec 03 11:05:45 crc kubenswrapper[4702]: I1203 11:05:45.127736 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Dec 03 11:05:45 crc kubenswrapper[4702]: I1203 11:05:45.128405 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Dec 03 11:05:45 crc kubenswrapper[4702]: I1203 11:05:45.454632 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 03 11:05:45 crc kubenswrapper[4702]: I1203 11:05:45.554592 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4ca7269-2a74-4362-ab12-3064f5f9d7a2-kubelet-dir\") pod \"f4ca7269-2a74-4362-ab12-3064f5f9d7a2\" (UID: \"f4ca7269-2a74-4362-ab12-3064f5f9d7a2\") "
Dec 03 11:05:45 crc kubenswrapper[4702]: I1203 11:05:45.554702 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4ca7269-2a74-4362-ab12-3064f5f9d7a2-kube-api-access\") pod \"f4ca7269-2a74-4362-ab12-3064f5f9d7a2\" (UID: \"f4ca7269-2a74-4362-ab12-3064f5f9d7a2\") "
Dec 03 11:05:45 crc kubenswrapper[4702]: I1203 11:05:45.571665 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4ca7269-2a74-4362-ab12-3064f5f9d7a2-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f4ca7269-2a74-4362-ab12-3064f5f9d7a2" (UID: "f4ca7269-2a74-4362-ab12-3064f5f9d7a2"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 11:05:45 crc kubenswrapper[4702]: I1203 11:05:45.608922 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7"
Dec 03 11:05:45 crc kubenswrapper[4702]: I1203 11:05:45.626940 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4ca7269-2a74-4362-ab12-3064f5f9d7a2-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f4ca7269-2a74-4362-ab12-3064f5f9d7a2" (UID: "f4ca7269-2a74-4362-ab12-3064f5f9d7a2"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:05:45 crc kubenswrapper[4702]: I1203 11:05:45.656133 4702 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f4ca7269-2a74-4362-ab12-3064f5f9d7a2-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 03 11:05:45 crc kubenswrapper[4702]: I1203 11:05:45.656169 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f4ca7269-2a74-4362-ab12-3064f5f9d7a2-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 03 11:05:45 crc kubenswrapper[4702]: I1203 11:05:45.696722 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"f4ca7269-2a74-4362-ab12-3064f5f9d7a2","Type":"ContainerDied","Data":"8b727eb96ad41e52b443465fa6cf6207496238c57e96e86145de64ec25d8ad42"}
Dec 03 11:05:45 crc kubenswrapper[4702]: I1203 11:05:45.696792 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b727eb96ad41e52b443465fa6cf6207496238c57e96e86145de64ec25d8ad42"
Dec 03 11:05:45 crc kubenswrapper[4702]: I1203 11:05:45.696893 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
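The kuberuntime_manager.go:1027 and kuberuntime_container.go:808 records above show the liveness-failure path: the container is marked "will be restarted" and then stopped with gracePeriod=2. A sketch of that two-phase stop, under the assumption that the 2-second window comes from the pod's termination grace period; the real kubelet stops containers through the CRI StopContainer call, not raw signals:

package main

import (
	"os/exec"
	"syscall"
	"time"
)

// killWithGrace mirrors the two-phase stop logged above: SIGTERM, wait up
// to the grace period, then a hard kill. Purely illustrative.
func killWithGrace(cmd *exec.Cmd, grace time.Duration) {
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()
	cmd.Process.Signal(syscall.SIGTERM) // polite stop
	select {
	case <-done: // exited within the grace window
	case <-time.After(grace):
		cmd.Process.Kill() // SIGKILL after the window, as with gracePeriod=2
	}
}

func main() {
	cmd := exec.Command("sleep", "60") // stand-in workload
	cmd.Start()
	killWithGrace(cmd, 2*time.Second)
}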
Dec 03 11:05:55 crc kubenswrapper[4702]: I1203 11:05:55.171938 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Dec 03 11:05:55 crc kubenswrapper[4702]: I1203 11:05:55.172566 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Dec 03 11:05:56 crc kubenswrapper[4702]: I1203 11:05:56.607559 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 03 11:05:56 crc kubenswrapper[4702]: E1203 11:05:56.607923 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4ca7269-2a74-4362-ab12-3064f5f9d7a2" containerName="pruner"
Dec 03 11:05:56 crc kubenswrapper[4702]: I1203 11:05:56.607940 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4ca7269-2a74-4362-ab12-3064f5f9d7a2" containerName="pruner"
Dec 03 11:05:56 crc kubenswrapper[4702]: I1203 11:05:56.608063 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4ca7269-2a74-4362-ab12-3064f5f9d7a2" containerName="pruner"
Dec 03 11:05:56 crc kubenswrapper[4702]: I1203 11:05:56.608482 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 11:05:56 crc kubenswrapper[4702]: I1203 11:05:56.610663 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Dec 03 11:05:56 crc kubenswrapper[4702]: I1203 11:05:56.611315 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Dec 03 11:05:56 crc kubenswrapper[4702]: I1203 11:05:56.619222 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 03 11:05:56 crc kubenswrapper[4702]: I1203 11:05:56.689174 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/113264ad-a66d-41d5-95a7-0fd255036eee-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"113264ad-a66d-41d5-95a7-0fd255036eee\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 11:05:56 crc kubenswrapper[4702]: I1203 11:05:56.689256 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/113264ad-a66d-41d5-95a7-0fd255036eee-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"113264ad-a66d-41d5-95a7-0fd255036eee\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 11:05:56 crc kubenswrapper[4702]: I1203 11:05:56.791027 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/113264ad-a66d-41d5-95a7-0fd255036eee-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"113264ad-a66d-41d5-95a7-0fd255036eee\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 11:05:56 crc kubenswrapper[4702]: I1203 11:05:56.791094 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/113264ad-a66d-41d5-95a7-0fd255036eee-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"113264ad-a66d-41d5-95a7-0fd255036eee\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 11:05:56 crc kubenswrapper[4702]: I1203 11:05:56.791540 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/113264ad-a66d-41d5-95a7-0fd255036eee-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"113264ad-a66d-41d5-95a7-0fd255036eee\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 11:05:56 crc kubenswrapper[4702]: I1203 11:05:56.811111 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/113264ad-a66d-41d5-95a7-0fd255036eee-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"113264ad-a66d-41d5-95a7-0fd255036eee\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 11:05:56 crc kubenswrapper[4702]: I1203 11:05:56.935350 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 11:06:00 crc kubenswrapper[4702]: I1203 11:06:00.803989 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 03 11:06:00 crc kubenswrapper[4702]: I1203 11:06:00.805453 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 11:06:00 crc kubenswrapper[4702]: I1203 11:06:00.828456 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 03 11:06:00 crc kubenswrapper[4702]: I1203 11:06:00.828844 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bc99dba0-244c-46eb-b4fb-b08679d5431b-var-lock\") pod \"installer-9-crc\" (UID: \"bc99dba0-244c-46eb-b4fb-b08679d5431b\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 11:06:00 crc kubenswrapper[4702]: I1203 11:06:00.828895 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc99dba0-244c-46eb-b4fb-b08679d5431b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"bc99dba0-244c-46eb-b4fb-b08679d5431b\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 11:06:00 crc kubenswrapper[4702]: I1203 11:06:00.828913 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc99dba0-244c-46eb-b4fb-b08679d5431b-kube-api-access\") pod \"installer-9-crc\" (UID: \"bc99dba0-244c-46eb-b4fb-b08679d5431b\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 11:06:00 crc kubenswrapper[4702]: I1203 11:06:00.930236 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bc99dba0-244c-46eb-b4fb-b08679d5431b-var-lock\") pod \"installer-9-crc\" (UID: \"bc99dba0-244c-46eb-b4fb-b08679d5431b\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 11:06:00 crc kubenswrapper[4702]: I1203 11:06:00.930299 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc99dba0-244c-46eb-b4fb-b08679d5431b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"bc99dba0-244c-46eb-b4fb-b08679d5431b\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 11:06:00 crc kubenswrapper[4702]: I1203 11:06:00.930321 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc99dba0-244c-46eb-b4fb-b08679d5431b-kube-api-access\") pod \"installer-9-crc\" (UID: \"bc99dba0-244c-46eb-b4fb-b08679d5431b\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 11:06:00 crc kubenswrapper[4702]: I1203 11:06:00.930898 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bc99dba0-244c-46eb-b4fb-b08679d5431b-var-lock\") pod \"installer-9-crc\" (UID: \"bc99dba0-244c-46eb-b4fb-b08679d5431b\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 11:06:00 crc kubenswrapper[4702]: I1203 11:06:00.930948 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc99dba0-244c-46eb-b4fb-b08679d5431b-kubelet-dir\") pod \"installer-9-crc\" (UID: \"bc99dba0-244c-46eb-b4fb-b08679d5431b\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 11:06:00 crc kubenswrapper[4702]: I1203 11:06:00.963354 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc99dba0-244c-46eb-b4fb-b08679d5431b-kube-api-access\") pod \"installer-9-crc\" (UID: \"bc99dba0-244c-46eb-b4fb-b08679d5431b\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 11:06:01 crc kubenswrapper[4702]: I1203 11:06:01.135219 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 11:06:02 crc kubenswrapper[4702]: E1203 11:06:02.401571 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Dec 03 11:06:02 crc kubenswrapper[4702]: E1203 11:06:02.401785 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fqs27,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-fpqqw_openshift-marketplace(eb339a22-530c-412c-8d5a-8f9c56ab096b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 03 11:06:02 crc kubenswrapper[4702]: E1203 11:06:02.403222 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-fpqqw" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b"
Dec 03 11:06:03 crc kubenswrapper[4702]: E1203 11:06:03.839223 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-fpqqw" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b"
Dec 03 11:06:03 crc kubenswrapper[4702]: E1203 11:06:03.948496 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Dec 03 11:06:03 crc kubenswrapper[4702]: E1203 11:06:03.948838 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-llwlk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-tgvqz_openshift-marketplace(ab09df47-f81e-4b91-aee1-89e919c149ee): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 03 11:06:03 crc kubenswrapper[4702]: E1203 11:06:03.949928 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-tgvqz" podUID="ab09df47-f81e-4b91-aee1-89e919c149ee"
Dec 03 11:06:03 crc kubenswrapper[4702]: E1203 11:06:03.967054 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Dec 03 11:06:03 crc kubenswrapper[4702]: E1203 11:06:03.967214 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fvxt9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-rvlz6_openshift-marketplace(0447b485-fa0f-470b-abb6-da0f406d0f7f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 03 11:06:03 crc kubenswrapper[4702]: E1203 11:06:03.969044 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-rvlz6" podUID="0447b485-fa0f-470b-abb6-da0f406d0f7f"
Dec 03 11:06:04 crc kubenswrapper[4702]: E1203 11:06:04.003318 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Dec 03 11:06:04 crc kubenswrapper[4702]: E1203 11:06:04.004389 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qnt5j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-sn9wb_openshift-marketplace(b806ad42-5c69-4ea6-8d36-fb54595132bf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 03 11:06:04 crc kubenswrapper[4702]: E1203 11:06:04.006599 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-sn9wb" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf"
Dec 03 11:06:04 crc kubenswrapper[4702]: E1203 11:06:04.011917 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Dec 03 11:06:04 crc kubenswrapper[4702]: E1203 11:06:04.012113 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9g8s5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-wv7dh_openshift-marketplace(085dd40d-8d1f-40ce-903b-fbed55010a29): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 03 11:06:04 crc kubenswrapper[4702]: E1203 11:06:04.013868 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-wv7dh" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29"
Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.365047 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 03 11:06:04 crc kubenswrapper[4702]: W1203 11:06:04.400778 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod113264ad_a66d_41d5_95a7_0fd255036eee.slice/crio-986527f20338150ebaecb559f0612fbf9735eaac127bc77e6e0448444f217ecc WatchSource:0}: Error finding container 986527f20338150ebaecb559f0612fbf9735eaac127bc77e6e0448444f217ecc: Status 404 returned error can't find the container with id 986527f20338150ebaecb559f0612fbf9735eaac127bc77e6e0448444f217ecc
Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.415661 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 03 11:06:04 crc kubenswrapper[4702]: W1203 11:06:04.423149 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbc99dba0_244c_46eb_b4fb_b08679d5431b.slice/crio-91086eb5a2078d57314414da08612e7acf017096a87197b6e55e3fb6148c789d WatchSource:0}: Error finding container 91086eb5a2078d57314414da08612e7acf017096a87197b6e55e3fb6148c789d: Status 404 returned error can't find the container with id 91086eb5a2078d57314414da08612e7acf017096a87197b6e55e3fb6148c789d
Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.808657 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hndf6" event={"ID":"05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8","Type":"ContainerStarted","Data":"ccb878a03690ecae62364ca11922e2d74ca8a555b1b3a95db9a68827c0d60357"}
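The sequence above repeats for each marketplace catalog pod: ErrImagePull when the CRI pull is cancelled, then ImagePullBackOff while the kubelet waits out a back-off window before retrying. A generic sketch of that retry shape; pullImage is a hypothetical stand-in for the CRI call, and the back-off values are illustrative, not the kubelet's own defaults:

package main

import (
	"errors"
	"fmt"
	"time"

	"k8s.io/apimachinery/pkg/util/wait"
)

// pullImage stands in for the CRI ImageService pull that fails in the
// records above.
func pullImage(ref string) error {
	return errors.New("rpc error: code = Canceled desc = copying config: context canceled")
}

func main() {
	// Illustrative back-off parameters (assumed, not from the log).
	backoff := wait.Backoff{Duration: 10 * time.Second, Factor: 2.0, Steps: 5}
	err := wait.ExponentialBackoff(backoff, func() (bool, error) {
		if pullImage("registry.redhat.io/redhat/redhat-operator-index:v4.18") != nil {
			return false, nil // not done; retry after the next back-off step
		}
		return true, nil
	})
	fmt.Println("final:", err) // a timeout error once every attempt has failed
}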
event={"ID":"05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8","Type":"ContainerStarted","Data":"ccb878a03690ecae62364ca11922e2d74ca8a555b1b3a95db9a68827c0d60357"} Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.809819 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-hndf6" Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.810820 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.810871 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.812035 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"113264ad-a66d-41d5-95a7-0fd255036eee","Type":"ContainerStarted","Data":"986527f20338150ebaecb559f0612fbf9735eaac127bc77e6e0448444f217ecc"} Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.814776 4702 generic.go:334] "Generic (PLEG): container finished" podID="74352ae3-f7cb-4e98-9f31-d0df7f9a9baf" containerID="56e61dec0186394fe00cdac1f4fdd2f3034e898bdbdf2e287e21a06a402d60f7" exitCode=0 Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.814846 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfx4c" event={"ID":"74352ae3-f7cb-4e98-9f31-d0df7f9a9baf","Type":"ContainerDied","Data":"56e61dec0186394fe00cdac1f4fdd2f3034e898bdbdf2e287e21a06a402d60f7"} Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.817445 4702 generic.go:334] "Generic (PLEG): container finished" podID="68547a12-2568-4a7a-a3e2-fde07134e6ee" containerID="8072c00e5d20150bb5e5c0ea2b0279f89f0827db1abc56425e771cf1892272bb" exitCode=0 Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.817491 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc4kd" event={"ID":"68547a12-2568-4a7a-a3e2-fde07134e6ee","Type":"ContainerDied","Data":"8072c00e5d20150bb5e5c0ea2b0279f89f0827db1abc56425e771cf1892272bb"} Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.820419 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"bc99dba0-244c-46eb-b4fb-b08679d5431b","Type":"ContainerStarted","Data":"91086eb5a2078d57314414da08612e7acf017096a87197b6e55e3fb6148c789d"} Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.825135 4702 generic.go:334] "Generic (PLEG): container finished" podID="aefd671c-4583-4057-aebd-1c8c7931771f" containerID="c4232908694f4e254e44e70b159c95a30f2836d5bf3c2e19c1011053a2b4047e" exitCode=0 Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.825301 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jsndz" event={"ID":"aefd671c-4583-4057-aebd-1c8c7931771f","Type":"ContainerDied","Data":"c4232908694f4e254e44e70b159c95a30f2836d5bf3c2e19c1011053a2b4047e"} Dec 03 11:06:04 crc kubenswrapper[4702]: E1203 11:06:04.828077 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-rvlz6" podUID="0447b485-fa0f-470b-abb6-da0f406d0f7f" Dec 03 11:06:04 crc kubenswrapper[4702]: E1203 11:06:04.831329 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-tgvqz" podUID="ab09df47-f81e-4b91-aee1-89e919c149ee" Dec 03 11:06:04 crc kubenswrapper[4702]: E1203 11:06:04.833180 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-wv7dh" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29" Dec 03 11:06:04 crc kubenswrapper[4702]: E1203 11:06:04.837039 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-sn9wb" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf" Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.889003 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.889091 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.889129 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.889190 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.891789 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.892137 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 
03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.892611 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.902896 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.905775 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.916995 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.917115 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:06:04 crc kubenswrapper[4702]: I1203 11:06:04.920479 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:06:05 crc kubenswrapper[4702]: I1203 11:06:05.087124 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 11:06:05 crc kubenswrapper[4702]: I1203 11:06:05.128038 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 11:06:05 crc kubenswrapper[4702]: I1203 11:06:05.128574 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 11:06:05 crc kubenswrapper[4702]: I1203 11:06:05.128469 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 11:06:05 crc kubenswrapper[4702]: I1203 11:06:05.128743 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 11:06:05 crc kubenswrapper[4702]: I1203 11:06:05.154129 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 11:06:05 crc kubenswrapper[4702]: I1203 11:06:05.168091 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:06:05 crc kubenswrapper[4702]: W1203 11:06:05.554011 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-ef2567b378432a6e78c3d9e326ac0eaff7abb827b8ae4db375e568413229f29c WatchSource:0}: Error finding container ef2567b378432a6e78c3d9e326ac0eaff7abb827b8ae4db375e568413229f29c: Status 404 returned error can't find the container with id ef2567b378432a6e78c3d9e326ac0eaff7abb827b8ae4db375e568413229f29c Dec 03 11:06:05 crc kubenswrapper[4702]: W1203 11:06:05.729737 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-3d886083081e83098618754613eccb4faa452b0196ddde2bc9a9e6cb2f542145 WatchSource:0}: Error finding container 3d886083081e83098618754613eccb4faa452b0196ddde2bc9a9e6cb2f542145: Status 404 returned error can't find the container with id 3d886083081e83098618754613eccb4faa452b0196ddde2bc9a9e6cb2f542145 Dec 03 11:06:05 crc kubenswrapper[4702]: I1203 11:06:05.836605 4702 generic.go:334] "Generic (PLEG): container finished" podID="113264ad-a66d-41d5-95a7-0fd255036eee" containerID="6b37901b70b6e73dbd484af9f5cfb82882cb2b1ebc325a4be047cde7a0ea0797" exitCode=0 Dec 03 11:06:05 crc kubenswrapper[4702]: I1203 11:06:05.836683 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"113264ad-a66d-41d5-95a7-0fd255036eee","Type":"ContainerDied","Data":"6b37901b70b6e73dbd484af9f5cfb82882cb2b1ebc325a4be047cde7a0ea0797"} Dec 03 11:06:05 crc kubenswrapper[4702]: I1203 11:06:05.838618 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ef2567b378432a6e78c3d9e326ac0eaff7abb827b8ae4db375e568413229f29c"} Dec 03 11:06:05 crc kubenswrapper[4702]: I1203 11:06:05.841306 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3d886083081e83098618754613eccb4faa452b0196ddde2bc9a9e6cb2f542145"} Dec 03 11:06:05 crc kubenswrapper[4702]: I1203 11:06:05.843378 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"bc99dba0-244c-46eb-b4fb-b08679d5431b","Type":"ContainerStarted","Data":"e356ba97f9a0df021090dfe9f8ad51b2fc373f71d56e6aa29b57bfbed203334b"} Dec 03 11:06:05 crc kubenswrapper[4702]: I1203 11:06:05.845641 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8b426e15eb14310cf1c766c4ab86eb6ad4bc58bdd1069e15ccaea82880db3665"} Dec 03 11:06:05 crc kubenswrapper[4702]: I1203 11:06:05.846226 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 11:06:05 crc kubenswrapper[4702]: I1203 11:06:05.846278 4702 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 11:06:06 crc kubenswrapper[4702]: I1203 11:06:06.855417 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"01aa6eca930a59045660c0430a904c70adcfbefa9135ed60ca79ad2472ed6b3c"} Dec 03 11:06:06 crc kubenswrapper[4702]: I1203 11:06:06.859358 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"65c3f6c10985ba357a047891cf45b89914b579d633b7d9f977f0017015207691"} Dec 03 11:06:06 crc kubenswrapper[4702]: I1203 11:06:06.861281 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"249d90546b108d33e820a4b7edc8da977f0a54ad7cd5eec5b0648c5137e90a2c"} Dec 03 11:06:06 crc kubenswrapper[4702]: I1203 11:06:06.862246 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 11:06:06 crc kubenswrapper[4702]: I1203 11:06:06.862291 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 11:06:06 crc kubenswrapper[4702]: I1203 11:06:06.879303 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=6.8792767900000005 podStartE2EDuration="6.87927679s" podCreationTimestamp="2025-12-03 11:06:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:06:05.877080263 +0000 UTC m=+149.713008727" watchObservedRunningTime="2025-12-03 11:06:06.87927679 +0000 UTC m=+150.715205254" Dec 03 11:06:07 crc kubenswrapper[4702]: I1203 11:06:07.270954 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 11:06:07 crc kubenswrapper[4702]: I1203 11:06:07.435502 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/113264ad-a66d-41d5-95a7-0fd255036eee-kubelet-dir\") pod \"113264ad-a66d-41d5-95a7-0fd255036eee\" (UID: \"113264ad-a66d-41d5-95a7-0fd255036eee\") " Dec 03 11:06:07 crc kubenswrapper[4702]: I1203 11:06:07.435604 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/113264ad-a66d-41d5-95a7-0fd255036eee-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "113264ad-a66d-41d5-95a7-0fd255036eee" (UID: "113264ad-a66d-41d5-95a7-0fd255036eee"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:06:07 crc kubenswrapper[4702]: I1203 11:06:07.435665 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/113264ad-a66d-41d5-95a7-0fd255036eee-kube-api-access\") pod \"113264ad-a66d-41d5-95a7-0fd255036eee\" (UID: \"113264ad-a66d-41d5-95a7-0fd255036eee\") " Dec 03 11:06:07 crc kubenswrapper[4702]: I1203 11:06:07.436001 4702 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/113264ad-a66d-41d5-95a7-0fd255036eee-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 11:06:07 crc kubenswrapper[4702]: I1203 11:06:07.451095 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/113264ad-a66d-41d5-95a7-0fd255036eee-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "113264ad-a66d-41d5-95a7-0fd255036eee" (UID: "113264ad-a66d-41d5-95a7-0fd255036eee"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:06:07 crc kubenswrapper[4702]: I1203 11:06:07.537991 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/113264ad-a66d-41d5-95a7-0fd255036eee-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 11:06:07 crc kubenswrapper[4702]: I1203 11:06:07.868967 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfx4c" event={"ID":"74352ae3-f7cb-4e98-9f31-d0df7f9a9baf","Type":"ContainerStarted","Data":"51f21c46ed75a4b250d5f3e1591c10113b9ec5273836be335cf974fe879028e7"} Dec 03 11:06:07 crc kubenswrapper[4702]: I1203 11:06:07.875815 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"113264ad-a66d-41d5-95a7-0fd255036eee","Type":"ContainerDied","Data":"986527f20338150ebaecb559f0612fbf9735eaac127bc77e6e0448444f217ecc"} Dec 03 11:06:07 crc kubenswrapper[4702]: I1203 11:06:07.875881 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="986527f20338150ebaecb559f0612fbf9735eaac127bc77e6e0448444f217ecc" Dec 03 11:06:07 crc kubenswrapper[4702]: I1203 11:06:07.876093 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:06:07 crc kubenswrapper[4702]: I1203 11:06:07.876876 4702 util.go:48] "No ready sandbox for pod can be found. 
Dec 03 11:06:07 crc kubenswrapper[4702]: I1203 11:06:07.896750 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hfx4c" podStartSLOduration=3.272833306 podStartE2EDuration="52.896714686s" podCreationTimestamp="2025-12-03 11:05:15 +0000 UTC" firstStartedPulling="2025-12-03 11:05:17.590010671 +0000 UTC m=+101.425939135" lastFinishedPulling="2025-12-03 11:06:07.213892051 +0000 UTC m=+151.049820515" observedRunningTime="2025-12-03 11:06:07.89162453 +0000 UTC m=+151.727553014" watchObservedRunningTime="2025-12-03 11:06:07.896714686 +0000 UTC m=+151.732643150"
Dec 03 11:06:08 crc kubenswrapper[4702]: I1203 11:06:08.885718 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc4kd" event={"ID":"68547a12-2568-4a7a-a3e2-fde07134e6ee","Type":"ContainerStarted","Data":"5451e0e5f6514e74fed5766fd087aa6175feeb1ef6c3e7c9356976845d0bf38b"}
Dec 03 11:06:09 crc kubenswrapper[4702]: I1203 11:06:09.920337 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mc4kd" podStartSLOduration=4.074648422 podStartE2EDuration="56.920318556s" podCreationTimestamp="2025-12-03 11:05:13 +0000 UTC" firstStartedPulling="2025-12-03 11:05:15.418943943 +0000 UTC m=+99.254872407" lastFinishedPulling="2025-12-03 11:06:08.264614077 +0000 UTC m=+152.100542541" observedRunningTime="2025-12-03 11:06:09.917300776 +0000 UTC m=+153.753229230" watchObservedRunningTime="2025-12-03 11:06:09.920318556 +0000 UTC m=+153.756247020"
Dec 03 11:06:10 crc kubenswrapper[4702]: I1203 11:06:10.904001 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jsndz" event={"ID":"aefd671c-4583-4057-aebd-1c8c7931771f","Type":"ContainerStarted","Data":"eaba7e4210f58e873b63b66d9792fa6a77ebc20733ed7101fe6c3b1c1a89b510"}
Dec 03 11:06:10 crc kubenswrapper[4702]: I1203 11:06:10.938449 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jsndz" podStartSLOduration=4.899612875 podStartE2EDuration="58.938417699s" podCreationTimestamp="2025-12-03 11:05:12 +0000 UTC" firstStartedPulling="2025-12-03 11:05:15.376315449 +0000 UTC m=+99.212243913" lastFinishedPulling="2025-12-03 11:06:09.415120273 +0000 UTC m=+153.251048737" observedRunningTime="2025-12-03 11:06:10.935939863 +0000 UTC m=+154.771868357" watchObservedRunningTime="2025-12-03 11:06:10.938417699 +0000 UTC m=+154.774346163"
Dec 03 11:06:13 crc kubenswrapper[4702]: I1203 11:06:13.215270 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jsndz"
Dec 03 11:06:13 crc kubenswrapper[4702]: I1203 11:06:13.215798 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jsndz"
Dec 03 11:06:13 crc kubenswrapper[4702]: I1203 11:06:13.706230 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mc4kd"
Dec 03 11:06:13 crc kubenswrapper[4702]: I1203 11:06:13.707378 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mc4kd"
Dec 03 11:06:14 crc kubenswrapper[4702]: I1203 11:06:14.671794 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jsndz"
Dec 03 11:06:14 crc kubenswrapper[4702]: I1203 11:06:14.673785 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mc4kd"
Dec 03 11:06:14 crc kubenswrapper[4702]: I1203 11:06:14.722037 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mc4kd"
Dec 03 11:06:15 crc kubenswrapper[4702]: I1203 11:06:15.128468 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Dec 03 11:06:15 crc kubenswrapper[4702]: I1203 11:06:15.128562 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Dec 03 11:06:15 crc kubenswrapper[4702]: I1203 11:06:15.128664 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Dec 03 11:06:15 crc kubenswrapper[4702]: I1203 11:06:15.128779 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Dec 03 11:06:15 crc kubenswrapper[4702]: I1203 11:06:15.899736 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hfx4c"
Dec 03 11:06:15 crc kubenswrapper[4702]: I1203 11:06:15.899891 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hfx4c"
Dec 03 11:06:16 crc kubenswrapper[4702]: I1203 11:06:16.173103 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hfx4c"
Dec 03 11:06:16 crc kubenswrapper[4702]: I1203 11:06:16.229170 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hfx4c"
Dec 03 11:06:16 crc kubenswrapper[4702]: I1203 11:06:16.478539 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mc4kd"]
Dec 03 11:06:16 crc kubenswrapper[4702]: I1203 11:06:16.479792 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mc4kd" podUID="68547a12-2568-4a7a-a3e2-fde07134e6ee" containerName="registry-server" containerID="cri-o://5451e0e5f6514e74fed5766fd087aa6175feeb1ef6c3e7c9356976845d0bf38b" gracePeriod=2
Dec 03 11:06:17 crc kubenswrapper[4702]: I1203 11:06:17.881161 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfx4c"]
Dec 03 11:06:17 crc kubenswrapper[4702]: I1203 11:06:17.966866 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hfx4c" podUID="74352ae3-f7cb-4e98-9f31-d0df7f9a9baf" containerName="registry-server" containerID="cri-o://51f21c46ed75a4b250d5f3e1591c10113b9ec5273836be335cf974fe879028e7" gracePeriod=2
containerName="registry-server" containerID="cri-o://51f21c46ed75a4b250d5f3e1591c10113b9ec5273836be335cf974fe879028e7" gracePeriod=2 Dec 03 11:06:18 crc kubenswrapper[4702]: I1203 11:06:18.991374 4702 generic.go:334] "Generic (PLEG): container finished" podID="68547a12-2568-4a7a-a3e2-fde07134e6ee" containerID="5451e0e5f6514e74fed5766fd087aa6175feeb1ef6c3e7c9356976845d0bf38b" exitCode=0 Dec 03 11:06:18 crc kubenswrapper[4702]: I1203 11:06:18.991462 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc4kd" event={"ID":"68547a12-2568-4a7a-a3e2-fde07134e6ee","Type":"ContainerDied","Data":"5451e0e5f6514e74fed5766fd087aa6175feeb1ef6c3e7c9356976845d0bf38b"} Dec 03 11:06:18 crc kubenswrapper[4702]: I1203 11:06:18.993842 4702 generic.go:334] "Generic (PLEG): container finished" podID="74352ae3-f7cb-4e98-9f31-d0df7f9a9baf" containerID="51f21c46ed75a4b250d5f3e1591c10113b9ec5273836be335cf974fe879028e7" exitCode=0 Dec 03 11:06:18 crc kubenswrapper[4702]: I1203 11:06:18.993874 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfx4c" event={"ID":"74352ae3-f7cb-4e98-9f31-d0df7f9a9baf","Type":"ContainerDied","Data":"51f21c46ed75a4b250d5f3e1591c10113b9ec5273836be335cf974fe879028e7"} Dec 03 11:06:19 crc kubenswrapper[4702]: I1203 11:06:19.298894 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hfx4c" Dec 03 11:06:19 crc kubenswrapper[4702]: I1203 11:06:19.426557 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74352ae3-f7cb-4e98-9f31-d0df7f9a9baf-utilities\") pod \"74352ae3-f7cb-4e98-9f31-d0df7f9a9baf\" (UID: \"74352ae3-f7cb-4e98-9f31-d0df7f9a9baf\") " Dec 03 11:06:19 crc kubenswrapper[4702]: I1203 11:06:19.426662 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpgjk\" (UniqueName: \"kubernetes.io/projected/74352ae3-f7cb-4e98-9f31-d0df7f9a9baf-kube-api-access-dpgjk\") pod \"74352ae3-f7cb-4e98-9f31-d0df7f9a9baf\" (UID: \"74352ae3-f7cb-4e98-9f31-d0df7f9a9baf\") " Dec 03 11:06:19 crc kubenswrapper[4702]: I1203 11:06:19.426857 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74352ae3-f7cb-4e98-9f31-d0df7f9a9baf-catalog-content\") pod \"74352ae3-f7cb-4e98-9f31-d0df7f9a9baf\" (UID: \"74352ae3-f7cb-4e98-9f31-d0df7f9a9baf\") " Dec 03 11:06:19 crc kubenswrapper[4702]: I1203 11:06:19.429355 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74352ae3-f7cb-4e98-9f31-d0df7f9a9baf-utilities" (OuterVolumeSpecName: "utilities") pod "74352ae3-f7cb-4e98-9f31-d0df7f9a9baf" (UID: "74352ae3-f7cb-4e98-9f31-d0df7f9a9baf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:06:19 crc kubenswrapper[4702]: I1203 11:06:19.437131 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74352ae3-f7cb-4e98-9f31-d0df7f9a9baf-kube-api-access-dpgjk" (OuterVolumeSpecName: "kube-api-access-dpgjk") pod "74352ae3-f7cb-4e98-9f31-d0df7f9a9baf" (UID: "74352ae3-f7cb-4e98-9f31-d0df7f9a9baf"). InnerVolumeSpecName "kube-api-access-dpgjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:06:19 crc kubenswrapper[4702]: I1203 11:06:19.450048 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74352ae3-f7cb-4e98-9f31-d0df7f9a9baf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74352ae3-f7cb-4e98-9f31-d0df7f9a9baf" (UID: "74352ae3-f7cb-4e98-9f31-d0df7f9a9baf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:06:19 crc kubenswrapper[4702]: I1203 11:06:19.529135 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74352ae3-f7cb-4e98-9f31-d0df7f9a9baf-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:06:19 crc kubenswrapper[4702]: I1203 11:06:19.529176 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpgjk\" (UniqueName: \"kubernetes.io/projected/74352ae3-f7cb-4e98-9f31-d0df7f9a9baf-kube-api-access-dpgjk\") on node \"crc\" DevicePath \"\"" Dec 03 11:06:19 crc kubenswrapper[4702]: I1203 11:06:19.529190 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74352ae3-f7cb-4e98-9f31-d0df7f9a9baf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:06:19 crc kubenswrapper[4702]: I1203 11:06:19.680160 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mc4kd" Dec 03 11:06:19 crc kubenswrapper[4702]: I1203 11:06:19.833154 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2swn\" (UniqueName: \"kubernetes.io/projected/68547a12-2568-4a7a-a3e2-fde07134e6ee-kube-api-access-m2swn\") pod \"68547a12-2568-4a7a-a3e2-fde07134e6ee\" (UID: \"68547a12-2568-4a7a-a3e2-fde07134e6ee\") " Dec 03 11:06:19 crc kubenswrapper[4702]: I1203 11:06:19.833369 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68547a12-2568-4a7a-a3e2-fde07134e6ee-catalog-content\") pod \"68547a12-2568-4a7a-a3e2-fde07134e6ee\" (UID: \"68547a12-2568-4a7a-a3e2-fde07134e6ee\") " Dec 03 11:06:19 crc kubenswrapper[4702]: I1203 11:06:19.833452 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68547a12-2568-4a7a-a3e2-fde07134e6ee-utilities\") pod \"68547a12-2568-4a7a-a3e2-fde07134e6ee\" (UID: \"68547a12-2568-4a7a-a3e2-fde07134e6ee\") " Dec 03 11:06:19 crc kubenswrapper[4702]: I1203 11:06:19.834477 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68547a12-2568-4a7a-a3e2-fde07134e6ee-utilities" (OuterVolumeSpecName: "utilities") pod "68547a12-2568-4a7a-a3e2-fde07134e6ee" (UID: "68547a12-2568-4a7a-a3e2-fde07134e6ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:06:19 crc kubenswrapper[4702]: I1203 11:06:19.842539 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68547a12-2568-4a7a-a3e2-fde07134e6ee-kube-api-access-m2swn" (OuterVolumeSpecName: "kube-api-access-m2swn") pod "68547a12-2568-4a7a-a3e2-fde07134e6ee" (UID: "68547a12-2568-4a7a-a3e2-fde07134e6ee"). InnerVolumeSpecName "kube-api-access-m2swn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:06:19 crc kubenswrapper[4702]: I1203 11:06:19.898498 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68547a12-2568-4a7a-a3e2-fde07134e6ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68547a12-2568-4a7a-a3e2-fde07134e6ee" (UID: "68547a12-2568-4a7a-a3e2-fde07134e6ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:06:19 crc kubenswrapper[4702]: I1203 11:06:19.938101 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68547a12-2568-4a7a-a3e2-fde07134e6ee-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:06:19 crc kubenswrapper[4702]: I1203 11:06:19.938256 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2swn\" (UniqueName: \"kubernetes.io/projected/68547a12-2568-4a7a-a3e2-fde07134e6ee-kube-api-access-m2swn\") on node \"crc\" DevicePath \"\"" Dec 03 11:06:19 crc kubenswrapper[4702]: I1203 11:06:19.938353 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68547a12-2568-4a7a-a3e2-fde07134e6ee-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:06:20 crc kubenswrapper[4702]: I1203 11:06:20.004183 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mc4kd" event={"ID":"68547a12-2568-4a7a-a3e2-fde07134e6ee","Type":"ContainerDied","Data":"111d5eb2d539cee87cb63c27856ef9b1ecc90069a76fd4ab8fcb17a8055494ac"} Dec 03 11:06:20 crc kubenswrapper[4702]: I1203 11:06:20.004244 4702 scope.go:117] "RemoveContainer" containerID="5451e0e5f6514e74fed5766fd087aa6175feeb1ef6c3e7c9356976845d0bf38b" Dec 03 11:06:20 crc kubenswrapper[4702]: I1203 11:06:20.004366 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mc4kd" Dec 03 11:06:20 crc kubenswrapper[4702]: I1203 11:06:20.009099 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpqqw" event={"ID":"eb339a22-530c-412c-8d5a-8f9c56ab096b","Type":"ContainerStarted","Data":"b443e21f51852a32594c3c8931400625a6d6890bfbcde1e322af6494d9c3e78d"} Dec 03 11:06:20 crc kubenswrapper[4702]: I1203 11:06:20.015963 4702 generic.go:334] "Generic (PLEG): container finished" podID="0447b485-fa0f-470b-abb6-da0f406d0f7f" containerID="36b0bbfac2157e64e634410d7b5e02ebfe13b7288fc8ac3ecee8943520740602" exitCode=0 Dec 03 11:06:20 crc kubenswrapper[4702]: I1203 11:06:20.016267 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvlz6" event={"ID":"0447b485-fa0f-470b-abb6-da0f406d0f7f","Type":"ContainerDied","Data":"36b0bbfac2157e64e634410d7b5e02ebfe13b7288fc8ac3ecee8943520740602"} Dec 03 11:06:20 crc kubenswrapper[4702]: I1203 11:06:20.018971 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgvqz" event={"ID":"ab09df47-f81e-4b91-aee1-89e919c149ee","Type":"ContainerStarted","Data":"15ab0124bfebc849851d03d8d94ebab6a9f37b3c4243ec8d7fe78ef09df344e1"} Dec 03 11:06:20 crc kubenswrapper[4702]: I1203 11:06:20.022738 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hfx4c" event={"ID":"74352ae3-f7cb-4e98-9f31-d0df7f9a9baf","Type":"ContainerDied","Data":"53ce064e8b06f0a56b5a91f99c97fced35170aeb9056905a681968deedf43cfd"} Dec 03 11:06:20 crc kubenswrapper[4702]: I1203 11:06:20.022896 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hfx4c" Dec 03 11:06:20 crc kubenswrapper[4702]: I1203 11:06:20.036749 4702 scope.go:117] "RemoveContainer" containerID="8072c00e5d20150bb5e5c0ea2b0279f89f0827db1abc56425e771cf1892272bb" Dec 03 11:06:20 crc kubenswrapper[4702]: I1203 11:06:20.058405 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mc4kd"] Dec 03 11:06:20 crc kubenswrapper[4702]: I1203 11:06:20.066669 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mc4kd"] Dec 03 11:06:20 crc kubenswrapper[4702]: I1203 11:06:20.067731 4702 scope.go:117] "RemoveContainer" containerID="de0b4fa7b6ad4afd5a84bedbba55e8a802f45ec9a27be098af7c6b8f685cf8b8" Dec 03 11:06:20 crc kubenswrapper[4702]: I1203 11:06:20.094571 4702 scope.go:117] "RemoveContainer" containerID="51f21c46ed75a4b250d5f3e1591c10113b9ec5273836be335cf974fe879028e7" Dec 03 11:06:20 crc kubenswrapper[4702]: I1203 11:06:20.256172 4702 scope.go:117] "RemoveContainer" containerID="56e61dec0186394fe00cdac1f4fdd2f3034e898bdbdf2e287e21a06a402d60f7" Dec 03 11:06:20 crc kubenswrapper[4702]: I1203 11:06:20.256485 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfx4c"] Dec 03 11:06:20 crc kubenswrapper[4702]: I1203 11:06:20.259600 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hfx4c"] Dec 03 11:06:20 crc kubenswrapper[4702]: I1203 11:06:20.335265 4702 scope.go:117] "RemoveContainer" containerID="446777d93e94054be046cbfd4589a5862c471836c0610ae4c70f2543dbfce409" Dec 03 11:06:20 crc kubenswrapper[4702]: I1203 11:06:20.948374 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="68547a12-2568-4a7a-a3e2-fde07134e6ee" path="/var/lib/kubelet/pods/68547a12-2568-4a7a-a3e2-fde07134e6ee/volumes" Dec 03 11:06:20 crc kubenswrapper[4702]: I1203 11:06:20.949146 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74352ae3-f7cb-4e98-9f31-d0df7f9a9baf" path="/var/lib/kubelet/pods/74352ae3-f7cb-4e98-9f31-d0df7f9a9baf/volumes" Dec 03 11:06:21 crc kubenswrapper[4702]: I1203 11:06:21.037255 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvlz6" event={"ID":"0447b485-fa0f-470b-abb6-da0f406d0f7f","Type":"ContainerStarted","Data":"81ebee0789e812500222883ba92d4c81c7ad73375a3efff52cd8378f395bcd91"} Dec 03 11:06:21 crc kubenswrapper[4702]: I1203 11:06:21.040506 4702 generic.go:334] "Generic (PLEG): container finished" podID="ab09df47-f81e-4b91-aee1-89e919c149ee" containerID="15ab0124bfebc849851d03d8d94ebab6a9f37b3c4243ec8d7fe78ef09df344e1" exitCode=0 Dec 03 11:06:21 crc kubenswrapper[4702]: I1203 11:06:21.040592 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgvqz" event={"ID":"ab09df47-f81e-4b91-aee1-89e919c149ee","Type":"ContainerDied","Data":"15ab0124bfebc849851d03d8d94ebab6a9f37b3c4243ec8d7fe78ef09df344e1"} Dec 03 11:06:21 crc kubenswrapper[4702]: I1203 11:06:21.046235 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wv7dh" event={"ID":"085dd40d-8d1f-40ce-903b-fbed55010a29","Type":"ContainerStarted","Data":"08088883276a26c81a101615c4ffffdc93de4c43ea048e2a6fa4acbf4e1ba040"} Dec 03 11:06:22 crc kubenswrapper[4702]: I1203 11:06:22.126776 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn9wb" event={"ID":"b806ad42-5c69-4ea6-8d36-fb54595132bf","Type":"ContainerStarted","Data":"6bd5e1d1b00758d7acf2f8a94b744021ab7d8dc294935b0f56084827eb25ea73"} Dec 03 11:06:22 crc kubenswrapper[4702]: I1203 11:06:22.198047 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rvlz6" podStartSLOduration=5.262506039 podStartE2EDuration="1m9.198015235s" podCreationTimestamp="2025-12-03 11:05:13 +0000 UTC" firstStartedPulling="2025-12-03 11:05:16.540307782 +0000 UTC m=+100.376236246" lastFinishedPulling="2025-12-03 11:06:20.475816978 +0000 UTC m=+164.311745442" observedRunningTime="2025-12-03 11:06:22.196834653 +0000 UTC m=+166.032763117" watchObservedRunningTime="2025-12-03 11:06:22.198015235 +0000 UTC m=+166.033943689" Dec 03 11:06:23 crc kubenswrapper[4702]: I1203 11:06:23.295723 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jsndz" Dec 03 11:06:23 crc kubenswrapper[4702]: I1203 11:06:23.901226 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rvlz6" Dec 03 11:06:23 crc kubenswrapper[4702]: I1203 11:06:23.901325 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rvlz6" Dec 03 11:06:24 crc kubenswrapper[4702]: I1203 11:06:24.197388 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgvqz" event={"ID":"ab09df47-f81e-4b91-aee1-89e919c149ee","Type":"ContainerStarted","Data":"cf9764f585379a0ca8d1e33531d673196cd28a77ff8f9a5cac941629860b06dd"} Dec 03 11:06:25 crc kubenswrapper[4702]: I1203 11:06:25.069314 4702 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/certified-operators-rvlz6" podUID="0447b485-fa0f-470b-abb6-da0f406d0f7f" containerName="registry-server" probeResult="failure" output=< Dec 03 11:06:25 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 11:06:25 crc kubenswrapper[4702]: > Dec 03 11:06:25 crc kubenswrapper[4702]: I1203 11:06:25.145081 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-hndf6" Dec 03 11:06:25 crc kubenswrapper[4702]: I1203 11:06:25.205806 4702 generic.go:334] "Generic (PLEG): container finished" podID="085dd40d-8d1f-40ce-903b-fbed55010a29" containerID="08088883276a26c81a101615c4ffffdc93de4c43ea048e2a6fa4acbf4e1ba040" exitCode=0 Dec 03 11:06:25 crc kubenswrapper[4702]: I1203 11:06:25.205903 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wv7dh" event={"ID":"085dd40d-8d1f-40ce-903b-fbed55010a29","Type":"ContainerDied","Data":"08088883276a26c81a101615c4ffffdc93de4c43ea048e2a6fa4acbf4e1ba040"} Dec 03 11:06:25 crc kubenswrapper[4702]: I1203 11:06:25.210548 4702 generic.go:334] "Generic (PLEG): container finished" podID="eb339a22-530c-412c-8d5a-8f9c56ab096b" containerID="b443e21f51852a32594c3c8931400625a6d6890bfbcde1e322af6494d9c3e78d" exitCode=0 Dec 03 11:06:25 crc kubenswrapper[4702]: I1203 11:06:25.210818 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpqqw" event={"ID":"eb339a22-530c-412c-8d5a-8f9c56ab096b","Type":"ContainerDied","Data":"b443e21f51852a32594c3c8931400625a6d6890bfbcde1e322af6494d9c3e78d"} Dec 03 11:06:25 crc kubenswrapper[4702]: I1203 11:06:25.266957 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tgvqz" podStartSLOduration=6.339055501 podStartE2EDuration="1m10.266927456s" podCreationTimestamp="2025-12-03 11:05:15 +0000 UTC" firstStartedPulling="2025-12-03 11:05:17.637841706 +0000 UTC m=+101.473770170" lastFinishedPulling="2025-12-03 11:06:21.565713661 +0000 UTC m=+165.401642125" observedRunningTime="2025-12-03 11:06:25.265554999 +0000 UTC m=+169.101483473" watchObservedRunningTime="2025-12-03 11:06:25.266927456 +0000 UTC m=+169.102855930" Dec 03 11:06:25 crc kubenswrapper[4702]: I1203 11:06:25.525285 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tgvqz" Dec 03 11:06:25 crc kubenswrapper[4702]: I1203 11:06:25.525383 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tgvqz" Dec 03 11:06:25 crc kubenswrapper[4702]: I1203 11:06:25.908020 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:06:25 crc kubenswrapper[4702]: I1203 11:06:25.908115 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:06:26 crc kubenswrapper[4702]: I1203 11:06:26.575454 4702 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-marketplace-tgvqz" podUID="ab09df47-f81e-4b91-aee1-89e919c149ee" containerName="registry-server" probeResult="failure" output=< Dec 03 11:06:26 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 11:06:26 crc kubenswrapper[4702]: > Dec 03 11:06:28 crc kubenswrapper[4702]: I1203 11:06:28.238924 4702 generic.go:334] "Generic (PLEG): container finished" podID="b806ad42-5c69-4ea6-8d36-fb54595132bf" containerID="6bd5e1d1b00758d7acf2f8a94b744021ab7d8dc294935b0f56084827eb25ea73" exitCode=0 Dec 03 11:06:28 crc kubenswrapper[4702]: I1203 11:06:28.239008 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn9wb" event={"ID":"b806ad42-5c69-4ea6-8d36-fb54595132bf","Type":"ContainerDied","Data":"6bd5e1d1b00758d7acf2f8a94b744021ab7d8dc294935b0f56084827eb25ea73"} Dec 03 11:06:31 crc kubenswrapper[4702]: I1203 11:06:31.261977 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wv7dh" event={"ID":"085dd40d-8d1f-40ce-903b-fbed55010a29","Type":"ContainerStarted","Data":"1ad807b9e0da9076b57731a76e4692aac411f464302750b70ab80c0fa3ea8840"} Dec 03 11:06:32 crc kubenswrapper[4702]: I1203 11:06:32.292046 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wv7dh" podStartSLOduration=5.139496727 podStartE2EDuration="1m19.292013573s" podCreationTimestamp="2025-12-03 11:05:13 +0000 UTC" firstStartedPulling="2025-12-03 11:05:15.44310719 +0000 UTC m=+99.279035654" lastFinishedPulling="2025-12-03 11:06:29.595624046 +0000 UTC m=+173.431552500" observedRunningTime="2025-12-03 11:06:32.289533556 +0000 UTC m=+176.125462020" watchObservedRunningTime="2025-12-03 11:06:32.292013573 +0000 UTC m=+176.127942027" Dec 03 11:06:33 crc kubenswrapper[4702]: I1203 11:06:33.475081 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wv7dh" Dec 03 11:06:33 crc kubenswrapper[4702]: I1203 11:06:33.475159 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wv7dh" Dec 03 11:06:33 crc kubenswrapper[4702]: I1203 11:06:33.514011 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wv7dh" Dec 03 11:06:33 crc kubenswrapper[4702]: I1203 11:06:33.924125 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rvlz6" Dec 03 11:06:33 crc kubenswrapper[4702]: I1203 11:06:33.974703 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rvlz6" Dec 03 11:06:35 crc kubenswrapper[4702]: I1203 11:06:35.572570 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tgvqz" Dec 03 11:06:35 crc kubenswrapper[4702]: I1203 11:06:35.622587 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tgvqz" Dec 03 11:06:36 crc kubenswrapper[4702]: I1203 11:06:36.277279 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rvlz6"] Dec 03 11:06:36 crc kubenswrapper[4702]: I1203 11:06:36.277963 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rvlz6" 
podUID="0447b485-fa0f-470b-abb6-da0f406d0f7f" containerName="registry-server" containerID="cri-o://81ebee0789e812500222883ba92d4c81c7ad73375a3efff52cd8378f395bcd91" gracePeriod=2 Dec 03 11:06:37 crc kubenswrapper[4702]: I1203 11:06:37.304609 4702 generic.go:334] "Generic (PLEG): container finished" podID="0447b485-fa0f-470b-abb6-da0f406d0f7f" containerID="81ebee0789e812500222883ba92d4c81c7ad73375a3efff52cd8378f395bcd91" exitCode=0 Dec 03 11:06:37 crc kubenswrapper[4702]: I1203 11:06:37.304668 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvlz6" event={"ID":"0447b485-fa0f-470b-abb6-da0f406d0f7f","Type":"ContainerDied","Data":"81ebee0789e812500222883ba92d4c81c7ad73375a3efff52cd8378f395bcd91"} Dec 03 11:06:38 crc kubenswrapper[4702]: I1203 11:06:38.319699 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpqqw" event={"ID":"eb339a22-530c-412c-8d5a-8f9c56ab096b","Type":"ContainerStarted","Data":"7a5e4713e65a9c6940743c60ee58adda56cd4a749f3a260a88d03b9a44ac24e2"} Dec 03 11:06:39 crc kubenswrapper[4702]: I1203 11:06:39.149607 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rvlz6" Dec 03 11:06:39 crc kubenswrapper[4702]: I1203 11:06:39.158700 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0447b485-fa0f-470b-abb6-da0f406d0f7f-catalog-content\") pod \"0447b485-fa0f-470b-abb6-da0f406d0f7f\" (UID: \"0447b485-fa0f-470b-abb6-da0f406d0f7f\") " Dec 03 11:06:39 crc kubenswrapper[4702]: I1203 11:06:39.158854 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0447b485-fa0f-470b-abb6-da0f406d0f7f-utilities\") pod \"0447b485-fa0f-470b-abb6-da0f406d0f7f\" (UID: \"0447b485-fa0f-470b-abb6-da0f406d0f7f\") " Dec 03 11:06:39 crc kubenswrapper[4702]: I1203 11:06:39.158919 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvxt9\" (UniqueName: \"kubernetes.io/projected/0447b485-fa0f-470b-abb6-da0f406d0f7f-kube-api-access-fvxt9\") pod \"0447b485-fa0f-470b-abb6-da0f406d0f7f\" (UID: \"0447b485-fa0f-470b-abb6-da0f406d0f7f\") " Dec 03 11:06:39 crc kubenswrapper[4702]: I1203 11:06:39.162327 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0447b485-fa0f-470b-abb6-da0f406d0f7f-utilities" (OuterVolumeSpecName: "utilities") pod "0447b485-fa0f-470b-abb6-da0f406d0f7f" (UID: "0447b485-fa0f-470b-abb6-da0f406d0f7f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:06:39 crc kubenswrapper[4702]: I1203 11:06:39.167040 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0447b485-fa0f-470b-abb6-da0f406d0f7f-kube-api-access-fvxt9" (OuterVolumeSpecName: "kube-api-access-fvxt9") pod "0447b485-fa0f-470b-abb6-da0f406d0f7f" (UID: "0447b485-fa0f-470b-abb6-da0f406d0f7f"). InnerVolumeSpecName "kube-api-access-fvxt9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:06:39 crc kubenswrapper[4702]: I1203 11:06:39.209643 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0447b485-fa0f-470b-abb6-da0f406d0f7f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0447b485-fa0f-470b-abb6-da0f406d0f7f" (UID: "0447b485-fa0f-470b-abb6-da0f406d0f7f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:06:39 crc kubenswrapper[4702]: I1203 11:06:39.262112 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0447b485-fa0f-470b-abb6-da0f406d0f7f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:06:39 crc kubenswrapper[4702]: I1203 11:06:39.262171 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0447b485-fa0f-470b-abb6-da0f406d0f7f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:06:39 crc kubenswrapper[4702]: I1203 11:06:39.262191 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvxt9\" (UniqueName: \"kubernetes.io/projected/0447b485-fa0f-470b-abb6-da0f406d0f7f-kube-api-access-fvxt9\") on node \"crc\" DevicePath \"\"" Dec 03 11:06:39 crc kubenswrapper[4702]: I1203 11:06:39.328827 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvlz6" event={"ID":"0447b485-fa0f-470b-abb6-da0f406d0f7f","Type":"ContainerDied","Data":"3334c1d9605018f202e9fe8c8560a079532b11ecd1985583373ed5c969f7dc37"} Dec 03 11:06:39 crc kubenswrapper[4702]: I1203 11:06:39.328945 4702 scope.go:117] "RemoveContainer" containerID="81ebee0789e812500222883ba92d4c81c7ad73375a3efff52cd8378f395bcd91" Dec 03 11:06:39 crc kubenswrapper[4702]: I1203 11:06:39.328965 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rvlz6" Dec 03 11:06:39 crc kubenswrapper[4702]: I1203 11:06:39.361418 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fpqqw" podStartSLOduration=5.307095231 podStartE2EDuration="1m23.361373643s" podCreationTimestamp="2025-12-03 11:05:16 +0000 UTC" firstStartedPulling="2025-12-03 11:05:18.736208322 +0000 UTC m=+102.572136786" lastFinishedPulling="2025-12-03 11:06:36.790486734 +0000 UTC m=+180.626415198" observedRunningTime="2025-12-03 11:06:39.361052274 +0000 UTC m=+183.196980738" watchObservedRunningTime="2025-12-03 11:06:39.361373643 +0000 UTC m=+183.197302117" Dec 03 11:06:39 crc kubenswrapper[4702]: I1203 11:06:39.379313 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rvlz6"] Dec 03 11:06:39 crc kubenswrapper[4702]: I1203 11:06:39.384900 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rvlz6"] Dec 03 11:06:39 crc kubenswrapper[4702]: I1203 11:06:39.438660 4702 scope.go:117] "RemoveContainer" containerID="36b0bbfac2157e64e634410d7b5e02ebfe13b7288fc8ac3ecee8943520740602" Dec 03 11:06:39 crc kubenswrapper[4702]: I1203 11:06:39.464171 4702 scope.go:117] "RemoveContainer" containerID="fe93cc5fb352da55f03be118476e73f7002395876016d2234a843826b2409df2" Dec 03 11:06:40 crc kubenswrapper[4702]: I1203 11:06:40.936857 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0447b485-fa0f-470b-abb6-da0f406d0f7f" path="/var/lib/kubelet/pods/0447b485-fa0f-470b-abb6-da0f406d0f7f/volumes" Dec 03 11:06:41 crc kubenswrapper[4702]: I1203 11:06:41.355557 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn9wb" event={"ID":"b806ad42-5c69-4ea6-8d36-fb54595132bf","Type":"ContainerStarted","Data":"108d1ece4e32d9094b71e3763d5974cb6d0e5ca2efaf4cbdc0115196dac90d30"} Dec 03 11:06:41 crc kubenswrapper[4702]: I1203 11:06:41.383790 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sn9wb" podStartSLOduration=3.540522427 podStartE2EDuration="1m25.38374325s" podCreationTimestamp="2025-12-03 11:05:16 +0000 UTC" firstStartedPulling="2025-12-03 11:05:17.595114859 +0000 UTC m=+101.431043323" lastFinishedPulling="2025-12-03 11:06:39.438335682 +0000 UTC m=+183.274264146" observedRunningTime="2025-12-03 11:06:41.380514274 +0000 UTC m=+185.216442738" watchObservedRunningTime="2025-12-03 11:06:41.38374325 +0000 UTC m=+185.219671714" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.645908 4702 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 11:06:42 crc kubenswrapper[4702]: E1203 11:06:42.646368 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68547a12-2568-4a7a-a3e2-fde07134e6ee" containerName="registry-server" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.646390 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="68547a12-2568-4a7a-a3e2-fde07134e6ee" containerName="registry-server" Dec 03 11:06:42 crc kubenswrapper[4702]: E1203 11:06:42.646404 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0447b485-fa0f-470b-abb6-da0f406d0f7f" containerName="registry-server" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.646412 4702 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0447b485-fa0f-470b-abb6-da0f406d0f7f" containerName="registry-server" Dec 03 11:06:42 crc kubenswrapper[4702]: E1203 11:06:42.646429 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0447b485-fa0f-470b-abb6-da0f406d0f7f" containerName="extract-utilities" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.646437 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="0447b485-fa0f-470b-abb6-da0f406d0f7f" containerName="extract-utilities" Dec 03 11:06:42 crc kubenswrapper[4702]: E1203 11:06:42.646446 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74352ae3-f7cb-4e98-9f31-d0df7f9a9baf" containerName="extract-utilities" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.646454 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="74352ae3-f7cb-4e98-9f31-d0df7f9a9baf" containerName="extract-utilities" Dec 03 11:06:42 crc kubenswrapper[4702]: E1203 11:06:42.646466 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74352ae3-f7cb-4e98-9f31-d0df7f9a9baf" containerName="registry-server" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.646477 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="74352ae3-f7cb-4e98-9f31-d0df7f9a9baf" containerName="registry-server" Dec 03 11:06:42 crc kubenswrapper[4702]: E1203 11:06:42.646485 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0447b485-fa0f-470b-abb6-da0f406d0f7f" containerName="extract-content" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.646492 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="0447b485-fa0f-470b-abb6-da0f406d0f7f" containerName="extract-content" Dec 03 11:06:42 crc kubenswrapper[4702]: E1203 11:06:42.646506 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68547a12-2568-4a7a-a3e2-fde07134e6ee" containerName="extract-utilities" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.646513 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="68547a12-2568-4a7a-a3e2-fde07134e6ee" containerName="extract-utilities" Dec 03 11:06:42 crc kubenswrapper[4702]: E1203 11:06:42.646526 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68547a12-2568-4a7a-a3e2-fde07134e6ee" containerName="extract-content" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.646534 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="68547a12-2568-4a7a-a3e2-fde07134e6ee" containerName="extract-content" Dec 03 11:06:42 crc kubenswrapper[4702]: E1203 11:06:42.646547 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74352ae3-f7cb-4e98-9f31-d0df7f9a9baf" containerName="extract-content" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.646554 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="74352ae3-f7cb-4e98-9f31-d0df7f9a9baf" containerName="extract-content" Dec 03 11:06:42 crc kubenswrapper[4702]: E1203 11:06:42.646571 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="113264ad-a66d-41d5-95a7-0fd255036eee" containerName="pruner" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.646578 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="113264ad-a66d-41d5-95a7-0fd255036eee" containerName="pruner" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.646722 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="74352ae3-f7cb-4e98-9f31-d0df7f9a9baf" containerName="registry-server" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.646741 4702 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="68547a12-2568-4a7a-a3e2-fde07134e6ee" containerName="registry-server" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.646773 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="113264ad-a66d-41d5-95a7-0fd255036eee" containerName="pruner" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.646786 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="0447b485-fa0f-470b-abb6-da0f406d0f7f" containerName="registry-server" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.647369 4702 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.647609 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.647873 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d" gracePeriod=15 Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.647947 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e" gracePeriod=15 Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.648014 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624" gracePeriod=15 Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.647948 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4" gracePeriod=15 Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.648105 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8" gracePeriod=15 Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.648622 4702 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 11:06:42 crc kubenswrapper[4702]: E1203 11:06:42.649382 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.649444 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 11:06:42 crc kubenswrapper[4702]: E1203 11:06:42.649480 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.649529 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 11:06:42 crc kubenswrapper[4702]: E1203 11:06:42.649554 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.649569 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 03 11:06:42 crc kubenswrapper[4702]: E1203 11:06:42.649617 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.649634 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 11:06:42 crc kubenswrapper[4702]: E1203 11:06:42.649659 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.649671 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 11:06:42 crc kubenswrapper[4702]: E1203 11:06:42.649721 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.649736 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 11:06:42 crc kubenswrapper[4702]: E1203 11:06:42.649752 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.649806 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.650174 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.650198 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.650253 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.650270 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.650298 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.651321 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.710334 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.812858 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.812917 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.812943 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.812965 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.814824 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.814891 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.814960 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.814996 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.917181 4702 
Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.917246 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.917275 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.917299 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.917331 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.917357 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.917382 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.917408 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.917409 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.917501 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.917447 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.917545 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.917515 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.917580 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.917599 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:06:42 crc kubenswrapper[4702]: I1203 11:06:42.917639 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Dec 03 11:06:43 crc kubenswrapper[4702]: I1203 11:06:43.011850 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 11:06:43 crc kubenswrapper[4702]: W1203 11:06:43.033976 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-5f78fd9fc3202df783874231315afe95effa1c8f66341d59ca5591d7270d7449 WatchSource:0}: Error finding container 5f78fd9fc3202df783874231315afe95effa1c8f66341d59ca5591d7270d7449: Status 404 returned error can't find the container with id 5f78fd9fc3202df783874231315afe95effa1c8f66341d59ca5591d7270d7449 Dec 03 11:06:43 crc kubenswrapper[4702]: I1203 11:06:43.370636 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"0f97d7beefb3189f3f0d686b73106b54c7a2f7ddf73b1f188b09227207f71c80"} Dec 03 11:06:43 crc kubenswrapper[4702]: I1203 11:06:43.371204 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5f78fd9fc3202df783874231315afe95effa1c8f66341d59ca5591d7270d7449"} Dec 03 11:06:43 crc kubenswrapper[4702]: I1203 11:06:43.375837 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 11:06:43 crc kubenswrapper[4702]: I1203 11:06:43.377473 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 11:06:43 crc kubenswrapper[4702]: I1203 11:06:43.378782 4702 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8" exitCode=0 Dec 03 11:06:43 crc kubenswrapper[4702]: I1203 11:06:43.378826 4702 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e" exitCode=0 Dec 03 11:06:43 crc kubenswrapper[4702]: I1203 11:06:43.378837 4702 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4" exitCode=0 Dec 03 11:06:43 crc kubenswrapper[4702]: I1203 11:06:43.378851 4702 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624" exitCode=2 Dec 03 11:06:43 crc kubenswrapper[4702]: I1203 11:06:43.378924 4702 scope.go:117] "RemoveContainer" containerID="47a5c5c8db23ac1e1628bc04607ee44fc6b131bfcd5714ecd542609e44661f5e" Dec 03 11:06:43 crc kubenswrapper[4702]: I1203 11:06:43.522389 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wv7dh" Dec 03 11:06:44 crc kubenswrapper[4702]: I1203 11:06:44.390217 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 11:06:45 crc kubenswrapper[4702]: I1203 11:06:45.173401 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 11:06:46 crc kubenswrapper[4702]: I1203 11:06:46.480107 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sn9wb" Dec 03 11:06:46 crc kubenswrapper[4702]: I1203 11:06:46.480293 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sn9wb" Dec 03 11:06:46 crc kubenswrapper[4702]: I1203 11:06:46.521674 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sn9wb" Dec 03 11:06:46 crc kubenswrapper[4702]: I1203 11:06:46.863165 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fpqqw" Dec 03 11:06:46 crc kubenswrapper[4702]: I1203 11:06:46.863264 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fpqqw" Dec 03 11:06:46 crc kubenswrapper[4702]: I1203 11:06:46.910795 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fpqqw" Dec 03 11:06:47 crc kubenswrapper[4702]: I1203 11:06:47.455093 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sn9wb" Dec 03 11:06:47 crc kubenswrapper[4702]: I1203 11:06:47.477292 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fpqqw" Dec 03 11:06:48 crc kubenswrapper[4702]: E1203 11:06:48.039790 4702 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.176:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187dafdf66d64e89 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 11:06:43.036991113 +0000 UTC m=+186.872919577,LastTimestamp:2025-12-03 11:06:43.036991113 +0000 UTC m=+186.872919577,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 11:06:48 crc kubenswrapper[4702]: I1203 11:06:48.378592 4702 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:48 crc kubenswrapper[4702]: I1203 11:06:48.382334 4702 status_manager.go:851] "Failed to get status for pod" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b" pod="openshift-marketplace/redhat-operators-fpqqw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fpqqw\": dial tcp 38.102.83.176:6443: 
connect: connection refused" Dec 03 11:06:48 crc kubenswrapper[4702]: I1203 11:06:48.382799 4702 status_manager.go:851] "Failed to get status for pod" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29" pod="openshift-marketplace/certified-operators-wv7dh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-wv7dh\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:48 crc kubenswrapper[4702]: I1203 11:06:48.383374 4702 status_manager.go:851] "Failed to get status for pod" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf" pod="openshift-marketplace/redhat-operators-sn9wb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sn9wb\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:48 crc kubenswrapper[4702]: I1203 11:06:48.383772 4702 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:48 crc kubenswrapper[4702]: I1203 11:06:48.384205 4702 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:48 crc kubenswrapper[4702]: I1203 11:06:48.385168 4702 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:48 crc kubenswrapper[4702]: I1203 11:06:48.385447 4702 status_manager.go:851] "Failed to get status for pod" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b" pod="openshift-marketplace/redhat-operators-fpqqw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fpqqw\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:48 crc kubenswrapper[4702]: I1203 11:06:48.385902 4702 status_manager.go:851] "Failed to get status for pod" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29" pod="openshift-marketplace/certified-operators-wv7dh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-wv7dh\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:48 crc kubenswrapper[4702]: I1203 11:06:48.386203 4702 status_manager.go:851] "Failed to get status for pod" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf" pod="openshift-marketplace/redhat-operators-sn9wb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sn9wb\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:48 crc kubenswrapper[4702]: I1203 11:06:48.386501 4702 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:48 crc kubenswrapper[4702]: I1203 11:06:48.416466 4702 generic.go:334] "Generic (PLEG): container finished" podID="bc99dba0-244c-46eb-b4fb-b08679d5431b" containerID="e356ba97f9a0df021090dfe9f8ad51b2fc373f71d56e6aa29b57bfbed203334b" exitCode=0 Dec 03 11:06:48 crc kubenswrapper[4702]: I1203 11:06:48.416549 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"bc99dba0-244c-46eb-b4fb-b08679d5431b","Type":"ContainerDied","Data":"e356ba97f9a0df021090dfe9f8ad51b2fc373f71d56e6aa29b57bfbed203334b"} Dec 03 11:06:48 crc kubenswrapper[4702]: I1203 11:06:48.418399 4702 status_manager.go:851] "Failed to get status for pod" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29" pod="openshift-marketplace/certified-operators-wv7dh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-wv7dh\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:48 crc kubenswrapper[4702]: I1203 11:06:48.418927 4702 status_manager.go:851] "Failed to get status for pod" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf" pod="openshift-marketplace/redhat-operators-sn9wb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sn9wb\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:48 crc kubenswrapper[4702]: I1203 11:06:48.422303 4702 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:48 crc kubenswrapper[4702]: I1203 11:06:48.423227 4702 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:48 crc kubenswrapper[4702]: I1203 11:06:48.423805 4702 status_manager.go:851] "Failed to get status for pod" podUID="bc99dba0-244c-46eb-b4fb-b08679d5431b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:48 crc kubenswrapper[4702]: I1203 11:06:48.424304 4702 status_manager.go:851] "Failed to get status for pod" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b" pod="openshift-marketplace/redhat-operators-fpqqw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fpqqw\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:49 crc kubenswrapper[4702]: I1203 11:06:49.693623 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 11:06:49 crc kubenswrapper[4702]: I1203 11:06:49.694957 4702 status_manager.go:851] "Failed to get status for pod" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29" pod="openshift-marketplace/certified-operators-wv7dh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-wv7dh\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:49 crc kubenswrapper[4702]: I1203 11:06:49.695749 4702 status_manager.go:851] "Failed to get status for pod" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf" pod="openshift-marketplace/redhat-operators-sn9wb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sn9wb\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:49 crc kubenswrapper[4702]: I1203 11:06:49.696366 4702 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:49 crc kubenswrapper[4702]: I1203 11:06:49.696706 4702 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:49 crc kubenswrapper[4702]: I1203 11:06:49.697081 4702 status_manager.go:851] "Failed to get status for pod" podUID="bc99dba0-244c-46eb-b4fb-b08679d5431b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:49 crc kubenswrapper[4702]: I1203 11:06:49.697457 4702 status_manager.go:851] "Failed to get status for pod" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b" pod="openshift-marketplace/redhat-operators-fpqqw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fpqqw\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:49 crc kubenswrapper[4702]: I1203 11:06:49.826979 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc99dba0-244c-46eb-b4fb-b08679d5431b-kube-api-access\") pod \"bc99dba0-244c-46eb-b4fb-b08679d5431b\" (UID: \"bc99dba0-244c-46eb-b4fb-b08679d5431b\") " Dec 03 11:06:49 crc kubenswrapper[4702]: I1203 11:06:49.827367 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bc99dba0-244c-46eb-b4fb-b08679d5431b-var-lock\") pod \"bc99dba0-244c-46eb-b4fb-b08679d5431b\" (UID: \"bc99dba0-244c-46eb-b4fb-b08679d5431b\") " Dec 03 11:06:49 crc kubenswrapper[4702]: I1203 11:06:49.827434 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc99dba0-244c-46eb-b4fb-b08679d5431b-kubelet-dir\") pod \"bc99dba0-244c-46eb-b4fb-b08679d5431b\" (UID: \"bc99dba0-244c-46eb-b4fb-b08679d5431b\") " Dec 03 11:06:49 crc 
kubenswrapper[4702]: I1203 11:06:49.827613 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc99dba0-244c-46eb-b4fb-b08679d5431b-var-lock" (OuterVolumeSpecName: "var-lock") pod "bc99dba0-244c-46eb-b4fb-b08679d5431b" (UID: "bc99dba0-244c-46eb-b4fb-b08679d5431b"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:06:49 crc kubenswrapper[4702]: I1203 11:06:49.827624 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc99dba0-244c-46eb-b4fb-b08679d5431b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bc99dba0-244c-46eb-b4fb-b08679d5431b" (UID: "bc99dba0-244c-46eb-b4fb-b08679d5431b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:06:49 crc kubenswrapper[4702]: I1203 11:06:49.838043 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc99dba0-244c-46eb-b4fb-b08679d5431b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bc99dba0-244c-46eb-b4fb-b08679d5431b" (UID: "bc99dba0-244c-46eb-b4fb-b08679d5431b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:06:49 crc kubenswrapper[4702]: I1203 11:06:49.929441 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc99dba0-244c-46eb-b4fb-b08679d5431b-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 11:06:49 crc kubenswrapper[4702]: I1203 11:06:49.929474 4702 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bc99dba0-244c-46eb-b4fb-b08679d5431b-var-lock\") on node \"crc\" DevicePath \"\"" Dec 03 11:06:49 crc kubenswrapper[4702]: I1203 11:06:49.929483 4702 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc99dba0-244c-46eb-b4fb-b08679d5431b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.438154 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.439461 4702 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d" exitCode=0 Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.441351 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"bc99dba0-244c-46eb-b4fb-b08679d5431b","Type":"ContainerDied","Data":"91086eb5a2078d57314414da08612e7acf017096a87197b6e55e3fb6148c789d"} Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.441385 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91086eb5a2078d57314414da08612e7acf017096a87197b6e55e3fb6148c789d" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.441420 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.453996 4702 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.454327 4702 status_manager.go:851] "Failed to get status for pod" podUID="bc99dba0-244c-46eb-b4fb-b08679d5431b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.454632 4702 status_manager.go:851] "Failed to get status for pod" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b" pod="openshift-marketplace/redhat-operators-fpqqw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fpqqw\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.454889 4702 status_manager.go:851] "Failed to get status for pod" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29" pod="openshift-marketplace/certified-operators-wv7dh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-wv7dh\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.455142 4702 status_manager.go:851] "Failed to get status for pod" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf" pod="openshift-marketplace/redhat-operators-sn9wb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sn9wb\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.455496 4702 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.648693 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.650011 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.650571 4702 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.650940 4702 status_manager.go:851] "Failed to get status for pod" podUID="bc99dba0-244c-46eb-b4fb-b08679d5431b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.651149 4702 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.651648 4702 status_manager.go:851] "Failed to get status for pod" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b" pod="openshift-marketplace/redhat-operators-fpqqw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fpqqw\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.652254 4702 status_manager.go:851] "Failed to get status for pod" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29" pod="openshift-marketplace/certified-operators-wv7dh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-wv7dh\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.652514 4702 status_manager.go:851] "Failed to get status for pod" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf" pod="openshift-marketplace/redhat-operators-sn9wb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sn9wb\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.652831 4702 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.848223 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.848367 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 11:06:50 crc 
kubenswrapper[4702]: I1203 11:06:50.848435 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.848511 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.848631 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.848857 4702 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.848878 4702 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.849046 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.939433 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 03 11:06:50 crc kubenswrapper[4702]: E1203 11:06:50.950167 4702 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.176:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" volumeName="registry-storage" Dec 03 11:06:50 crc kubenswrapper[4702]: I1203 11:06:50.950952 4702 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 03 11:06:51 crc kubenswrapper[4702]: E1203 11:06:51.172507 4702 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:51 crc kubenswrapper[4702]: E1203 11:06:51.173401 4702 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:51 crc kubenswrapper[4702]: E1203 11:06:51.173889 4702 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:51 crc kubenswrapper[4702]: E1203 11:06:51.174458 4702 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:51 crc kubenswrapper[4702]: E1203 11:06:51.175001 4702 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.175107 4702 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 03 11:06:51 crc kubenswrapper[4702]: E1203 11:06:51.175678 4702 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="200ms" Dec 03 11:06:51 crc kubenswrapper[4702]: E1203 11:06:51.376786 4702 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="400ms" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.451658 4702 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.452581 4702 scope.go:117] "RemoveContainer" containerID="2189fedbef0fdbccfffa4e82aad4f6cef8a4d7545553dac4295ba465cfe533b8" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.452684 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.454487 4702 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.455458 4702 status_manager.go:851] "Failed to get status for pod" podUID="bc99dba0-244c-46eb-b4fb-b08679d5431b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.456218 4702 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.456508 4702 status_manager.go:851] "Failed to get status for pod" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b" pod="openshift-marketplace/redhat-operators-fpqqw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fpqqw\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.456820 4702 status_manager.go:851] "Failed to get status for pod" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29" pod="openshift-marketplace/certified-operators-wv7dh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-wv7dh\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.457374 4702 status_manager.go:851] "Failed to get status for pod" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf" pod="openshift-marketplace/redhat-operators-sn9wb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sn9wb\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.457940 4702 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.458382 4702 status_manager.go:851] "Failed to get status for pod" podUID="bc99dba0-244c-46eb-b4fb-b08679d5431b" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.458641 4702 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.458959 4702 status_manager.go:851] "Failed to get status for pod" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b" pod="openshift-marketplace/redhat-operators-fpqqw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fpqqw\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.459271 4702 status_manager.go:851] "Failed to get status for pod" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29" pod="openshift-marketplace/certified-operators-wv7dh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-wv7dh\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.460027 4702 status_manager.go:851] "Failed to get status for pod" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf" pod="openshift-marketplace/redhat-operators-sn9wb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sn9wb\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.460383 4702 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.461116 4702 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.470063 4702 scope.go:117] "RemoveContainer" containerID="4565d5b292bba2ea4974765a3f255adca1d4f85afd0ee6d0fe9f80290613351e" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.486171 4702 scope.go:117] "RemoveContainer" containerID="61d2fe1029779a001610ed9211be5f74c921447fd66b58694411eaf252cf95c4" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.504207 4702 scope.go:117] "RemoveContainer" containerID="194fad99f60ef6c27e417b29c59df4127442a07253881a1d5d39d7fee2111624" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.521001 4702 scope.go:117] "RemoveContainer" containerID="744fe6b7e6ba566e7394210015a3367937d38f49cabb16ba81ecd345d85af71d" Dec 03 11:06:51 crc kubenswrapper[4702]: I1203 11:06:51.537924 4702 scope.go:117] "RemoveContainer" containerID="e217a53607f28f2ff45eb84be7d72978bd6c9366670945a36471fe86964c00f0" Dec 03 11:06:51 crc kubenswrapper[4702]: E1203 11:06:51.777942 4702 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="800ms" Dec 03 11:06:52 crc kubenswrapper[4702]: E1203 11:06:52.020368 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:06:47Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:06:47Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:06:47Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T11:06:47Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:416a7dc57b2b95775e679e0ab93111baaa063e55a4c6d73856a248d85a2debbd\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:5cdd97eed164a2eda9842fb91d284f06d2e63d69af9a98001fca2d6cebd0b52a\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1609873225},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:2041f196d215ed63fc3218d5a0e0cb8409dd5b575612d999cffc9817ca7ee2c4\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:9f4887ff6cc4a4c64b413cf3ffbbceababd44911f3d138b5dc10dcb622ccd23c\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1204906320},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:dba4a8e0293f7b6e1459b74484a7126274bdc9efa75f808eb15e5a3896a3c818\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:eee64597300e249e4bead6abda4d235cc4fc3a87be82e3e9f582609602ed87d7\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1201319250},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e8990432556acad31519b1a73ec32f32d27c2034cf9e5cc4db8980efc7331594\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ebe9f523f5c211a3a0f2570331dddcd5be15b12c1fecd9b8b121f881bfaad029\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1129027903},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registr
y.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173
ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"]
,\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:52 crc kubenswrapper[4702]: E1203 11:06:52.021118 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:52 crc kubenswrapper[4702]: E1203 11:06:52.021500 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:52 crc kubenswrapper[4702]: E1203 11:06:52.021858 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:52 crc kubenswrapper[4702]: E1203 11:06:52.022183 4702 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:52 crc kubenswrapper[4702]: E1203 11:06:52.022209 4702 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 11:06:52 crc kubenswrapper[4702]: E1203 11:06:52.578980 4702 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="1.6s" Dec 03 11:06:54 crc kubenswrapper[4702]: E1203 11:06:54.180836 4702 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="3.2s" Dec 03 11:06:54 crc kubenswrapper[4702]: E1203 11:06:54.429198 4702 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.176:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187dafdf66d64e89 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 11:06:43.036991113 +0000 UTC m=+186.872919577,LastTimestamp:2025-12-03 11:06:43.036991113 +0000 UTC m=+186.872919577,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 11:06:55 crc kubenswrapper[4702]: I1203 11:06:55.908264 4702 patch_prober.go:28] interesting 
pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:06:55 crc kubenswrapper[4702]: I1203 11:06:55.908379 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:06:56 crc kubenswrapper[4702]: I1203 11:06:56.927381 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 11:06:56 crc kubenswrapper[4702]: I1203 11:06:56.930914 4702 status_manager.go:851] "Failed to get status for pod" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b" pod="openshift-marketplace/redhat-operators-fpqqw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fpqqw\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:56 crc kubenswrapper[4702]: I1203 11:06:56.931455 4702 status_manager.go:851] "Failed to get status for pod" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29" pod="openshift-marketplace/certified-operators-wv7dh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-wv7dh\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:56 crc kubenswrapper[4702]: I1203 11:06:56.931790 4702 status_manager.go:851] "Failed to get status for pod" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf" pod="openshift-marketplace/redhat-operators-sn9wb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sn9wb\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:56 crc kubenswrapper[4702]: I1203 11:06:56.932296 4702 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:56 crc kubenswrapper[4702]: I1203 11:06:56.932692 4702 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:56 crc kubenswrapper[4702]: I1203 11:06:56.933030 4702 status_manager.go:851] "Failed to get status for pod" podUID="bc99dba0-244c-46eb-b4fb-b08679d5431b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:56 crc kubenswrapper[4702]: I1203 11:06:56.933512 4702 status_manager.go:851] "Failed to get status for pod" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b" pod="openshift-marketplace/redhat-operators-fpqqw" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fpqqw\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:56 crc kubenswrapper[4702]: I1203 11:06:56.933845 4702 status_manager.go:851] "Failed to get status for pod" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29" pod="openshift-marketplace/certified-operators-wv7dh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-wv7dh\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:56 crc kubenswrapper[4702]: I1203 11:06:56.934169 4702 status_manager.go:851] "Failed to get status for pod" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf" pod="openshift-marketplace/redhat-operators-sn9wb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sn9wb\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:56 crc kubenswrapper[4702]: I1203 11:06:56.934480 4702 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:56 crc kubenswrapper[4702]: I1203 11:06:56.934807 4702 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:56 crc kubenswrapper[4702]: I1203 11:06:56.935242 4702 status_manager.go:851] "Failed to get status for pod" podUID="bc99dba0-244c-46eb-b4fb-b08679d5431b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:56 crc kubenswrapper[4702]: I1203 11:06:56.960181 4702 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ea51e23b-8c79-4010-a539-e0f35cceefde" Dec 03 11:06:56 crc kubenswrapper[4702]: I1203 11:06:56.960293 4702 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ea51e23b-8c79-4010-a539-e0f35cceefde" Dec 03 11:06:56 crc kubenswrapper[4702]: E1203 11:06:56.960897 4702 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 11:06:56 crc kubenswrapper[4702]: I1203 11:06:56.961688 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 11:06:56 crc kubenswrapper[4702]: W1203 11:06:56.989693 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-265136d56c7f92f855e91c86683a129200e7b25b2bd7a91c3f1fd8bb802ad903 WatchSource:0}: Error finding container 265136d56c7f92f855e91c86683a129200e7b25b2bd7a91c3f1fd8bb802ad903: Status 404 returned error can't find the container with id 265136d56c7f92f855e91c86683a129200e7b25b2bd7a91c3f1fd8bb802ad903 Dec 03 11:06:57 crc kubenswrapper[4702]: E1203 11:06:57.381958 4702 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.176:6443: connect: connection refused" interval="6.4s" Dec 03 11:06:57 crc kubenswrapper[4702]: I1203 11:06:57.496837 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"265136d56c7f92f855e91c86683a129200e7b25b2bd7a91c3f1fd8bb802ad903"} Dec 03 11:06:58 crc kubenswrapper[4702]: I1203 11:06:58.505601 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 11:06:58 crc kubenswrapper[4702]: I1203 11:06:58.505697 4702 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132" exitCode=1 Dec 03 11:06:58 crc kubenswrapper[4702]: I1203 11:06:58.505814 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132"} Dec 03 11:06:58 crc kubenswrapper[4702]: I1203 11:06:58.506860 4702 scope.go:117] "RemoveContainer" containerID="012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132" Dec 03 11:06:58 crc kubenswrapper[4702]: I1203 11:06:58.507166 4702 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:58 crc kubenswrapper[4702]: I1203 11:06:58.507458 4702 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:58 crc kubenswrapper[4702]: I1203 11:06:58.507851 4702 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" Dec 03 11:06:58 crc 
Dec 03 11:06:58 crc kubenswrapper[4702]: I1203 11:06:58.508461 4702 status_manager.go:851] "Failed to get status for pod" podUID="bc99dba0-244c-46eb-b4fb-b08679d5431b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.176:6443: connect: connection refused"
Dec 03 11:06:58 crc kubenswrapper[4702]: I1203 11:06:58.508568 4702 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="2565322f4f5d9c96521a88d4758e20414882cc3594075851d55e36e53703f304" exitCode=0
Dec 03 11:06:58 crc kubenswrapper[4702]: I1203 11:06:58.508604 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"2565322f4f5d9c96521a88d4758e20414882cc3594075851d55e36e53703f304"}
Dec 03 11:06:58 crc kubenswrapper[4702]: I1203 11:06:58.508838 4702 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ea51e23b-8c79-4010-a539-e0f35cceefde"
Dec 03 11:06:58 crc kubenswrapper[4702]: I1203 11:06:58.508855 4702 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ea51e23b-8c79-4010-a539-e0f35cceefde"
Dec 03 11:06:58 crc kubenswrapper[4702]: I1203 11:06:58.509020 4702 status_manager.go:851] "Failed to get status for pod" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b" pod="openshift-marketplace/redhat-operators-fpqqw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fpqqw\": dial tcp 38.102.83.176:6443: connect: connection refused"
Dec 03 11:06:58 crc kubenswrapper[4702]: E1203 11:06:58.509271 4702 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.176:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:06:58 crc kubenswrapper[4702]: I1203 11:06:58.509315 4702 status_manager.go:851] "Failed to get status for pod" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29" pod="openshift-marketplace/certified-operators-wv7dh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-wv7dh\": dial tcp 38.102.83.176:6443: connect: connection refused"
Dec 03 11:06:58 crc kubenswrapper[4702]: I1203 11:06:58.510385 4702 status_manager.go:851] "Failed to get status for pod" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf" pod="openshift-marketplace/redhat-operators-sn9wb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sn9wb\": dial tcp 38.102.83.176:6443: connect: connection refused"
Dec 03 11:06:58 crc kubenswrapper[4702]: I1203 11:06:58.510876 4702 status_manager.go:851] "Failed to get status for pod" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b" pod="openshift-marketplace/redhat-operators-fpqqw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fpqqw\": dial tcp 38.102.83.176:6443: connect: connection refused"
Dec 03 11:06:58 crc kubenswrapper[4702]: I1203 11:06:58.511440 4702 status_manager.go:851] "Failed to get status for pod" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29" pod="openshift-marketplace/certified-operators-wv7dh" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-wv7dh\": dial tcp 38.102.83.176:6443: connect: connection refused"
Dec 03 11:06:58 crc kubenswrapper[4702]: I1203 11:06:58.511747 4702 status_manager.go:851] "Failed to get status for pod" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf" pod="openshift-marketplace/redhat-operators-sn9wb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-sn9wb\": dial tcp 38.102.83.176:6443: connect: connection refused"
Dec 03 11:06:58 crc kubenswrapper[4702]: I1203 11:06:58.512050 4702 status_manager.go:851] "Failed to get status for pod" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" pod="openshift-network-diagnostics/network-check-target-xd92c" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/pods/network-check-target-xd92c\": dial tcp 38.102.83.176:6443: connect: connection refused"
Dec 03 11:06:58 crc kubenswrapper[4702]: I1203 11:06:58.512333 4702 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.176:6443: connect: connection refused"
Dec 03 11:06:58 crc kubenswrapper[4702]: I1203 11:06:58.512869 4702 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.176:6443: connect: connection refused"
Dec 03 11:06:58 crc kubenswrapper[4702]: I1203 11:06:58.513195 4702 status_manager.go:851] "Failed to get status for pod" podUID="bc99dba0-244c-46eb-b4fb-b08679d5431b" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.176:6443: connect: connection refused"
Dec 03 11:06:59 crc kubenswrapper[4702]: I1203 11:06:59.551436 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Dec 03 11:06:59 crc kubenswrapper[4702]: I1203 11:06:59.552458 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"340fda11da7dc96a53d9de3d8081f19331c382e2d43554b23914b4431b55cf34"}
Dec 03 11:06:59 crc kubenswrapper[4702]: I1203 11:06:59.558986 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1a4667f0f78fa5fc1852bd2b64a0101a62bbaa53c4ef0547a160b296165d5b86"}
Dec 03 11:06:59 crc kubenswrapper[4702]: I1203 11:06:59.559058 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f808d7afbd37c672eb97fc5bc2581aacdbffb0bd82d7f290525039d432a3b8e3"}
Dec 03 11:07:00 crc kubenswrapper[4702]: I1203 11:07:00.569077 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a74ce179b0481cb48da16669bdb6db91f197bca7076fa71a1a660cd733319ce2"}
Dec 03 11:07:00 crc kubenswrapper[4702]: I1203 11:07:00.569152 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ad3c4efc467af0fa13c9247e05ce226dbd9c4b1ce8026e2eebdae97f4f823051"}
Dec 03 11:07:00 crc kubenswrapper[4702]: I1203 11:07:00.569167 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"39e1287abb66def71eff9befc2b81c62a96d81e4bb65f96957c872891d94f8b1"}
Dec 03 11:07:00 crc kubenswrapper[4702]: I1203 11:07:00.570137 4702 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ea51e23b-8c79-4010-a539-e0f35cceefde"
Dec 03 11:07:00 crc kubenswrapper[4702]: I1203 11:07:00.570164 4702 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ea51e23b-8c79-4010-a539-e0f35cceefde"
Dec 03 11:07:00 crc kubenswrapper[4702]: I1203 11:07:00.570677 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:07:01 crc kubenswrapper[4702]: I1203 11:07:01.962544 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:07:01 crc kubenswrapper[4702]: I1203 11:07:01.962617 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:07:01 crc kubenswrapper[4702]: I1203 11:07:01.969462 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:07:02 crc kubenswrapper[4702]: I1203 11:07:02.050599 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 11:07:02 crc kubenswrapper[4702]: I1203 11:07:02.055355 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 11:07:02 crc kubenswrapper[4702]: I1203 11:07:02.582198 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 11:07:05 crc kubenswrapper[4702]: I1203 11:07:05.637478 4702 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:07:06 crc kubenswrapper[4702]: I1203 11:07:06.605839 4702 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ea51e23b-8c79-4010-a539-e0f35cceefde"
Dec 03 11:07:06 crc kubenswrapper[4702]: I1203 11:07:06.605892 4702 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ea51e23b-8c79-4010-a539-e0f35cceefde"
Dec 03 11:07:06 crc kubenswrapper[4702]: I1203 11:07:06.611584 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:07:06 crc kubenswrapper[4702]: I1203 11:07:06.975441 4702 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="fe639266-150c-4683-9292-7c27ea2eb24f"
Dec 03 11:07:07 crc kubenswrapper[4702]: I1203 11:07:07.612134 4702 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ea51e23b-8c79-4010-a539-e0f35cceefde"
Dec 03 11:07:07 crc kubenswrapper[4702]: I1203 11:07:07.613114 4702 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="ea51e23b-8c79-4010-a539-e0f35cceefde"
Dec 03 11:07:07 crc kubenswrapper[4702]: I1203 11:07:07.615988 4702 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="fe639266-150c-4683-9292-7c27ea2eb24f"
Dec 03 11:07:12 crc kubenswrapper[4702]: I1203 11:07:12.038875 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Dec 03 11:07:12 crc kubenswrapper[4702]: I1203 11:07:12.062635 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Dec 03 11:07:12 crc kubenswrapper[4702]: I1203 11:07:12.109228 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 03 11:07:12 crc kubenswrapper[4702]: I1203 11:07:12.124381 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Dec 03 11:07:12 crc kubenswrapper[4702]: I1203 11:07:12.201177 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 03 11:07:12 crc kubenswrapper[4702]: I1203 11:07:12.244988 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Dec 03 11:07:12 crc kubenswrapper[4702]: I1203 11:07:12.313046 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 03 11:07:12 crc kubenswrapper[4702]: I1203 11:07:12.415172 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Dec 03 11:07:12 crc kubenswrapper[4702]: I1203 11:07:12.453938 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 03 11:07:12 crc kubenswrapper[4702]: I1203 11:07:12.561966 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 03 11:07:12 crc kubenswrapper[4702]: I1203 11:07:12.647248 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Dec 03 11:07:12 crc kubenswrapper[4702]: I1203 11:07:12.835602 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 03 11:07:12 crc kubenswrapper[4702]: I1203 11:07:12.867660 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
object-"openshift-controller-manager"/"serving-cert" Dec 03 11:07:13 crc kubenswrapper[4702]: I1203 11:07:13.386731 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 11:07:13 crc kubenswrapper[4702]: I1203 11:07:13.449339 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 11:07:13 crc kubenswrapper[4702]: I1203 11:07:13.641521 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 11:07:13 crc kubenswrapper[4702]: I1203 11:07:13.684811 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 11:07:13 crc kubenswrapper[4702]: I1203 11:07:13.753194 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 11:07:13 crc kubenswrapper[4702]: I1203 11:07:13.800154 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 11:07:13 crc kubenswrapper[4702]: I1203 11:07:13.917699 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 03 11:07:14 crc kubenswrapper[4702]: I1203 11:07:14.335705 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 03 11:07:14 crc kubenswrapper[4702]: I1203 11:07:14.356800 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 11:07:14 crc kubenswrapper[4702]: I1203 11:07:14.453021 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 11:07:14 crc kubenswrapper[4702]: I1203 11:07:14.496641 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 03 11:07:14 crc kubenswrapper[4702]: I1203 11:07:14.509968 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 11:07:14 crc kubenswrapper[4702]: I1203 11:07:14.632952 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 03 11:07:14 crc kubenswrapper[4702]: I1203 11:07:14.700866 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 11:07:14 crc kubenswrapper[4702]: I1203 11:07:14.749808 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 11:07:15 crc kubenswrapper[4702]: I1203 11:07:15.100457 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 11:07:15 crc kubenswrapper[4702]: I1203 11:07:15.209126 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 03 11:07:15 crc kubenswrapper[4702]: I1203 11:07:15.212141 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 11:07:15 crc kubenswrapper[4702]: I1203 
11:07:15.322679 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 11:07:15 crc kubenswrapper[4702]: I1203 11:07:15.545039 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 11:07:15 crc kubenswrapper[4702]: I1203 11:07:15.626099 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 11:07:15 crc kubenswrapper[4702]: I1203 11:07:15.653894 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 11:07:15 crc kubenswrapper[4702]: I1203 11:07:15.816433 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 11:07:15 crc kubenswrapper[4702]: I1203 11:07:15.825136 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 11:07:15 crc kubenswrapper[4702]: I1203 11:07:15.848970 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 03 11:07:16 crc kubenswrapper[4702]: I1203 11:07:16.206996 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 11:07:16 crc kubenswrapper[4702]: I1203 11:07:16.274887 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 11:07:16 crc kubenswrapper[4702]: I1203 11:07:16.280127 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 11:07:16 crc kubenswrapper[4702]: I1203 11:07:16.326562 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 11:07:16 crc kubenswrapper[4702]: I1203 11:07:16.332019 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 11:07:16 crc kubenswrapper[4702]: I1203 11:07:16.373170 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 11:07:16 crc kubenswrapper[4702]: I1203 11:07:16.638593 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 11:07:16 crc kubenswrapper[4702]: I1203 11:07:16.665307 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 11:07:16 crc kubenswrapper[4702]: I1203 11:07:16.671523 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 11:07:16 crc kubenswrapper[4702]: I1203 11:07:16.777631 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 11:07:16 crc kubenswrapper[4702]: I1203 11:07:16.974026 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 11:07:17 crc kubenswrapper[4702]: I1203 11:07:17.003947 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 11:07:17 crc kubenswrapper[4702]: I1203 11:07:17.015280 4702 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 11:07:17 crc kubenswrapper[4702]: I1203 11:07:17.159101 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 11:07:17 crc kubenswrapper[4702]: I1203 11:07:17.205947 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 03 11:07:17 crc kubenswrapper[4702]: I1203 11:07:17.480784 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 11:07:17 crc kubenswrapper[4702]: I1203 11:07:17.490059 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 11:07:17 crc kubenswrapper[4702]: I1203 11:07:17.526820 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 11:07:17 crc kubenswrapper[4702]: I1203 11:07:17.540875 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 11:07:17 crc kubenswrapper[4702]: I1203 11:07:17.605712 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 11:07:17 crc kubenswrapper[4702]: I1203 11:07:17.643463 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 11:07:17 crc kubenswrapper[4702]: I1203 11:07:17.682055 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 03 11:07:17 crc kubenswrapper[4702]: I1203 11:07:17.731253 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 11:07:17 crc kubenswrapper[4702]: I1203 11:07:17.742131 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 11:07:17 crc kubenswrapper[4702]: I1203 11:07:17.842189 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 11:07:18 crc kubenswrapper[4702]: I1203 11:07:18.009641 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 11:07:18 crc kubenswrapper[4702]: I1203 11:07:18.076168 4702 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 11:07:18 crc kubenswrapper[4702]: I1203 11:07:18.129657 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 11:07:18 crc kubenswrapper[4702]: I1203 11:07:18.229423 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 11:07:18 crc kubenswrapper[4702]: I1203 11:07:18.261079 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 03 11:07:18 crc kubenswrapper[4702]: I1203 11:07:18.318731 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 11:07:18 crc kubenswrapper[4702]: I1203 11:07:18.646043 4702 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 11:07:18 crc kubenswrapper[4702]: I1203 11:07:18.805562 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 11:07:18 crc kubenswrapper[4702]: I1203 11:07:18.920809 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 11:07:19 crc kubenswrapper[4702]: I1203 11:07:19.044583 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 11:07:19 crc kubenswrapper[4702]: I1203 11:07:19.191941 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 11:07:19 crc kubenswrapper[4702]: I1203 11:07:19.192955 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 11:07:19 crc kubenswrapper[4702]: I1203 11:07:19.406896 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 11:07:19 crc kubenswrapper[4702]: I1203 11:07:19.616674 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 11:07:19 crc kubenswrapper[4702]: I1203 11:07:19.964202 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 11:07:20 crc kubenswrapper[4702]: I1203 11:07:20.002492 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 11:07:20 crc kubenswrapper[4702]: I1203 11:07:20.027124 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 03 11:07:20 crc kubenswrapper[4702]: I1203 11:07:20.043616 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 11:07:20 crc kubenswrapper[4702]: I1203 11:07:20.091521 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 11:07:20 crc kubenswrapper[4702]: I1203 11:07:20.187330 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 11:07:20 crc kubenswrapper[4702]: I1203 11:07:20.377732 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 11:07:20 crc kubenswrapper[4702]: I1203 11:07:20.420100 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 03 11:07:20 crc kubenswrapper[4702]: I1203 11:07:20.586511 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 11:07:20 crc kubenswrapper[4702]: I1203 11:07:20.636902 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 03 11:07:20 crc kubenswrapper[4702]: I1203 11:07:20.749790 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 11:07:20 crc kubenswrapper[4702]: I1203 
Dec 03 11:07:20 crc kubenswrapper[4702]: I1203 11:07:20.895157 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 03 11:07:21 crc kubenswrapper[4702]: I1203 11:07:21.016747 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Dec 03 11:07:21 crc kubenswrapper[4702]: I1203 11:07:21.111487 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Dec 03 11:07:21 crc kubenswrapper[4702]: I1203 11:07:21.222673 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Dec 03 11:07:21 crc kubenswrapper[4702]: I1203 11:07:21.356517 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Dec 03 11:07:21 crc kubenswrapper[4702]: I1203 11:07:21.390324 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Dec 03 11:07:21 crc kubenswrapper[4702]: I1203 11:07:21.397615 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Dec 03 11:07:21 crc kubenswrapper[4702]: I1203 11:07:21.689048 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 03 11:07:21 crc kubenswrapper[4702]: I1203 11:07:21.721497 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Dec 03 11:07:21 crc kubenswrapper[4702]: I1203 11:07:21.724834 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 03 11:07:22 crc kubenswrapper[4702]: I1203 11:07:22.068277 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Dec 03 11:07:22 crc kubenswrapper[4702]: I1203 11:07:22.136499 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Dec 03 11:07:22 crc kubenswrapper[4702]: I1203 11:07:22.137636 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Dec 03 11:07:22 crc kubenswrapper[4702]: I1203 11:07:22.153059 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Dec 03 11:07:22 crc kubenswrapper[4702]: I1203 11:07:22.236565 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Dec 03 11:07:22 crc kubenswrapper[4702]: I1203 11:07:22.256397 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Dec 03 11:07:22 crc kubenswrapper[4702]: I1203 11:07:22.357517 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Dec 03 11:07:22 crc kubenswrapper[4702]: I1203 11:07:22.531714 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Dec 03 11:07:22 crc kubenswrapper[4702]: I1203 11:07:22.683948 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Dec 03 11:07:22 crc kubenswrapper[4702]: I1203 11:07:22.752813 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Dec 03 11:07:22 crc kubenswrapper[4702]: I1203 11:07:22.801587 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Dec 03 11:07:22 crc kubenswrapper[4702]: I1203 11:07:22.884054 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 03 11:07:23 crc kubenswrapper[4702]: I1203 11:07:23.031387 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 03 11:07:23 crc kubenswrapper[4702]: I1203 11:07:23.074059 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Dec 03 11:07:23 crc kubenswrapper[4702]: I1203 11:07:23.098963 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Dec 03 11:07:23 crc kubenswrapper[4702]: I1203 11:07:23.105361 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Dec 03 11:07:23 crc kubenswrapper[4702]: I1203 11:07:23.237396 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Dec 03 11:07:23 crc kubenswrapper[4702]: I1203 11:07:23.593454 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Dec 03 11:07:23 crc kubenswrapper[4702]: I1203 11:07:23.705398 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 03 11:07:23 crc kubenswrapper[4702]: I1203 11:07:23.712827 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 03 11:07:23 crc kubenswrapper[4702]: I1203 11:07:23.857596 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Dec 03 11:07:23 crc kubenswrapper[4702]: I1203 11:07:23.886905 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Dec 03 11:07:23 crc kubenswrapper[4702]: I1203 11:07:23.892637 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Dec 03 11:07:24 crc kubenswrapper[4702]: I1203 11:07:24.007221 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 03 11:07:24 crc kubenswrapper[4702]: I1203 11:07:24.029155 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Dec 03 11:07:24 crc kubenswrapper[4702]: I1203 11:07:24.383312 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Dec 03 11:07:24 crc kubenswrapper[4702]: I1203 11:07:24.396377 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Dec 03 11:07:24 crc kubenswrapper[4702]: I1203 11:07:24.409731 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Dec 03 11:07:24 crc kubenswrapper[4702]: I1203 11:07:24.504789 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Dec 03 11:07:24 crc kubenswrapper[4702]: I1203 11:07:24.715337 4702 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 03 11:07:24 crc kubenswrapper[4702]: I1203 11:07:24.852383 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Dec 03 11:07:24 crc kubenswrapper[4702]: I1203 11:07:24.895401 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 03 11:07:24 crc kubenswrapper[4702]: I1203 11:07:24.929805 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 03 11:07:24 crc kubenswrapper[4702]: I1203 11:07:24.977412 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Dec 03 11:07:24 crc kubenswrapper[4702]: I1203 11:07:24.992244 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 03 11:07:24 crc kubenswrapper[4702]: I1203 11:07:24.992683 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 03 11:07:25 crc kubenswrapper[4702]: I1203 11:07:25.000420 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Dec 03 11:07:25 crc kubenswrapper[4702]: I1203 11:07:25.089917 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Dec 03 11:07:25 crc kubenswrapper[4702]: I1203 11:07:25.184857 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 03 11:07:25 crc kubenswrapper[4702]: I1203 11:07:25.342861 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Dec 03 11:07:25 crc kubenswrapper[4702]: I1203 11:07:25.355011 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 03 11:07:25 crc kubenswrapper[4702]: I1203 11:07:25.365009 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 03 11:07:25 crc kubenswrapper[4702]: I1203 11:07:25.373197 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 03 11:07:25 crc kubenswrapper[4702]: I1203 11:07:25.397197 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Dec 03 11:07:25 crc kubenswrapper[4702]: I1203 11:07:25.427370 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Dec 03 11:07:25 crc kubenswrapper[4702]: I1203 11:07:25.478628 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 03 11:07:25 crc kubenswrapper[4702]: I1203 11:07:25.483783 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Dec 03 11:07:25 crc kubenswrapper[4702]: I1203 11:07:25.552673 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Dec 03 11:07:25 crc kubenswrapper[4702]: I1203 11:07:25.614802 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Dec 03 11:07:25 crc kubenswrapper[4702]: I1203 11:07:25.712893 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Dec 03 11:07:25 crc kubenswrapper[4702]: I1203 11:07:25.763386 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Dec 03 11:07:25 crc kubenswrapper[4702]: I1203 11:07:25.796924 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Dec 03 11:07:25 crc kubenswrapper[4702]: I1203 11:07:25.908083 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 11:07:25 crc kubenswrapper[4702]: I1203 11:07:25.908544 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 11:07:25 crc kubenswrapper[4702]: I1203 11:07:25.908598 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd"
Dec 03 11:07:25 crc kubenswrapper[4702]: I1203 11:07:25.909223 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4"} pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 03 11:07:25 crc kubenswrapper[4702]: I1203 11:07:25.909303 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" containerID="cri-o://31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4" gracePeriod=600
Dec 03 11:07:26 crc kubenswrapper[4702]: I1203 11:07:26.144930 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 03 11:07:26 crc kubenswrapper[4702]: I1203 11:07:26.169818 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Dec 03 11:07:26 crc kubenswrapper[4702]: I1203 11:07:26.199371 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Dec 03 11:07:26 crc kubenswrapper[4702]: I1203 11:07:26.279541 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 03 11:07:26 crc kubenswrapper[4702]: I1203 11:07:26.303624 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Dec 03 11:07:26 crc kubenswrapper[4702]: I1203 11:07:26.614294 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Dec 03 11:07:26 crc kubenswrapper[4702]: I1203 11:07:26.632010 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Dec 03 11:07:26 crc kubenswrapper[4702]: I1203 11:07:26.672316 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Dec 03 11:07:26 crc kubenswrapper[4702]: I1203 11:07:26.705064 4702 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Dec 03 11:07:26 crc kubenswrapper[4702]: I1203 11:07:26.729742 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Dec 03 11:07:26 crc kubenswrapper[4702]: I1203 11:07:26.733933 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Dec 03 11:07:26 crc kubenswrapper[4702]: I1203 11:07:26.735519 4702 generic.go:334] "Generic (PLEG): container finished" podID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerID="31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4" exitCode=0
Dec 03 11:07:26 crc kubenswrapper[4702]: I1203 11:07:26.735616 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerDied","Data":"31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4"}
Dec 03 11:07:26 crc kubenswrapper[4702]: I1203 11:07:26.735808 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"1fc7c6ca3be1bf16313736990d6f512ba61818b84fe4574e5b246fb1305c4999"}
Dec 03 11:07:26 crc kubenswrapper[4702]: I1203 11:07:26.747152 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Dec 03 11:07:26 crc kubenswrapper[4702]: I1203 11:07:26.796930 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 03 11:07:26 crc kubenswrapper[4702]: I1203 11:07:26.945667 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Dec 03 11:07:26 crc kubenswrapper[4702]: I1203 11:07:26.987950 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Dec 03 11:07:26 crc kubenswrapper[4702]: I1203 11:07:26.997513 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.154132 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.159957 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.243375 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.346783 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.353673 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.382693 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.430567 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.465990 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.523272 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.535725 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.536449 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.539961 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.560196 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.761422 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.817834 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.827041 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.869521 4702 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.874106 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=45.874071844 podStartE2EDuration="45.874071844s" podCreationTimestamp="2025-12-03 11:06:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:07:05.818725748 +0000 UTC m=+209.654654212" watchObservedRunningTime="2025-12-03 11:07:27.874071844 +0000 UTC m=+231.710000358"
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.878668 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.878737 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.885820 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.904110 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.904082556 podStartE2EDuration="22.904082556s" podCreationTimestamp="2025-12-03 11:07:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:07:27.898312236 +0000 UTC m=+231.734240710" watchObservedRunningTime="2025-12-03 11:07:27.904082556 +0000 UTC m=+231.740011020"
Dec 03 11:07:27 crc kubenswrapper[4702]: I1203 11:07:27.928093 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Dec 03 11:07:28 crc kubenswrapper[4702]: I1203 11:07:28.031854 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 03 11:07:28 crc kubenswrapper[4702]: I1203 11:07:28.054826 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Dec 03 11:07:28 crc kubenswrapper[4702]: I1203 11:07:28.087783 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 03 11:07:28 crc kubenswrapper[4702]: I1203 11:07:28.218942 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Dec 03 11:07:28 crc kubenswrapper[4702]: I1203 11:07:28.290193 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 03 11:07:28 crc kubenswrapper[4702]: I1203 11:07:28.414816 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Dec 03 11:07:28 crc kubenswrapper[4702]: I1203 11:07:28.450864 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 03 11:07:28 crc kubenswrapper[4702]: I1203 11:07:28.599635 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Dec 03 11:07:28 crc kubenswrapper[4702]: I1203 11:07:28.613705 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Dec 03 11:07:28 crc kubenswrapper[4702]: I1203 11:07:28.615425 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Dec 03 11:07:28 crc kubenswrapper[4702]: I1203 11:07:28.812174 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 03 11:07:28 crc kubenswrapper[4702]: I1203 11:07:28.917954 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Dec 03 11:07:28 crc kubenswrapper[4702]: I1203 11:07:28.940935 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 03 11:07:28 crc kubenswrapper[4702]: I1203 11:07:28.948017 4702 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 03 11:07:28 crc kubenswrapper[4702]: I1203 11:07:28.953228 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Dec 03 11:07:29 crc kubenswrapper[4702]: I1203 11:07:29.017569 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Dec 03 11:07:29 crc kubenswrapper[4702]: I1203 11:07:29.101000 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Dec 03 11:07:29 crc kubenswrapper[4702]: I1203 11:07:29.147028 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 03 11:07:29 crc kubenswrapper[4702]: I1203 11:07:29.211212 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Dec 03 11:07:29 crc kubenswrapper[4702]: I1203 11:07:29.346532 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Dec 03 11:07:29 crc kubenswrapper[4702]: I1203 11:07:29.374202 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Dec 03 11:07:29 crc kubenswrapper[4702]: I1203 11:07:29.385397 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 03 11:07:29 crc kubenswrapper[4702]: I1203 11:07:29.394626 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Dec 03 11:07:29 crc kubenswrapper[4702]: I1203 11:07:29.507001 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Dec 03 11:07:29 crc kubenswrapper[4702]: I1203 11:07:29.722588 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Dec 03 11:07:29 crc kubenswrapper[4702]: I1203 11:07:29.918871 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Dec 03 11:07:29 crc kubenswrapper[4702]: I1203 11:07:29.936915 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Dec 03 11:07:29 crc kubenswrapper[4702]: I1203 11:07:29.960467 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Dec 03 11:07:29 crc kubenswrapper[4702]: I1203 11:07:29.965149 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Dec 03 11:07:30 crc kubenswrapper[4702]: I1203 11:07:30.061659 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Dec 03 11:07:30 crc kubenswrapper[4702]: I1203 11:07:30.224591 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Dec 03 11:07:30 crc kubenswrapper[4702]: I1203 11:07:30.359000 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Dec 03 11:07:30 crc kubenswrapper[4702]: I1203 11:07:30.372132 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Dec 03 11:07:30 crc
kubenswrapper[4702]: I1203 11:07:30.535934 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 11:07:30 crc kubenswrapper[4702]: I1203 11:07:30.639619 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 11:07:30 crc kubenswrapper[4702]: I1203 11:07:30.642504 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 11:07:30 crc kubenswrapper[4702]: I1203 11:07:30.672403 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 11:07:30 crc kubenswrapper[4702]: I1203 11:07:30.912592 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 11:07:30 crc kubenswrapper[4702]: I1203 11:07:30.991315 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 11:07:31 crc kubenswrapper[4702]: I1203 11:07:31.001306 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 11:07:31 crc kubenswrapper[4702]: I1203 11:07:31.025600 4702 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 11:07:31 crc kubenswrapper[4702]: I1203 11:07:31.096059 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 11:07:31 crc kubenswrapper[4702]: I1203 11:07:31.158315 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 11:07:31 crc kubenswrapper[4702]: I1203 11:07:31.185317 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 11:07:31 crc kubenswrapper[4702]: I1203 11:07:31.197913 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 11:07:31 crc kubenswrapper[4702]: I1203 11:07:31.502651 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 11:07:31 crc kubenswrapper[4702]: I1203 11:07:31.525564 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 03 11:07:31 crc kubenswrapper[4702]: I1203 11:07:31.592340 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 11:07:32 crc kubenswrapper[4702]: I1203 11:07:32.012276 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 11:07:32 crc kubenswrapper[4702]: I1203 11:07:32.062357 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 11:07:32 crc kubenswrapper[4702]: I1203 11:07:32.097547 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 11:07:32 crc kubenswrapper[4702]: I1203 11:07:32.121492 4702 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"kube-root-ca.crt" Dec 03 11:07:32 crc kubenswrapper[4702]: I1203 11:07:32.277649 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 11:07:32 crc kubenswrapper[4702]: I1203 11:07:32.285294 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 03 11:07:32 crc kubenswrapper[4702]: I1203 11:07:32.349702 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 11:07:32 crc kubenswrapper[4702]: I1203 11:07:32.690531 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 11:07:32 crc kubenswrapper[4702]: I1203 11:07:32.893903 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 11:07:32 crc kubenswrapper[4702]: I1203 11:07:32.940194 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 11:07:33 crc kubenswrapper[4702]: I1203 11:07:33.447406 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 11:07:33 crc kubenswrapper[4702]: I1203 11:07:33.535107 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 03 11:07:33 crc kubenswrapper[4702]: I1203 11:07:33.854379 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 11:07:33 crc kubenswrapper[4702]: I1203 11:07:33.866344 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 11:07:34 crc kubenswrapper[4702]: I1203 11:07:34.009897 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 11:07:34 crc kubenswrapper[4702]: I1203 11:07:34.128632 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 11:07:34 crc kubenswrapper[4702]: I1203 11:07:34.146830 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 11:07:34 crc kubenswrapper[4702]: I1203 11:07:34.210308 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 03 11:07:34 crc kubenswrapper[4702]: I1203 11:07:34.802034 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 11:07:35 crc kubenswrapper[4702]: I1203 11:07:35.448004 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 11:07:38 crc kubenswrapper[4702]: I1203 11:07:38.448834 4702 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 11:07:38 crc kubenswrapper[4702]: I1203 11:07:38.449874 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" 
containerID="cri-o://0f97d7beefb3189f3f0d686b73106b54c7a2f7ddf73b1f188b09227207f71c80" gracePeriod=5 Dec 03 11:07:43 crc kubenswrapper[4702]: I1203 11:07:43.845556 4702 generic.go:334] "Generic (PLEG): container finished" podID="5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac" containerID="8f3dc2c05062b2a0158389a759a07a63299d91fc5551494ccf9902ede78aa436" exitCode=0 Dec 03 11:07:43 crc kubenswrapper[4702]: I1203 11:07:43.845645 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" event={"ID":"5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac","Type":"ContainerDied","Data":"8f3dc2c05062b2a0158389a759a07a63299d91fc5551494ccf9902ede78aa436"} Dec 03 11:07:43 crc kubenswrapper[4702]: I1203 11:07:43.847140 4702 scope.go:117] "RemoveContainer" containerID="8f3dc2c05062b2a0158389a759a07a63299d91fc5551494ccf9902ede78aa436" Dec 03 11:07:43 crc kubenswrapper[4702]: I1203 11:07:43.853906 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 11:07:43 crc kubenswrapper[4702]: I1203 11:07:43.853983 4702 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="0f97d7beefb3189f3f0d686b73106b54c7a2f7ddf73b1f188b09227207f71c80" exitCode=137 Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.053205 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.054189 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.234931 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.235358 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.235488 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.235127 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.235692 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.235887 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.236017 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.235919 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.236065 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.236505 4702 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.236578 4702 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.236655 4702 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.236719 4702 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.251028 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.337776 4702 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.863337 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.863496 4702 scope.go:117] "RemoveContainer" containerID="0f97d7beefb3189f3f0d686b73106b54c7a2f7ddf73b1f188b09227207f71c80" Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.863562 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.867416 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" event={"ID":"5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac","Type":"ContainerStarted","Data":"90c8456ff399e325f9bb6e64ab7303e1dc7be432295e3e50672811b246d12397"} Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.868664 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.871369 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.945110 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.945389 4702 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.962495 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.962562 4702 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="08483b0a-6ba5-4092-a017-ec689af78ab2" Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.962606 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 11:07:44 crc kubenswrapper[4702]: I1203 11:07:44.962619 4702 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="08483b0a-6ba5-4092-a017-ec689af78ab2" Dec 03 11:08:06 crc kubenswrapper[4702]: I1203 11:08:06.474651 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-njlsm"] Dec 03 11:08:06 crc kubenswrapper[4702]: I1203 11:08:06.475879 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" podUID="77ca3da7-003a-49b5-9a2d-21ee53e6b5f1" containerName="controller-manager" 
containerID="cri-o://1709b7faa112102cc0f55f767d64a49310f085082b2191380a8df5c39b01996a" gracePeriod=30 Dec 03 11:08:06 crc kubenswrapper[4702]: I1203 11:08:06.576548 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk"] Dec 03 11:08:06 crc kubenswrapper[4702]: I1203 11:08:06.576931 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk" podUID="be3d3f8d-f407-4b54-8e9b-a5b526babb52" containerName="route-controller-manager" containerID="cri-o://ea1ff2051c0064c65a742815bd90820931416cbb630d35ba60434803d3bae824" gracePeriod=30 Dec 03 11:08:06 crc kubenswrapper[4702]: I1203 11:08:06.941214 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" Dec 03 11:08:06 crc kubenswrapper[4702]: I1203 11:08:06.994839 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.033663 4702 generic.go:334] "Generic (PLEG): container finished" podID="77ca3da7-003a-49b5-9a2d-21ee53e6b5f1" containerID="1709b7faa112102cc0f55f767d64a49310f085082b2191380a8df5c39b01996a" exitCode=0 Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.033807 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" event={"ID":"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1","Type":"ContainerDied","Data":"1709b7faa112102cc0f55f767d64a49310f085082b2191380a8df5c39b01996a"} Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.033858 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" event={"ID":"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1","Type":"ContainerDied","Data":"7fbd20a467293a2fb302b138dbcecaaf3433a9d09dffe8ad04200aaaf07bfd86"} Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.033888 4702 scope.go:117] "RemoveContainer" containerID="1709b7faa112102cc0f55f767d64a49310f085082b2191380a8df5c39b01996a" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.034240 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-njlsm" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.036714 4702 generic.go:334] "Generic (PLEG): container finished" podID="be3d3f8d-f407-4b54-8e9b-a5b526babb52" containerID="ea1ff2051c0064c65a742815bd90820931416cbb630d35ba60434803d3bae824" exitCode=0 Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.036765 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk" event={"ID":"be3d3f8d-f407-4b54-8e9b-a5b526babb52","Type":"ContainerDied","Data":"ea1ff2051c0064c65a742815bd90820931416cbb630d35ba60434803d3bae824"} Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.036799 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk" event={"ID":"be3d3f8d-f407-4b54-8e9b-a5b526babb52","Type":"ContainerDied","Data":"9235ef0f9c65cc4eb3dc8d9645aeec798b539946d44c9abfcff3b1e85a226017"} Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.036877 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.056261 4702 scope.go:117] "RemoveContainer" containerID="1709b7faa112102cc0f55f767d64a49310f085082b2191380a8df5c39b01996a" Dec 03 11:08:07 crc kubenswrapper[4702]: E1203 11:08:07.057098 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1709b7faa112102cc0f55f767d64a49310f085082b2191380a8df5c39b01996a\": container with ID starting with 1709b7faa112102cc0f55f767d64a49310f085082b2191380a8df5c39b01996a not found: ID does not exist" containerID="1709b7faa112102cc0f55f767d64a49310f085082b2191380a8df5c39b01996a" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.057157 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1709b7faa112102cc0f55f767d64a49310f085082b2191380a8df5c39b01996a"} err="failed to get container status \"1709b7faa112102cc0f55f767d64a49310f085082b2191380a8df5c39b01996a\": rpc error: code = NotFound desc = could not find container \"1709b7faa112102cc0f55f767d64a49310f085082b2191380a8df5c39b01996a\": container with ID starting with 1709b7faa112102cc0f55f767d64a49310f085082b2191380a8df5c39b01996a not found: ID does not exist" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.057196 4702 scope.go:117] "RemoveContainer" containerID="ea1ff2051c0064c65a742815bd90820931416cbb630d35ba60434803d3bae824" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.071630 4702 scope.go:117] "RemoveContainer" containerID="ea1ff2051c0064c65a742815bd90820931416cbb630d35ba60434803d3bae824" Dec 03 11:08:07 crc kubenswrapper[4702]: E1203 11:08:07.072248 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea1ff2051c0064c65a742815bd90820931416cbb630d35ba60434803d3bae824\": container with ID starting with ea1ff2051c0064c65a742815bd90820931416cbb630d35ba60434803d3bae824 not found: ID does not exist" containerID="ea1ff2051c0064c65a742815bd90820931416cbb630d35ba60434803d3bae824" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.072275 4702 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ea1ff2051c0064c65a742815bd90820931416cbb630d35ba60434803d3bae824"} err="failed to get container status \"ea1ff2051c0064c65a742815bd90820931416cbb630d35ba60434803d3bae824\": rpc error: code = NotFound desc = could not find container \"ea1ff2051c0064c65a742815bd90820931416cbb630d35ba60434803d3bae824\": container with ID starting with ea1ff2051c0064c65a742815bd90820931416cbb630d35ba60434803d3bae824 not found: ID does not exist" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.090057 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-serving-cert\") pod \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\" (UID: \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\") " Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.090169 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4vqb\" (UniqueName: \"kubernetes.io/projected/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-kube-api-access-x4vqb\") pod \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\" (UID: \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\") " Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.090275 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-config\") pod \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\" (UID: \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\") " Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.090542 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-proxy-ca-bundles\") pod \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\" (UID: \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\") " Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.090602 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-client-ca\") pod \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\" (UID: \"77ca3da7-003a-49b5-9a2d-21ee53e6b5f1\") " Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.092224 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "77ca3da7-003a-49b5-9a2d-21ee53e6b5f1" (UID: "77ca3da7-003a-49b5-9a2d-21ee53e6b5f1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.092236 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-client-ca" (OuterVolumeSpecName: "client-ca") pod "77ca3da7-003a-49b5-9a2d-21ee53e6b5f1" (UID: "77ca3da7-003a-49b5-9a2d-21ee53e6b5f1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.092310 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-config" (OuterVolumeSpecName: "config") pod "77ca3da7-003a-49b5-9a2d-21ee53e6b5f1" (UID: "77ca3da7-003a-49b5-9a2d-21ee53e6b5f1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.098088 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-kube-api-access-x4vqb" (OuterVolumeSpecName: "kube-api-access-x4vqb") pod "77ca3da7-003a-49b5-9a2d-21ee53e6b5f1" (UID: "77ca3da7-003a-49b5-9a2d-21ee53e6b5f1"). InnerVolumeSpecName "kube-api-access-x4vqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.098131 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "77ca3da7-003a-49b5-9a2d-21ee53e6b5f1" (UID: "77ca3da7-003a-49b5-9a2d-21ee53e6b5f1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.192164 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be3d3f8d-f407-4b54-8e9b-a5b526babb52-serving-cert\") pod \"be3d3f8d-f407-4b54-8e9b-a5b526babb52\" (UID: \"be3d3f8d-f407-4b54-8e9b-a5b526babb52\") " Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.192262 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be3d3f8d-f407-4b54-8e9b-a5b526babb52-client-ca\") pod \"be3d3f8d-f407-4b54-8e9b-a5b526babb52\" (UID: \"be3d3f8d-f407-4b54-8e9b-a5b526babb52\") " Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.192323 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be3d3f8d-f407-4b54-8e9b-a5b526babb52-config\") pod \"be3d3f8d-f407-4b54-8e9b-a5b526babb52\" (UID: \"be3d3f8d-f407-4b54-8e9b-a5b526babb52\") " Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.192403 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jstnx\" (UniqueName: \"kubernetes.io/projected/be3d3f8d-f407-4b54-8e9b-a5b526babb52-kube-api-access-jstnx\") pod \"be3d3f8d-f407-4b54-8e9b-a5b526babb52\" (UID: \"be3d3f8d-f407-4b54-8e9b-a5b526babb52\") " Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.192880 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.192903 4702 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.192917 4702 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.192928 4702 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.192942 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4vqb\" (UniqueName: 
\"kubernetes.io/projected/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1-kube-api-access-x4vqb\") on node \"crc\" DevicePath \"\"" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.193725 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be3d3f8d-f407-4b54-8e9b-a5b526babb52-client-ca" (OuterVolumeSpecName: "client-ca") pod "be3d3f8d-f407-4b54-8e9b-a5b526babb52" (UID: "be3d3f8d-f407-4b54-8e9b-a5b526babb52"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.193931 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be3d3f8d-f407-4b54-8e9b-a5b526babb52-config" (OuterVolumeSpecName: "config") pod "be3d3f8d-f407-4b54-8e9b-a5b526babb52" (UID: "be3d3f8d-f407-4b54-8e9b-a5b526babb52"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.197020 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be3d3f8d-f407-4b54-8e9b-a5b526babb52-kube-api-access-jstnx" (OuterVolumeSpecName: "kube-api-access-jstnx") pod "be3d3f8d-f407-4b54-8e9b-a5b526babb52" (UID: "be3d3f8d-f407-4b54-8e9b-a5b526babb52"). InnerVolumeSpecName "kube-api-access-jstnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.198333 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3d3f8d-f407-4b54-8e9b-a5b526babb52-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "be3d3f8d-f407-4b54-8e9b-a5b526babb52" (UID: "be3d3f8d-f407-4b54-8e9b-a5b526babb52"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.294533 4702 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be3d3f8d-f407-4b54-8e9b-a5b526babb52-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.294596 4702 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be3d3f8d-f407-4b54-8e9b-a5b526babb52-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.294608 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be3d3f8d-f407-4b54-8e9b-a5b526babb52-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.294622 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jstnx\" (UniqueName: \"kubernetes.io/projected/be3d3f8d-f407-4b54-8e9b-a5b526babb52-kube-api-access-jstnx\") on node \"crc\" DevicePath \"\"" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.419461 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-njlsm"] Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.426710 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-njlsm"] Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.438782 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk"] Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.443558 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-s62lk"] Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.911390 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s"] Dec 03 11:08:07 crc kubenswrapper[4702]: E1203 11:08:07.911746 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77ca3da7-003a-49b5-9a2d-21ee53e6b5f1" containerName="controller-manager" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.911783 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="77ca3da7-003a-49b5-9a2d-21ee53e6b5f1" containerName="controller-manager" Dec 03 11:08:07 crc kubenswrapper[4702]: E1203 11:08:07.911825 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.911832 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 11:08:07 crc kubenswrapper[4702]: E1203 11:08:07.911847 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc99dba0-244c-46eb-b4fb-b08679d5431b" containerName="installer" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.911855 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc99dba0-244c-46eb-b4fb-b08679d5431b" containerName="installer" Dec 03 11:08:07 crc kubenswrapper[4702]: E1203 11:08:07.911863 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be3d3f8d-f407-4b54-8e9b-a5b526babb52" containerName="route-controller-manager" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.911871 4702 
state_mem.go:107] "Deleted CPUSet assignment" podUID="be3d3f8d-f407-4b54-8e9b-a5b526babb52" containerName="route-controller-manager" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.911988 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="be3d3f8d-f407-4b54-8e9b-a5b526babb52" containerName="route-controller-manager" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.912012 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="77ca3da7-003a-49b5-9a2d-21ee53e6b5f1" containerName="controller-manager" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.912026 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.912036 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc99dba0-244c-46eb-b4fb-b08679d5431b" containerName="installer" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.912614 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.916040 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6"] Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.916090 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.916309 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.916496 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.916512 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.916581 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.916892 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.920921 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.927670 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.928390 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.928550 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.928580 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.928646 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.928426 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.939085 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s"] Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.941458 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 11:08:07 crc kubenswrapper[4702]: I1203 11:08:07.956013 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6"] Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.005106 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4wfh\" (UniqueName: \"kubernetes.io/projected/bf5605cf-63ed-46d8-9350-4fc022355f49-kube-api-access-w4wfh\") pod \"route-controller-manager-5c8b9fb8fb-v647s\" (UID: \"bf5605cf-63ed-46d8-9350-4fc022355f49\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.005183 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d41c80c-e115-4dca-be74-1b235a204a33-proxy-ca-bundles\") pod \"controller-manager-c7bb99f5d-ppmq6\" (UID: \"5d41c80c-e115-4dca-be74-1b235a204a33\") " pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.005234 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d41c80c-e115-4dca-be74-1b235a204a33-config\") pod \"controller-manager-c7bb99f5d-ppmq6\" (UID: \"5d41c80c-e115-4dca-be74-1b235a204a33\") " pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.005258 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bf5605cf-63ed-46d8-9350-4fc022355f49-config\") pod \"route-controller-manager-5c8b9fb8fb-v647s\" (UID: \"bf5605cf-63ed-46d8-9350-4fc022355f49\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.005298 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d7vw\" (UniqueName: \"kubernetes.io/projected/5d41c80c-e115-4dca-be74-1b235a204a33-kube-api-access-2d7vw\") pod \"controller-manager-c7bb99f5d-ppmq6\" (UID: \"5d41c80c-e115-4dca-be74-1b235a204a33\") " pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.005317 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf5605cf-63ed-46d8-9350-4fc022355f49-client-ca\") pod \"route-controller-manager-5c8b9fb8fb-v647s\" (UID: \"bf5605cf-63ed-46d8-9350-4fc022355f49\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.005595 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d41c80c-e115-4dca-be74-1b235a204a33-client-ca\") pod \"controller-manager-c7bb99f5d-ppmq6\" (UID: \"5d41c80c-e115-4dca-be74-1b235a204a33\") " pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.005730 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5605cf-63ed-46d8-9350-4fc022355f49-serving-cert\") pod \"route-controller-manager-5c8b9fb8fb-v647s\" (UID: \"bf5605cf-63ed-46d8-9350-4fc022355f49\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.005790 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d41c80c-e115-4dca-be74-1b235a204a33-serving-cert\") pod \"controller-manager-c7bb99f5d-ppmq6\" (UID: \"5d41c80c-e115-4dca-be74-1b235a204a33\") " pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.106635 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d41c80c-e115-4dca-be74-1b235a204a33-proxy-ca-bundles\") pod \"controller-manager-c7bb99f5d-ppmq6\" (UID: \"5d41c80c-e115-4dca-be74-1b235a204a33\") " pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.106710 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d41c80c-e115-4dca-be74-1b235a204a33-config\") pod \"controller-manager-c7bb99f5d-ppmq6\" (UID: \"5d41c80c-e115-4dca-be74-1b235a204a33\") " pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.106735 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5605cf-63ed-46d8-9350-4fc022355f49-config\") pod 
\"route-controller-manager-5c8b9fb8fb-v647s\" (UID: \"bf5605cf-63ed-46d8-9350-4fc022355f49\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.106784 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d7vw\" (UniqueName: \"kubernetes.io/projected/5d41c80c-e115-4dca-be74-1b235a204a33-kube-api-access-2d7vw\") pod \"controller-manager-c7bb99f5d-ppmq6\" (UID: \"5d41c80c-e115-4dca-be74-1b235a204a33\") " pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.106806 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf5605cf-63ed-46d8-9350-4fc022355f49-client-ca\") pod \"route-controller-manager-5c8b9fb8fb-v647s\" (UID: \"bf5605cf-63ed-46d8-9350-4fc022355f49\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.106846 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d41c80c-e115-4dca-be74-1b235a204a33-client-ca\") pod \"controller-manager-c7bb99f5d-ppmq6\" (UID: \"5d41c80c-e115-4dca-be74-1b235a204a33\") " pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.106875 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5605cf-63ed-46d8-9350-4fc022355f49-serving-cert\") pod \"route-controller-manager-5c8b9fb8fb-v647s\" (UID: \"bf5605cf-63ed-46d8-9350-4fc022355f49\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.106896 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d41c80c-e115-4dca-be74-1b235a204a33-serving-cert\") pod \"controller-manager-c7bb99f5d-ppmq6\" (UID: \"5d41c80c-e115-4dca-be74-1b235a204a33\") " pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.106938 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4wfh\" (UniqueName: \"kubernetes.io/projected/bf5605cf-63ed-46d8-9350-4fc022355f49-kube-api-access-w4wfh\") pod \"route-controller-manager-5c8b9fb8fb-v647s\" (UID: \"bf5605cf-63ed-46d8-9350-4fc022355f49\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.108353 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d41c80c-e115-4dca-be74-1b235a204a33-proxy-ca-bundles\") pod \"controller-manager-c7bb99f5d-ppmq6\" (UID: \"5d41c80c-e115-4dca-be74-1b235a204a33\") " pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.108899 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf5605cf-63ed-46d8-9350-4fc022355f49-client-ca\") pod \"route-controller-manager-5c8b9fb8fb-v647s\" (UID: \"bf5605cf-63ed-46d8-9350-4fc022355f49\") " 
pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.109085 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5605cf-63ed-46d8-9350-4fc022355f49-config\") pod \"route-controller-manager-5c8b9fb8fb-v647s\" (UID: \"bf5605cf-63ed-46d8-9350-4fc022355f49\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.111422 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d41c80c-e115-4dca-be74-1b235a204a33-client-ca\") pod \"controller-manager-c7bb99f5d-ppmq6\" (UID: \"5d41c80c-e115-4dca-be74-1b235a204a33\") " pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.112894 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d41c80c-e115-4dca-be74-1b235a204a33-serving-cert\") pod \"controller-manager-c7bb99f5d-ppmq6\" (UID: \"5d41c80c-e115-4dca-be74-1b235a204a33\") " pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.113189 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5605cf-63ed-46d8-9350-4fc022355f49-serving-cert\") pod \"route-controller-manager-5c8b9fb8fb-v647s\" (UID: \"bf5605cf-63ed-46d8-9350-4fc022355f49\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.119922 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d41c80c-e115-4dca-be74-1b235a204a33-config\") pod \"controller-manager-c7bb99f5d-ppmq6\" (UID: \"5d41c80c-e115-4dca-be74-1b235a204a33\") " pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.125238 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4wfh\" (UniqueName: \"kubernetes.io/projected/bf5605cf-63ed-46d8-9350-4fc022355f49-kube-api-access-w4wfh\") pod \"route-controller-manager-5c8b9fb8fb-v647s\" (UID: \"bf5605cf-63ed-46d8-9350-4fc022355f49\") " pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.127504 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d7vw\" (UniqueName: \"kubernetes.io/projected/5d41c80c-e115-4dca-be74-1b235a204a33-kube-api-access-2d7vw\") pod \"controller-manager-c7bb99f5d-ppmq6\" (UID: \"5d41c80c-e115-4dca-be74-1b235a204a33\") " pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.250306 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.259248 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.499532 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s"] Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.525860 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6"] Dec 03 11:08:08 crc kubenswrapper[4702]: W1203 11:08:08.533338 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d41c80c_e115_4dca_be74_1b235a204a33.slice/crio-892aed34ad7b18761626326a8e9913746875389eb15da8ac486a48b6ac11c86c WatchSource:0}: Error finding container 892aed34ad7b18761626326a8e9913746875389eb15da8ac486a48b6ac11c86c: Status 404 returned error can't find the container with id 892aed34ad7b18761626326a8e9913746875389eb15da8ac486a48b6ac11c86c Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.937090 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77ca3da7-003a-49b5-9a2d-21ee53e6b5f1" path="/var/lib/kubelet/pods/77ca3da7-003a-49b5-9a2d-21ee53e6b5f1/volumes" Dec 03 11:08:08 crc kubenswrapper[4702]: I1203 11:08:08.939946 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be3d3f8d-f407-4b54-8e9b-a5b526babb52" path="/var/lib/kubelet/pods/be3d3f8d-f407-4b54-8e9b-a5b526babb52/volumes" Dec 03 11:08:09 crc kubenswrapper[4702]: I1203 11:08:09.054496 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" event={"ID":"5d41c80c-e115-4dca-be74-1b235a204a33","Type":"ContainerStarted","Data":"aa0a6b3f9df0284399847471af7d8bfb530f97de2a1e82df29cc92967ca6e7d4"} Dec 03 11:08:09 crc kubenswrapper[4702]: I1203 11:08:09.054565 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" event={"ID":"5d41c80c-e115-4dca-be74-1b235a204a33","Type":"ContainerStarted","Data":"892aed34ad7b18761626326a8e9913746875389eb15da8ac486a48b6ac11c86c"} Dec 03 11:08:09 crc kubenswrapper[4702]: I1203 11:08:09.057385 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" Dec 03 11:08:09 crc kubenswrapper[4702]: I1203 11:08:09.059560 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" event={"ID":"bf5605cf-63ed-46d8-9350-4fc022355f49","Type":"ContainerStarted","Data":"ac6f36cf7eec75b2814916cdd16b5d50574c40c85292b96b026b1d71ec998f07"} Dec 03 11:08:09 crc kubenswrapper[4702]: I1203 11:08:09.059613 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" event={"ID":"bf5605cf-63ed-46d8-9350-4fc022355f49","Type":"ContainerStarted","Data":"4d83164a744a5bdbc9ab797f9091ba26655db7c06b9d6a2d8c1048c7d26cd16e"} Dec 03 11:08:09 crc kubenswrapper[4702]: I1203 11:08:09.061494 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" Dec 03 11:08:09 crc kubenswrapper[4702]: I1203 11:08:09.063220 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" Dec 03 11:08:09 crc 
kubenswrapper[4702]: I1203 11:08:09.100348 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" podStartSLOduration=3.100317986 podStartE2EDuration="3.100317986s" podCreationTimestamp="2025-12-03 11:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:08:09.080792204 +0000 UTC m=+272.916720668" watchObservedRunningTime="2025-12-03 11:08:09.100317986 +0000 UTC m=+272.936246450" Dec 03 11:08:09 crc kubenswrapper[4702]: I1203 11:08:09.133720 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" podStartSLOduration=3.1336916009999998 podStartE2EDuration="3.133691601s" podCreationTimestamp="2025-12-03 11:08:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:08:09.133600448 +0000 UTC m=+272.969528902" watchObservedRunningTime="2025-12-03 11:08:09.133691601 +0000 UTC m=+272.969620065" Dec 03 11:08:09 crc kubenswrapper[4702]: I1203 11:08:09.209974 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" Dec 03 11:08:36 crc kubenswrapper[4702]: I1203 11:08:36.643312 4702 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Dec 03 11:08:53 crc kubenswrapper[4702]: I1203 11:08:53.196378 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x5ghh"] Dec 03 11:09:06 crc kubenswrapper[4702]: I1203 11:09:06.467672 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6"] Dec 03 11:09:06 crc kubenswrapper[4702]: I1203 11:09:06.468776 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" podUID="5d41c80c-e115-4dca-be74-1b235a204a33" containerName="controller-manager" containerID="cri-o://aa0a6b3f9df0284399847471af7d8bfb530f97de2a1e82df29cc92967ca6e7d4" gracePeriod=30 Dec 03 11:09:06 crc kubenswrapper[4702]: I1203 11:09:06.484186 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s"] Dec 03 11:09:06 crc kubenswrapper[4702]: I1203 11:09:06.484900 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" podUID="bf5605cf-63ed-46d8-9350-4fc022355f49" containerName="route-controller-manager" containerID="cri-o://ac6f36cf7eec75b2814916cdd16b5d50574c40c85292b96b026b1d71ec998f07" gracePeriod=30 Dec 03 11:09:06 crc kubenswrapper[4702]: I1203 11:09:06.706101 4702 generic.go:334] "Generic (PLEG): container finished" podID="5d41c80c-e115-4dca-be74-1b235a204a33" containerID="aa0a6b3f9df0284399847471af7d8bfb530f97de2a1e82df29cc92967ca6e7d4" exitCode=0 Dec 03 11:09:06 crc kubenswrapper[4702]: I1203 11:09:06.706209 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" 
event={"ID":"5d41c80c-e115-4dca-be74-1b235a204a33","Type":"ContainerDied","Data":"aa0a6b3f9df0284399847471af7d8bfb530f97de2a1e82df29cc92967ca6e7d4"} Dec 03 11:09:06 crc kubenswrapper[4702]: I1203 11:09:06.708414 4702 generic.go:334] "Generic (PLEG): container finished" podID="bf5605cf-63ed-46d8-9350-4fc022355f49" containerID="ac6f36cf7eec75b2814916cdd16b5d50574c40c85292b96b026b1d71ec998f07" exitCode=0 Dec 03 11:09:06 crc kubenswrapper[4702]: I1203 11:09:06.708482 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" event={"ID":"bf5605cf-63ed-46d8-9350-4fc022355f49","Type":"ContainerDied","Data":"ac6f36cf7eec75b2814916cdd16b5d50574c40c85292b96b026b1d71ec998f07"} Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.414245 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fpqqw"] Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.415110 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fpqqw" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b" containerName="registry-server" containerID="cri-o://7a5e4713e65a9c6940743c60ee58adda56cd4a749f3a260a88d03b9a44ac24e2" gracePeriod=2 Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.449344 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.455227 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.474551 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5605cf-63ed-46d8-9350-4fc022355f49-config\") pod \"bf5605cf-63ed-46d8-9350-4fc022355f49\" (UID: \"bf5605cf-63ed-46d8-9350-4fc022355f49\") " Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.474636 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d7vw\" (UniqueName: \"kubernetes.io/projected/5d41c80c-e115-4dca-be74-1b235a204a33-kube-api-access-2d7vw\") pod \"5d41c80c-e115-4dca-be74-1b235a204a33\" (UID: \"5d41c80c-e115-4dca-be74-1b235a204a33\") " Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.474712 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d41c80c-e115-4dca-be74-1b235a204a33-proxy-ca-bundles\") pod \"5d41c80c-e115-4dca-be74-1b235a204a33\" (UID: \"5d41c80c-e115-4dca-be74-1b235a204a33\") " Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.476784 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4wfh\" (UniqueName: \"kubernetes.io/projected/bf5605cf-63ed-46d8-9350-4fc022355f49-kube-api-access-w4wfh\") pod \"bf5605cf-63ed-46d8-9350-4fc022355f49\" (UID: \"bf5605cf-63ed-46d8-9350-4fc022355f49\") " Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.476864 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d41c80c-e115-4dca-be74-1b235a204a33-config\") pod \"5d41c80c-e115-4dca-be74-1b235a204a33\" (UID: \"5d41c80c-e115-4dca-be74-1b235a204a33\") " Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 
11:09:07.477034 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d41c80c-e115-4dca-be74-1b235a204a33-client-ca\") pod \"5d41c80c-e115-4dca-be74-1b235a204a33\" (UID: \"5d41c80c-e115-4dca-be74-1b235a204a33\") " Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.477273 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d41c80c-e115-4dca-be74-1b235a204a33-serving-cert\") pod \"5d41c80c-e115-4dca-be74-1b235a204a33\" (UID: \"5d41c80c-e115-4dca-be74-1b235a204a33\") " Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.477310 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5605cf-63ed-46d8-9350-4fc022355f49-serving-cert\") pod \"bf5605cf-63ed-46d8-9350-4fc022355f49\" (UID: \"bf5605cf-63ed-46d8-9350-4fc022355f49\") " Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.477421 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf5605cf-63ed-46d8-9350-4fc022355f49-client-ca\") pod \"bf5605cf-63ed-46d8-9350-4fc022355f49\" (UID: \"bf5605cf-63ed-46d8-9350-4fc022355f49\") " Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.476105 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5605cf-63ed-46d8-9350-4fc022355f49-config" (OuterVolumeSpecName: "config") pod "bf5605cf-63ed-46d8-9350-4fc022355f49" (UID: "bf5605cf-63ed-46d8-9350-4fc022355f49"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.478827 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5605cf-63ed-46d8-9350-4fc022355f49-client-ca" (OuterVolumeSpecName: "client-ca") pod "bf5605cf-63ed-46d8-9350-4fc022355f49" (UID: "bf5605cf-63ed-46d8-9350-4fc022355f49"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.482783 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d41c80c-e115-4dca-be74-1b235a204a33-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5d41c80c-e115-4dca-be74-1b235a204a33" (UID: "5d41c80c-e115-4dca-be74-1b235a204a33"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.484431 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d41c80c-e115-4dca-be74-1b235a204a33-client-ca" (OuterVolumeSpecName: "client-ca") pod "5d41c80c-e115-4dca-be74-1b235a204a33" (UID: "5d41c80c-e115-4dca-be74-1b235a204a33"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.484550 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d41c80c-e115-4dca-be74-1b235a204a33-config" (OuterVolumeSpecName: "config") pod "5d41c80c-e115-4dca-be74-1b235a204a33" (UID: "5d41c80c-e115-4dca-be74-1b235a204a33"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.485526 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf5605cf-63ed-46d8-9350-4fc022355f49-kube-api-access-w4wfh" (OuterVolumeSpecName: "kube-api-access-w4wfh") pod "bf5605cf-63ed-46d8-9350-4fc022355f49" (UID: "bf5605cf-63ed-46d8-9350-4fc022355f49"). InnerVolumeSpecName "kube-api-access-w4wfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.485827 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf5605cf-63ed-46d8-9350-4fc022355f49-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bf5605cf-63ed-46d8-9350-4fc022355f49" (UID: "bf5605cf-63ed-46d8-9350-4fc022355f49"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.487008 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d41c80c-e115-4dca-be74-1b235a204a33-kube-api-access-2d7vw" (OuterVolumeSpecName: "kube-api-access-2d7vw") pod "5d41c80c-e115-4dca-be74-1b235a204a33" (UID: "5d41c80c-e115-4dca-be74-1b235a204a33"). InnerVolumeSpecName "kube-api-access-2d7vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.487640 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d41c80c-e115-4dca-be74-1b235a204a33-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5d41c80c-e115-4dca-be74-1b235a204a33" (UID: "5d41c80c-e115-4dca-be74-1b235a204a33"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.579227 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf5605cf-63ed-46d8-9350-4fc022355f49-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.579273 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d7vw\" (UniqueName: \"kubernetes.io/projected/5d41c80c-e115-4dca-be74-1b235a204a33-kube-api-access-2d7vw\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.579291 4702 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d41c80c-e115-4dca-be74-1b235a204a33-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.579301 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4wfh\" (UniqueName: \"kubernetes.io/projected/bf5605cf-63ed-46d8-9350-4fc022355f49-kube-api-access-w4wfh\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.579311 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d41c80c-e115-4dca-be74-1b235a204a33-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.579319 4702 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d41c80c-e115-4dca-be74-1b235a204a33-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.579328 4702 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d41c80c-e115-4dca-be74-1b235a204a33-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.579336 4702 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5605cf-63ed-46d8-9350-4fc022355f49-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.579344 4702 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf5605cf-63ed-46d8-9350-4fc022355f49-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.721596 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" event={"ID":"5d41c80c-e115-4dca-be74-1b235a204a33","Type":"ContainerDied","Data":"892aed34ad7b18761626326a8e9913746875389eb15da8ac486a48b6ac11c86c"} Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.722360 4702 scope.go:117] "RemoveContainer" containerID="aa0a6b3f9df0284399847471af7d8bfb530f97de2a1e82df29cc92967ca6e7d4" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.722824 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.727137 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" event={"ID":"bf5605cf-63ed-46d8-9350-4fc022355f49","Type":"ContainerDied","Data":"4d83164a744a5bdbc9ab797f9091ba26655db7c06b9d6a2d8c1048c7d26cd16e"} Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.727256 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.737897 4702 generic.go:334] "Generic (PLEG): container finished" podID="eb339a22-530c-412c-8d5a-8f9c56ab096b" containerID="7a5e4713e65a9c6940743c60ee58adda56cd4a749f3a260a88d03b9a44ac24e2" exitCode=0 Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.737928 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpqqw" event={"ID":"eb339a22-530c-412c-8d5a-8f9c56ab096b","Type":"ContainerDied","Data":"7a5e4713e65a9c6940743c60ee58adda56cd4a749f3a260a88d03b9a44ac24e2"} Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.748834 4702 scope.go:117] "RemoveContainer" containerID="ac6f36cf7eec75b2814916cdd16b5d50574c40c85292b96b026b1d71ec998f07" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.763917 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6"] Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.768670 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-c7bb99f5d-ppmq6"] Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.786112 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s"] Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.788531 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5c8b9fb8fb-v647s"] Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.836380 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fpqqw" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.885486 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqs27\" (UniqueName: \"kubernetes.io/projected/eb339a22-530c-412c-8d5a-8f9c56ab096b-kube-api-access-fqs27\") pod \"eb339a22-530c-412c-8d5a-8f9c56ab096b\" (UID: \"eb339a22-530c-412c-8d5a-8f9c56ab096b\") " Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.885561 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb339a22-530c-412c-8d5a-8f9c56ab096b-utilities\") pod \"eb339a22-530c-412c-8d5a-8f9c56ab096b\" (UID: \"eb339a22-530c-412c-8d5a-8f9c56ab096b\") " Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.885608 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb339a22-530c-412c-8d5a-8f9c56ab096b-catalog-content\") pod \"eb339a22-530c-412c-8d5a-8f9c56ab096b\" (UID: \"eb339a22-530c-412c-8d5a-8f9c56ab096b\") " Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.887692 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb339a22-530c-412c-8d5a-8f9c56ab096b-utilities" (OuterVolumeSpecName: "utilities") pod "eb339a22-530c-412c-8d5a-8f9c56ab096b" (UID: "eb339a22-530c-412c-8d5a-8f9c56ab096b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.892627 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb339a22-530c-412c-8d5a-8f9c56ab096b-kube-api-access-fqs27" (OuterVolumeSpecName: "kube-api-access-fqs27") pod "eb339a22-530c-412c-8d5a-8f9c56ab096b" (UID: "eb339a22-530c-412c-8d5a-8f9c56ab096b"). InnerVolumeSpecName "kube-api-access-fqs27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.959147 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd"] Dec 03 11:09:07 crc kubenswrapper[4702]: E1203 11:09:07.959639 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b" containerName="registry-server" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.959665 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b" containerName="registry-server" Dec 03 11:09:07 crc kubenswrapper[4702]: E1203 11:09:07.959689 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b" containerName="extract-utilities" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.959696 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b" containerName="extract-utilities" Dec 03 11:09:07 crc kubenswrapper[4702]: E1203 11:09:07.959710 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf5605cf-63ed-46d8-9350-4fc022355f49" containerName="route-controller-manager" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.959718 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5605cf-63ed-46d8-9350-4fc022355f49" containerName="route-controller-manager" Dec 03 11:09:07 crc kubenswrapper[4702]: E1203 11:09:07.959732 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d41c80c-e115-4dca-be74-1b235a204a33" containerName="controller-manager" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.959739 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d41c80c-e115-4dca-be74-1b235a204a33" containerName="controller-manager" Dec 03 11:09:07 crc kubenswrapper[4702]: E1203 11:09:07.959769 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b" containerName="extract-content" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.959776 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b" containerName="extract-content" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.959887 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf5605cf-63ed-46d8-9350-4fc022355f49" containerName="route-controller-manager" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.959900 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d41c80c-e115-4dca-be74-1b235a204a33" containerName="controller-manager" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.959911 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b" containerName="registry-server" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.960519 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.963835 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.966865 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.966927 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.967032 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.967134 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.967477 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.973076 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-548478b8dd-9254p"] Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.974713 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.979023 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.979372 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.979556 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.979625 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.980002 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.980671 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.987190 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ssxb\" (UniqueName: \"kubernetes.io/projected/e3e08d4a-20c7-430a-a3d9-988d64e6a6b4-kube-api-access-4ssxb\") pod \"route-controller-manager-84cf75c7c5-96cmd\" (UID: \"e3e08d4a-20c7-430a-a3d9-988d64e6a6b4\") " pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.987272 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3e08d4a-20c7-430a-a3d9-988d64e6a6b4-config\") pod \"route-controller-manager-84cf75c7c5-96cmd\" (UID: 
\"e3e08d4a-20c7-430a-a3d9-988d64e6a6b4\") " pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.988501 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3e08d4a-20c7-430a-a3d9-988d64e6a6b4-client-ca\") pod \"route-controller-manager-84cf75c7c5-96cmd\" (UID: \"e3e08d4a-20c7-430a-a3d9-988d64e6a6b4\") " pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.988566 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3e08d4a-20c7-430a-a3d9-988d64e6a6b4-serving-cert\") pod \"route-controller-manager-84cf75c7c5-96cmd\" (UID: \"e3e08d4a-20c7-430a-a3d9-988d64e6a6b4\") " pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.988679 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqs27\" (UniqueName: \"kubernetes.io/projected/eb339a22-530c-412c-8d5a-8f9c56ab096b-kube-api-access-fqs27\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.988745 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb339a22-530c-412c-8d5a-8f9c56ab096b-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.988803 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-548478b8dd-9254p"] Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.993210 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 11:09:07 crc kubenswrapper[4702]: I1203 11:09:07.993326 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd"] Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.028579 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb339a22-530c-412c-8d5a-8f9c56ab096b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb339a22-530c-412c-8d5a-8f9c56ab096b" (UID: "eb339a22-530c-412c-8d5a-8f9c56ab096b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.091049 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fecafd1a-bd80-46ea-8839-dbfc2d364a96-client-ca\") pod \"controller-manager-548478b8dd-9254p\" (UID: \"fecafd1a-bd80-46ea-8839-dbfc2d364a96\") " pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.091207 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3e08d4a-20c7-430a-a3d9-988d64e6a6b4-config\") pod \"route-controller-manager-84cf75c7c5-96cmd\" (UID: \"e3e08d4a-20c7-430a-a3d9-988d64e6a6b4\") " pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.091260 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3e08d4a-20c7-430a-a3d9-988d64e6a6b4-client-ca\") pod \"route-controller-manager-84cf75c7c5-96cmd\" (UID: \"e3e08d4a-20c7-430a-a3d9-988d64e6a6b4\") " pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.091285 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3e08d4a-20c7-430a-a3d9-988d64e6a6b4-serving-cert\") pod \"route-controller-manager-84cf75c7c5-96cmd\" (UID: \"e3e08d4a-20c7-430a-a3d9-988d64e6a6b4\") " pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.091304 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fecafd1a-bd80-46ea-8839-dbfc2d364a96-serving-cert\") pod \"controller-manager-548478b8dd-9254p\" (UID: \"fecafd1a-bd80-46ea-8839-dbfc2d364a96\") " pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.091340 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzsnx\" (UniqueName: \"kubernetes.io/projected/fecafd1a-bd80-46ea-8839-dbfc2d364a96-kube-api-access-vzsnx\") pod \"controller-manager-548478b8dd-9254p\" (UID: \"fecafd1a-bd80-46ea-8839-dbfc2d364a96\") " pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.091380 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fecafd1a-bd80-46ea-8839-dbfc2d364a96-config\") pod \"controller-manager-548478b8dd-9254p\" (UID: \"fecafd1a-bd80-46ea-8839-dbfc2d364a96\") " pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.091408 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fecafd1a-bd80-46ea-8839-dbfc2d364a96-proxy-ca-bundles\") pod \"controller-manager-548478b8dd-9254p\" (UID: \"fecafd1a-bd80-46ea-8839-dbfc2d364a96\") " pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" Dec 03 11:09:08 crc 
kubenswrapper[4702]: I1203 11:09:08.091440 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ssxb\" (UniqueName: \"kubernetes.io/projected/e3e08d4a-20c7-430a-a3d9-988d64e6a6b4-kube-api-access-4ssxb\") pod \"route-controller-manager-84cf75c7c5-96cmd\" (UID: \"e3e08d4a-20c7-430a-a3d9-988d64e6a6b4\") " pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.091483 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb339a22-530c-412c-8d5a-8f9c56ab096b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.092610 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e3e08d4a-20c7-430a-a3d9-988d64e6a6b4-client-ca\") pod \"route-controller-manager-84cf75c7c5-96cmd\" (UID: \"e3e08d4a-20c7-430a-a3d9-988d64e6a6b4\") " pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.093090 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3e08d4a-20c7-430a-a3d9-988d64e6a6b4-config\") pod \"route-controller-manager-84cf75c7c5-96cmd\" (UID: \"e3e08d4a-20c7-430a-a3d9-988d64e6a6b4\") " pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.096670 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3e08d4a-20c7-430a-a3d9-988d64e6a6b4-serving-cert\") pod \"route-controller-manager-84cf75c7c5-96cmd\" (UID: \"e3e08d4a-20c7-430a-a3d9-988d64e6a6b4\") " pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.109711 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ssxb\" (UniqueName: \"kubernetes.io/projected/e3e08d4a-20c7-430a-a3d9-988d64e6a6b4-kube-api-access-4ssxb\") pod \"route-controller-manager-84cf75c7c5-96cmd\" (UID: \"e3e08d4a-20c7-430a-a3d9-988d64e6a6b4\") " pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.193314 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fecafd1a-bd80-46ea-8839-dbfc2d364a96-config\") pod \"controller-manager-548478b8dd-9254p\" (UID: \"fecafd1a-bd80-46ea-8839-dbfc2d364a96\") " pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.193380 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fecafd1a-bd80-46ea-8839-dbfc2d364a96-proxy-ca-bundles\") pod \"controller-manager-548478b8dd-9254p\" (UID: \"fecafd1a-bd80-46ea-8839-dbfc2d364a96\") " pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.193449 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fecafd1a-bd80-46ea-8839-dbfc2d364a96-client-ca\") pod \"controller-manager-548478b8dd-9254p\" (UID: 
\"fecafd1a-bd80-46ea-8839-dbfc2d364a96\") " pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.193504 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fecafd1a-bd80-46ea-8839-dbfc2d364a96-serving-cert\") pod \"controller-manager-548478b8dd-9254p\" (UID: \"fecafd1a-bd80-46ea-8839-dbfc2d364a96\") " pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.193529 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzsnx\" (UniqueName: \"kubernetes.io/projected/fecafd1a-bd80-46ea-8839-dbfc2d364a96-kube-api-access-vzsnx\") pod \"controller-manager-548478b8dd-9254p\" (UID: \"fecafd1a-bd80-46ea-8839-dbfc2d364a96\") " pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.194899 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fecafd1a-bd80-46ea-8839-dbfc2d364a96-client-ca\") pod \"controller-manager-548478b8dd-9254p\" (UID: \"fecafd1a-bd80-46ea-8839-dbfc2d364a96\") " pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.195544 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fecafd1a-bd80-46ea-8839-dbfc2d364a96-proxy-ca-bundles\") pod \"controller-manager-548478b8dd-9254p\" (UID: \"fecafd1a-bd80-46ea-8839-dbfc2d364a96\") " pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.196150 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fecafd1a-bd80-46ea-8839-dbfc2d364a96-config\") pod \"controller-manager-548478b8dd-9254p\" (UID: \"fecafd1a-bd80-46ea-8839-dbfc2d364a96\") " pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.197823 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fecafd1a-bd80-46ea-8839-dbfc2d364a96-serving-cert\") pod \"controller-manager-548478b8dd-9254p\" (UID: \"fecafd1a-bd80-46ea-8839-dbfc2d364a96\") " pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.214412 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzsnx\" (UniqueName: \"kubernetes.io/projected/fecafd1a-bd80-46ea-8839-dbfc2d364a96-kube-api-access-vzsnx\") pod \"controller-manager-548478b8dd-9254p\" (UID: \"fecafd1a-bd80-46ea-8839-dbfc2d364a96\") " pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.289788 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.301701 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.572453 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-548478b8dd-9254p"] Dec 03 11:09:08 crc kubenswrapper[4702]: W1203 11:09:08.730193 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3e08d4a_20c7_430a_a3d9_988d64e6a6b4.slice/crio-bdbc3fbadaaadfc001463bc1f667373dc2a2fb269229eaf385570dba31d314c3 WatchSource:0}: Error finding container bdbc3fbadaaadfc001463bc1f667373dc2a2fb269229eaf385570dba31d314c3: Status 404 returned error can't find the container with id bdbc3fbadaaadfc001463bc1f667373dc2a2fb269229eaf385570dba31d314c3 Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.731291 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd"] Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.745540 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" event={"ID":"e3e08d4a-20c7-430a-a3d9-988d64e6a6b4","Type":"ContainerStarted","Data":"bdbc3fbadaaadfc001463bc1f667373dc2a2fb269229eaf385570dba31d314c3"} Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.762388 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fpqqw" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.762652 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpqqw" event={"ID":"eb339a22-530c-412c-8d5a-8f9c56ab096b","Type":"ContainerDied","Data":"691efbc93d85d643d5ec1eef619c643f7ea59e6fb3e7fa439d39062a35c148d6"} Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.763349 4702 scope.go:117] "RemoveContainer" containerID="7a5e4713e65a9c6940743c60ee58adda56cd4a749f3a260a88d03b9a44ac24e2" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.768301 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" event={"ID":"fecafd1a-bd80-46ea-8839-dbfc2d364a96","Type":"ContainerStarted","Data":"498cbcafc4576968069a9524978a00fe463e429dcf91b5f85630fda236bf0391"} Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.768576 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.770135 4702 patch_prober.go:28] interesting pod/controller-manager-548478b8dd-9254p container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body= Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.770214 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" podUID="fecafd1a-bd80-46ea-8839-dbfc2d364a96" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.786738 4702 scope.go:117] "RemoveContainer" 
containerID="b443e21f51852a32594c3c8931400625a6d6890bfbcde1e322af6494d9c3e78d" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.810111 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" podStartSLOduration=2.810086322 podStartE2EDuration="2.810086322s" podCreationTimestamp="2025-12-03 11:09:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:09:08.806702076 +0000 UTC m=+332.642630560" watchObservedRunningTime="2025-12-03 11:09:08.810086322 +0000 UTC m=+332.646014806" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.831360 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fpqqw"] Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.838181 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fpqqw"] Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.844242 4702 scope.go:117] "RemoveContainer" containerID="a129c7b8d2bd8cdedb556a73ca49a884a02c0d63b5e8661d280b9f9444e9e2b2" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.937924 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d41c80c-e115-4dca-be74-1b235a204a33" path="/var/lib/kubelet/pods/5d41c80c-e115-4dca-be74-1b235a204a33/volumes" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.939998 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf5605cf-63ed-46d8-9350-4fc022355f49" path="/var/lib/kubelet/pods/bf5605cf-63ed-46d8-9350-4fc022355f49/volumes" Dec 03 11:09:08 crc kubenswrapper[4702]: I1203 11:09:08.940611 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb339a22-530c-412c-8d5a-8f9c56ab096b" path="/var/lib/kubelet/pods/eb339a22-530c-412c-8d5a-8f9c56ab096b/volumes" Dec 03 11:09:09 crc kubenswrapper[4702]: I1203 11:09:09.777797 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" event={"ID":"fecafd1a-bd80-46ea-8839-dbfc2d364a96","Type":"ContainerStarted","Data":"97a0cf5f74bb303168d42c86dbc2d9abc4f6775d6c7ffd7df4510df0fb0a6fc2"} Dec 03 11:09:09 crc kubenswrapper[4702]: I1203 11:09:09.780797 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" event={"ID":"e3e08d4a-20c7-430a-a3d9-988d64e6a6b4","Type":"ContainerStarted","Data":"4263520eca4d7e37a135b540dd1b3cfa0adf6e70430cc456be62501d5508459b"} Dec 03 11:09:09 crc kubenswrapper[4702]: I1203 11:09:09.781006 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" Dec 03 11:09:09 crc kubenswrapper[4702]: I1203 11:09:09.783514 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" Dec 03 11:09:09 crc kubenswrapper[4702]: I1203 11:09:09.786366 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" Dec 03 11:09:09 crc kubenswrapper[4702]: I1203 11:09:09.825585 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" podStartSLOduration=3.8255474229999997 
podStartE2EDuration="3.825547423s" podCreationTimestamp="2025-12-03 11:09:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:09:09.804170308 +0000 UTC m=+333.640098802" watchObservedRunningTime="2025-12-03 11:09:09.825547423 +0000 UTC m=+333.661475887" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.720688 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8p7q4"] Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.721813 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.749495 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8p7q4"] Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.752398 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11c38a7f-7709-4e96-b309-c987c2301610-bound-sa-token\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.752479 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11c38a7f-7709-4e96-b309-c987c2301610-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.752545 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.752589 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11c38a7f-7709-4e96-b309-c987c2301610-trusted-ca\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.752640 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82vb5\" (UniqueName: \"kubernetes.io/projected/11c38a7f-7709-4e96-b309-c987c2301610-kube-api-access-82vb5\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.752674 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11c38a7f-7709-4e96-b309-c987c2301610-registry-tls\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 
crc kubenswrapper[4702]: I1203 11:09:11.752748 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11c38a7f-7709-4e96-b309-c987c2301610-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.752806 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11c38a7f-7709-4e96-b309-c987c2301610-registry-certificates\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.781911 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.854563 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11c38a7f-7709-4e96-b309-c987c2301610-bound-sa-token\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.854627 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11c38a7f-7709-4e96-b309-c987c2301610-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.854677 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11c38a7f-7709-4e96-b309-c987c2301610-trusted-ca\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.854718 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82vb5\" (UniqueName: \"kubernetes.io/projected/11c38a7f-7709-4e96-b309-c987c2301610-kube-api-access-82vb5\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.854792 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11c38a7f-7709-4e96-b309-c987c2301610-registry-tls\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.854876 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" 
(UniqueName: \"kubernetes.io/empty-dir/11c38a7f-7709-4e96-b309-c987c2301610-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.854911 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11c38a7f-7709-4e96-b309-c987c2301610-registry-certificates\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.855865 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11c38a7f-7709-4e96-b309-c987c2301610-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.856443 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11c38a7f-7709-4e96-b309-c987c2301610-registry-certificates\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.857570 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11c38a7f-7709-4e96-b309-c987c2301610-trusted-ca\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.864092 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11c38a7f-7709-4e96-b309-c987c2301610-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.865582 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11c38a7f-7709-4e96-b309-c987c2301610-registry-tls\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.874618 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82vb5\" (UniqueName: \"kubernetes.io/projected/11c38a7f-7709-4e96-b309-c987c2301610-kube-api-access-82vb5\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:11 crc kubenswrapper[4702]: I1203 11:09:11.876646 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11c38a7f-7709-4e96-b309-c987c2301610-bound-sa-token\") pod \"image-registry-66df7c8f76-8p7q4\" (UID: \"11c38a7f-7709-4e96-b309-c987c2301610\") " pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:12 crc 
kubenswrapper[4702]: I1203 11:09:12.043404 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:12 crc kubenswrapper[4702]: I1203 11:09:12.491849 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8p7q4"] Dec 03 11:09:12 crc kubenswrapper[4702]: I1203 11:09:12.803005 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" event={"ID":"11c38a7f-7709-4e96-b309-c987c2301610","Type":"ContainerStarted","Data":"405aade30a4ba8cbebc37273f1526c8dc87b7b5e42fbd2a6c970715bbb7c6e90"} Dec 03 11:09:12 crc kubenswrapper[4702]: I1203 11:09:12.803415 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" event={"ID":"11c38a7f-7709-4e96-b309-c987c2301610","Type":"ContainerStarted","Data":"7135ec85e469dbbe50938c7250649d30aad68a0a7b06b6698590a9ed8a8ec172"} Dec 03 11:09:12 crc kubenswrapper[4702]: I1203 11:09:12.803442 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.231382 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" podUID="2c99e1fd-b0d0-418c-bb67-f638f06978f2" containerName="oauth-openshift" containerID="cri-o://47242111bf043739cd815bd126838db9befac36e379f9cbcf882cd2ddd32d21f" gracePeriod=15 Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.744353 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.781516 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" podStartSLOduration=7.781468126 podStartE2EDuration="7.781468126s" podCreationTimestamp="2025-12-03 11:09:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:09:12.829074576 +0000 UTC m=+336.665003040" watchObservedRunningTime="2025-12-03 11:09:18.781468126 +0000 UTC m=+342.617396590" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.791284 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6548f7c795-c9kwd"] Dec 03 11:09:18 crc kubenswrapper[4702]: E1203 11:09:18.791687 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c99e1fd-b0d0-418c-bb67-f638f06978f2" containerName="oauth-openshift" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.791706 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c99e1fd-b0d0-418c-bb67-f638f06978f2" containerName="oauth-openshift" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.791869 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c99e1fd-b0d0-418c-bb67-f638f06978f2" containerName="oauth-openshift" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.792661 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.804745 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6548f7c795-c9kwd"] Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.846247 4702 generic.go:334] "Generic (PLEG): container finished" podID="2c99e1fd-b0d0-418c-bb67-f638f06978f2" containerID="47242111bf043739cd815bd126838db9befac36e379f9cbcf882cd2ddd32d21f" exitCode=0 Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.846314 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" event={"ID":"2c99e1fd-b0d0-418c-bb67-f638f06978f2","Type":"ContainerDied","Data":"47242111bf043739cd815bd126838db9befac36e379f9cbcf882cd2ddd32d21f"} Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.846401 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" event={"ID":"2c99e1fd-b0d0-418c-bb67-f638f06978f2","Type":"ContainerDied","Data":"78a64ebe5201bc8463e35c3756c37f28d62fba9956894c5dbb7fbeeee9158870"} Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.846429 4702 scope.go:117] "RemoveContainer" containerID="47242111bf043739cd815bd126838db9befac36e379f9cbcf882cd2ddd32d21f" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.846608 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-x5ghh" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.867054 4702 scope.go:117] "RemoveContainer" containerID="47242111bf043739cd815bd126838db9befac36e379f9cbcf882cd2ddd32d21f" Dec 03 11:09:18 crc kubenswrapper[4702]: E1203 11:09:18.867542 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47242111bf043739cd815bd126838db9befac36e379f9cbcf882cd2ddd32d21f\": container with ID starting with 47242111bf043739cd815bd126838db9befac36e379f9cbcf882cd2ddd32d21f not found: ID does not exist" containerID="47242111bf043739cd815bd126838db9befac36e379f9cbcf882cd2ddd32d21f" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.867591 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47242111bf043739cd815bd126838db9befac36e379f9cbcf882cd2ddd32d21f"} err="failed to get container status \"47242111bf043739cd815bd126838db9befac36e379f9cbcf882cd2ddd32d21f\": rpc error: code = NotFound desc = could not find container \"47242111bf043739cd815bd126838db9befac36e379f9cbcf882cd2ddd32d21f\": container with ID starting with 47242111bf043739cd815bd126838db9befac36e379f9cbcf882cd2ddd32d21f not found: ID does not exist" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.875365 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-cliconfig\") pod \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.875408 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-audit-policies\") pod \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " Dec 03 
11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.875502 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-serving-cert\") pod \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.875548 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-template-provider-selection\") pod \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.875575 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-service-ca\") pod \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.875629 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-idp-0-file-data\") pod \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.875666 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-template-login\") pod \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.875702 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-template-error\") pod \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.875750 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5qgm\" (UniqueName: \"kubernetes.io/projected/2c99e1fd-b0d0-418c-bb67-f638f06978f2-kube-api-access-r5qgm\") pod \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.875845 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2c99e1fd-b0d0-418c-bb67-f638f06978f2-audit-dir\") pod \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.875908 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-ocp-branding-template\") pod \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.875934 4702 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-session\") pod \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.876268 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c99e1fd-b0d0-418c-bb67-f638f06978f2-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "2c99e1fd-b0d0-418c-bb67-f638f06978f2" (UID: "2c99e1fd-b0d0-418c-bb67-f638f06978f2"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.876898 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "2c99e1fd-b0d0-418c-bb67-f638f06978f2" (UID: "2c99e1fd-b0d0-418c-bb67-f638f06978f2"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.876955 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "2c99e1fd-b0d0-418c-bb67-f638f06978f2" (UID: "2c99e1fd-b0d0-418c-bb67-f638f06978f2"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.877375 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "2c99e1fd-b0d0-418c-bb67-f638f06978f2" (UID: "2c99e1fd-b0d0-418c-bb67-f638f06978f2"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.877651 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-trusted-ca-bundle\") pod \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.877991 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-router-certs\") pod \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\" (UID: \"2c99e1fd-b0d0-418c-bb67-f638f06978f2\") " Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.878164 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "2c99e1fd-b0d0-418c-bb67-f638f06978f2" (UID: "2c99e1fd-b0d0-418c-bb67-f638f06978f2"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.878419 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.878494 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-system-service-ca\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.878535 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.878576 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0c75375b-08b0-4a81-adca-de576c8ff268-audit-policies\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.878598 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-system-router-certs\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.878655 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.878781 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htxrt\" (UniqueName: \"kubernetes.io/projected/0c75375b-08b0-4a81-adca-de576c8ff268-kube-api-access-htxrt\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.878836 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-user-template-login\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.878900 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.878999 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-system-session\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.879055 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.879108 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.879138 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-user-template-error\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.879284 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0c75375b-08b0-4a81-adca-de576c8ff268-audit-dir\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.879611 4702 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.879647 4702 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2c99e1fd-b0d0-418c-bb67-f638f06978f2-audit-dir\") on node \"crc\" DevicePath 
\"\"" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.879673 4702 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.879691 4702 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.879706 4702 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2c99e1fd-b0d0-418c-bb67-f638f06978f2-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.882937 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "2c99e1fd-b0d0-418c-bb67-f638f06978f2" (UID: "2c99e1fd-b0d0-418c-bb67-f638f06978f2"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.883051 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c99e1fd-b0d0-418c-bb67-f638f06978f2-kube-api-access-r5qgm" (OuterVolumeSpecName: "kube-api-access-r5qgm") pod "2c99e1fd-b0d0-418c-bb67-f638f06978f2" (UID: "2c99e1fd-b0d0-418c-bb67-f638f06978f2"). InnerVolumeSpecName "kube-api-access-r5qgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.883138 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "2c99e1fd-b0d0-418c-bb67-f638f06978f2" (UID: "2c99e1fd-b0d0-418c-bb67-f638f06978f2"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.883331 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "2c99e1fd-b0d0-418c-bb67-f638f06978f2" (UID: "2c99e1fd-b0d0-418c-bb67-f638f06978f2"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.883692 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "2c99e1fd-b0d0-418c-bb67-f638f06978f2" (UID: "2c99e1fd-b0d0-418c-bb67-f638f06978f2"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.884186 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "2c99e1fd-b0d0-418c-bb67-f638f06978f2" (UID: "2c99e1fd-b0d0-418c-bb67-f638f06978f2"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.884905 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "2c99e1fd-b0d0-418c-bb67-f638f06978f2" (UID: "2c99e1fd-b0d0-418c-bb67-f638f06978f2"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.888023 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "2c99e1fd-b0d0-418c-bb67-f638f06978f2" (UID: "2c99e1fd-b0d0-418c-bb67-f638f06978f2"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.888139 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "2c99e1fd-b0d0-418c-bb67-f638f06978f2" (UID: "2c99e1fd-b0d0-418c-bb67-f638f06978f2"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.980931 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.981005 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-system-session\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.981040 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.981096 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.981123 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-user-template-error\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.981148 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0c75375b-08b0-4a81-adca-de576c8ff268-audit-dir\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.981213 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.981267 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-system-service-ca\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " 
pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.981299 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.981330 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0c75375b-08b0-4a81-adca-de576c8ff268-audit-policies\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.981357 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-system-router-certs\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.981392 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.981443 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htxrt\" (UniqueName: \"kubernetes.io/projected/0c75375b-08b0-4a81-adca-de576c8ff268-kube-api-access-htxrt\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.981470 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-user-template-login\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.981532 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5qgm\" (UniqueName: \"kubernetes.io/projected/2c99e1fd-b0d0-418c-bb67-f638f06978f2-kube-api-access-r5qgm\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.981550 4702 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.981565 4702 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.981579 4702 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.981593 4702 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.981610 4702 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.981628 4702 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.981645 4702 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.981659 4702 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2c99e1fd-b0d0-418c-bb67-f638f06978f2-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.983622 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0c75375b-08b0-4a81-adca-de576c8ff268-audit-dir\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.984149 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0c75375b-08b0-4a81-adca-de576c8ff268-audit-policies\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.985496 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.985690 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-system-service-ca\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: 
\"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.985708 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-user-template-error\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.985818 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-system-session\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.985886 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-system-router-certs\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.985896 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.986778 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.986914 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.987508 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.988417 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-user-template-login\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") 
" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:18 crc kubenswrapper[4702]: I1203 11:09:18.988943 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c75375b-08b0-4a81-adca-de576c8ff268-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:19 crc kubenswrapper[4702]: I1203 11:09:19.001372 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htxrt\" (UniqueName: \"kubernetes.io/projected/0c75375b-08b0-4a81-adca-de576c8ff268-kube-api-access-htxrt\") pod \"oauth-openshift-6548f7c795-c9kwd\" (UID: \"0c75375b-08b0-4a81-adca-de576c8ff268\") " pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:19 crc kubenswrapper[4702]: I1203 11:09:19.113458 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:19 crc kubenswrapper[4702]: I1203 11:09:19.175343 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x5ghh"] Dec 03 11:09:19 crc kubenswrapper[4702]: I1203 11:09:19.180182 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-x5ghh"] Dec 03 11:09:19 crc kubenswrapper[4702]: I1203 11:09:19.567880 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6548f7c795-c9kwd"] Dec 03 11:09:19 crc kubenswrapper[4702]: I1203 11:09:19.856357 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" event={"ID":"0c75375b-08b0-4a81-adca-de576c8ff268","Type":"ContainerStarted","Data":"82d4a7852cc4b8c591da6155d317404075ba10955b5ad451cbfdf264c7ddb9e3"} Dec 03 11:09:20 crc kubenswrapper[4702]: I1203 11:09:20.867334 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" event={"ID":"0c75375b-08b0-4a81-adca-de576c8ff268","Type":"ContainerStarted","Data":"ed23f0bb952245c24c0ac2a111b381a15a1284a0fcaa9a57735a4ef6e66d92d0"} Dec 03 11:09:20 crc kubenswrapper[4702]: I1203 11:09:20.868025 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:20 crc kubenswrapper[4702]: I1203 11:09:20.873745 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 11:09:20 crc kubenswrapper[4702]: I1203 11:09:20.892180 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" podStartSLOduration=27.892153839 podStartE2EDuration="27.892153839s" podCreationTimestamp="2025-12-03 11:08:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:09:20.890946455 +0000 UTC m=+344.726874939" watchObservedRunningTime="2025-12-03 11:09:20.892153839 +0000 UTC m=+344.728082293" Dec 03 11:09:20 crc kubenswrapper[4702]: I1203 11:09:20.946896 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c99e1fd-b0d0-418c-bb67-f638f06978f2" 
path="/var/lib/kubelet/pods/2c99e1fd-b0d0-418c-bb67-f638f06978f2/volumes" Dec 03 11:09:32 crc kubenswrapper[4702]: I1203 11:09:32.049499 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" Dec 03 11:09:32 crc kubenswrapper[4702]: I1203 11:09:32.105265 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lnnbr"] Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.017845 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wv7dh"] Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.020436 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wv7dh" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29" containerName="registry-server" containerID="cri-o://1ad807b9e0da9076b57731a76e4692aac411f464302750b70ab80c0fa3ea8840" gracePeriod=30 Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.037450 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jsndz"] Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.037881 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jsndz" podUID="aefd671c-4583-4057-aebd-1c8c7931771f" containerName="registry-server" containerID="cri-o://eaba7e4210f58e873b63b66d9792fa6a77ebc20733ed7101fe6c3b1c1a89b510" gracePeriod=30 Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.047936 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ql9kq"] Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.048310 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" podUID="5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac" containerName="marketplace-operator" containerID="cri-o://90c8456ff399e325f9bb6e64ab7303e1dc7be432295e3e50672811b246d12397" gracePeriod=30 Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.068453 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tgvqz"] Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.068861 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tgvqz" podUID="ab09df47-f81e-4b91-aee1-89e919c149ee" containerName="registry-server" containerID="cri-o://cf9764f585379a0ca8d1e33531d673196cd28a77ff8f9a5cac941629860b06dd" gracePeriod=30 Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.075714 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sn9wb"] Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.076117 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sn9wb" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf" containerName="registry-server" containerID="cri-o://108d1ece4e32d9094b71e3763d5974cb6d0e5ca2efaf4cbdc0115196dac90d30" gracePeriod=30 Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.081181 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-79n82"] Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.084869 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-79n82" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.102371 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jggt7\" (UniqueName: \"kubernetes.io/projected/f2c1609d-33a3-444f-9370-24495b15b3e0-kube-api-access-jggt7\") pod \"marketplace-operator-79b997595-79n82\" (UID: \"f2c1609d-33a3-444f-9370-24495b15b3e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-79n82" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.102458 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f2c1609d-33a3-444f-9370-24495b15b3e0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-79n82\" (UID: \"f2c1609d-33a3-444f-9370-24495b15b3e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-79n82" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.102511 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2c1609d-33a3-444f-9370-24495b15b3e0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-79n82\" (UID: \"f2c1609d-33a3-444f-9370-24495b15b3e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-79n82" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.104735 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-79n82"] Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.204366 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jggt7\" (UniqueName: \"kubernetes.io/projected/f2c1609d-33a3-444f-9370-24495b15b3e0-kube-api-access-jggt7\") pod \"marketplace-operator-79b997595-79n82\" (UID: \"f2c1609d-33a3-444f-9370-24495b15b3e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-79n82" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.204859 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f2c1609d-33a3-444f-9370-24495b15b3e0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-79n82\" (UID: \"f2c1609d-33a3-444f-9370-24495b15b3e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-79n82" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.204894 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2c1609d-33a3-444f-9370-24495b15b3e0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-79n82\" (UID: \"f2c1609d-33a3-444f-9370-24495b15b3e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-79n82" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.207466 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2c1609d-33a3-444f-9370-24495b15b3e0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-79n82\" (UID: \"f2c1609d-33a3-444f-9370-24495b15b3e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-79n82" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.222117 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/f2c1609d-33a3-444f-9370-24495b15b3e0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-79n82\" (UID: \"f2c1609d-33a3-444f-9370-24495b15b3e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-79n82" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.229367 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jggt7\" (UniqueName: \"kubernetes.io/projected/f2c1609d-33a3-444f-9370-24495b15b3e0-kube-api-access-jggt7\") pod \"marketplace-operator-79b997595-79n82\" (UID: \"f2c1609d-33a3-444f-9370-24495b15b3e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-79n82" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.533741 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-79n82" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.649433 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wv7dh" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.716205 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/085dd40d-8d1f-40ce-903b-fbed55010a29-catalog-content\") pod \"085dd40d-8d1f-40ce-903b-fbed55010a29\" (UID: \"085dd40d-8d1f-40ce-903b-fbed55010a29\") " Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.716291 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9g8s5\" (UniqueName: \"kubernetes.io/projected/085dd40d-8d1f-40ce-903b-fbed55010a29-kube-api-access-9g8s5\") pod \"085dd40d-8d1f-40ce-903b-fbed55010a29\" (UID: \"085dd40d-8d1f-40ce-903b-fbed55010a29\") " Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.716348 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/085dd40d-8d1f-40ce-903b-fbed55010a29-utilities\") pod \"085dd40d-8d1f-40ce-903b-fbed55010a29\" (UID: \"085dd40d-8d1f-40ce-903b-fbed55010a29\") " Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.717772 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/085dd40d-8d1f-40ce-903b-fbed55010a29-utilities" (OuterVolumeSpecName: "utilities") pod "085dd40d-8d1f-40ce-903b-fbed55010a29" (UID: "085dd40d-8d1f-40ce-903b-fbed55010a29"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.721875 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/085dd40d-8d1f-40ce-903b-fbed55010a29-kube-api-access-9g8s5" (OuterVolumeSpecName: "kube-api-access-9g8s5") pod "085dd40d-8d1f-40ce-903b-fbed55010a29" (UID: "085dd40d-8d1f-40ce-903b-fbed55010a29"). InnerVolumeSpecName "kube-api-access-9g8s5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.790177 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/085dd40d-8d1f-40ce-903b-fbed55010a29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "085dd40d-8d1f-40ce-903b-fbed55010a29" (UID: "085dd40d-8d1f-40ce-903b-fbed55010a29"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.818486 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/085dd40d-8d1f-40ce-903b-fbed55010a29-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.818966 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9g8s5\" (UniqueName: \"kubernetes.io/projected/085dd40d-8d1f-40ce-903b-fbed55010a29-kube-api-access-9g8s5\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.818992 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/085dd40d-8d1f-40ce-903b-fbed55010a29-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.879870 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sn9wb" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.885268 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tgvqz" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.907617 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.920583 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac-marketplace-trusted-ca\") pod \"5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac\" (UID: \"5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac\") " Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.920694 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llwlk\" (UniqueName: \"kubernetes.io/projected/ab09df47-f81e-4b91-aee1-89e919c149ee-kube-api-access-llwlk\") pod \"ab09df47-f81e-4b91-aee1-89e919c149ee\" (UID: \"ab09df47-f81e-4b91-aee1-89e919c149ee\") " Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.920721 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnt5j\" (UniqueName: \"kubernetes.io/projected/b806ad42-5c69-4ea6-8d36-fb54595132bf-kube-api-access-qnt5j\") pod \"b806ad42-5c69-4ea6-8d36-fb54595132bf\" (UID: \"b806ad42-5c69-4ea6-8d36-fb54595132bf\") " Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.920743 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qb4fg\" (UniqueName: \"kubernetes.io/projected/5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac-kube-api-access-qb4fg\") pod \"5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac\" (UID: \"5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac\") " Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.920790 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab09df47-f81e-4b91-aee1-89e919c149ee-utilities\") pod \"ab09df47-f81e-4b91-aee1-89e919c149ee\" (UID: \"ab09df47-f81e-4b91-aee1-89e919c149ee\") " Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.920838 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac-marketplace-operator-metrics\") pod \"5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac\" (UID: \"5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac\") " Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.920864 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab09df47-f81e-4b91-aee1-89e919c149ee-catalog-content\") pod \"ab09df47-f81e-4b91-aee1-89e919c149ee\" (UID: \"ab09df47-f81e-4b91-aee1-89e919c149ee\") " Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.920938 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b806ad42-5c69-4ea6-8d36-fb54595132bf-utilities\") pod \"b806ad42-5c69-4ea6-8d36-fb54595132bf\" (UID: \"b806ad42-5c69-4ea6-8d36-fb54595132bf\") " Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.920979 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b806ad42-5c69-4ea6-8d36-fb54595132bf-catalog-content\") pod \"b806ad42-5c69-4ea6-8d36-fb54595132bf\" (UID: \"b806ad42-5c69-4ea6-8d36-fb54595132bf\") " Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.926899 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac" (UID: "5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.927419 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab09df47-f81e-4b91-aee1-89e919c149ee-utilities" (OuterVolumeSpecName: "utilities") pod "ab09df47-f81e-4b91-aee1-89e919c149ee" (UID: "ab09df47-f81e-4b91-aee1-89e919c149ee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.929270 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b806ad42-5c69-4ea6-8d36-fb54595132bf-utilities" (OuterVolumeSpecName: "utilities") pod "b806ad42-5c69-4ea6-8d36-fb54595132bf" (UID: "b806ad42-5c69-4ea6-8d36-fb54595132bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.932227 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b806ad42-5c69-4ea6-8d36-fb54595132bf-kube-api-access-qnt5j" (OuterVolumeSpecName: "kube-api-access-qnt5j") pod "b806ad42-5c69-4ea6-8d36-fb54595132bf" (UID: "b806ad42-5c69-4ea6-8d36-fb54595132bf"). InnerVolumeSpecName "kube-api-access-qnt5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.946990 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac" (UID: "5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.947263 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab09df47-f81e-4b91-aee1-89e919c149ee-kube-api-access-llwlk" (OuterVolumeSpecName: "kube-api-access-llwlk") pod "ab09df47-f81e-4b91-aee1-89e919c149ee" (UID: "ab09df47-f81e-4b91-aee1-89e919c149ee"). InnerVolumeSpecName "kube-api-access-llwlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.948575 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac-kube-api-access-qb4fg" (OuterVolumeSpecName: "kube-api-access-qb4fg") pod "5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac" (UID: "5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac"). InnerVolumeSpecName "kube-api-access-qb4fg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.958934 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jsndz" Dec 03 11:09:38 crc kubenswrapper[4702]: I1203 11:09:38.993459 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab09df47-f81e-4b91-aee1-89e919c149ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab09df47-f81e-4b91-aee1-89e919c149ee" (UID: "ab09df47-f81e-4b91-aee1-89e919c149ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.006684 4702 generic.go:334] "Generic (PLEG): container finished" podID="aefd671c-4583-4057-aebd-1c8c7931771f" containerID="eaba7e4210f58e873b63b66d9792fa6a77ebc20733ed7101fe6c3b1c1a89b510" exitCode=0 Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.006817 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jsndz" event={"ID":"aefd671c-4583-4057-aebd-1c8c7931771f","Type":"ContainerDied","Data":"eaba7e4210f58e873b63b66d9792fa6a77ebc20733ed7101fe6c3b1c1a89b510"} Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.006874 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jsndz" event={"ID":"aefd671c-4583-4057-aebd-1c8c7931771f","Type":"ContainerDied","Data":"0b3aff508753457253d53bff688c1c73660e4a63ca6006b9a8f80e7b516510df"} Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.006912 4702 scope.go:117] "RemoveContainer" containerID="eaba7e4210f58e873b63b66d9792fa6a77ebc20733ed7101fe6c3b1c1a89b510" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.007159 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jsndz" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.015162 4702 generic.go:334] "Generic (PLEG): container finished" podID="5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac" containerID="90c8456ff399e325f9bb6e64ab7303e1dc7be432295e3e50672811b246d12397" exitCode=0 Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.015283 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" event={"ID":"5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac","Type":"ContainerDied","Data":"90c8456ff399e325f9bb6e64ab7303e1dc7be432295e3e50672811b246d12397"} Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.015328 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" event={"ID":"5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac","Type":"ContainerDied","Data":"92a4c34d8167b72f696e7e0c2457a87c45e8b2b8bd8e011546f92f1f3489ec3b"} Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.015435 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ql9kq" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.021604 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aefd671c-4583-4057-aebd-1c8c7931771f-catalog-content\") pod \"aefd671c-4583-4057-aebd-1c8c7931771f\" (UID: \"aefd671c-4583-4057-aebd-1c8c7931771f\") " Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.021694 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbhmz\" (UniqueName: \"kubernetes.io/projected/aefd671c-4583-4057-aebd-1c8c7931771f-kube-api-access-hbhmz\") pod \"aefd671c-4583-4057-aebd-1c8c7931771f\" (UID: \"aefd671c-4583-4057-aebd-1c8c7931771f\") " Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.021807 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aefd671c-4583-4057-aebd-1c8c7931771f-utilities\") pod \"aefd671c-4583-4057-aebd-1c8c7931771f\" (UID: \"aefd671c-4583-4057-aebd-1c8c7931771f\") " Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.022002 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llwlk\" (UniqueName: \"kubernetes.io/projected/ab09df47-f81e-4b91-aee1-89e919c149ee-kube-api-access-llwlk\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.022016 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnt5j\" (UniqueName: \"kubernetes.io/projected/b806ad42-5c69-4ea6-8d36-fb54595132bf-kube-api-access-qnt5j\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.022026 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qb4fg\" (UniqueName: \"kubernetes.io/projected/5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac-kube-api-access-qb4fg\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.022036 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab09df47-f81e-4b91-aee1-89e919c149ee-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.022046 4702 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.022055 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab09df47-f81e-4b91-aee1-89e919c149ee-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.022064 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b806ad42-5c69-4ea6-8d36-fb54595132bf-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.022072 4702 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.022940 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aefd671c-4583-4057-aebd-1c8c7931771f-utilities" (OuterVolumeSpecName: "utilities") pod "aefd671c-4583-4057-aebd-1c8c7931771f" (UID: "aefd671c-4583-4057-aebd-1c8c7931771f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.030060 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aefd671c-4583-4057-aebd-1c8c7931771f-kube-api-access-hbhmz" (OuterVolumeSpecName: "kube-api-access-hbhmz") pod "aefd671c-4583-4057-aebd-1c8c7931771f" (UID: "aefd671c-4583-4057-aebd-1c8c7931771f"). InnerVolumeSpecName "kube-api-access-hbhmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.030224 4702 generic.go:334] "Generic (PLEG): container finished" podID="ab09df47-f81e-4b91-aee1-89e919c149ee" containerID="cf9764f585379a0ca8d1e33531d673196cd28a77ff8f9a5cac941629860b06dd" exitCode=0 Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.030434 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgvqz" event={"ID":"ab09df47-f81e-4b91-aee1-89e919c149ee","Type":"ContainerDied","Data":"cf9764f585379a0ca8d1e33531d673196cd28a77ff8f9a5cac941629860b06dd"} Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.030467 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tgvqz" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.030494 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tgvqz" event={"ID":"ab09df47-f81e-4b91-aee1-89e919c149ee","Type":"ContainerDied","Data":"b3b69ab3891875aa94288bf2ef24962cefab96af624035b19356bc287a6018cc"} Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.043857 4702 scope.go:117] "RemoveContainer" containerID="c4232908694f4e254e44e70b159c95a30f2836d5bf3c2e19c1011053a2b4047e" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.044027 4702 generic.go:334] "Generic (PLEG): container finished" podID="085dd40d-8d1f-40ce-903b-fbed55010a29" containerID="1ad807b9e0da9076b57731a76e4692aac411f464302750b70ab80c0fa3ea8840" exitCode=0 Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.044281 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wv7dh" event={"ID":"085dd40d-8d1f-40ce-903b-fbed55010a29","Type":"ContainerDied","Data":"1ad807b9e0da9076b57731a76e4692aac411f464302750b70ab80c0fa3ea8840"} Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.045069 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wv7dh" event={"ID":"085dd40d-8d1f-40ce-903b-fbed55010a29","Type":"ContainerDied","Data":"d663ec0afac96b2dd413b6c838142fdc40f4f356a9475c53e4d546ef148669d2"} Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.045297 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wv7dh" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.053630 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ql9kq"] Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.060250 4702 generic.go:334] "Generic (PLEG): container finished" podID="b806ad42-5c69-4ea6-8d36-fb54595132bf" containerID="108d1ece4e32d9094b71e3763d5974cb6d0e5ca2efaf4cbdc0115196dac90d30" exitCode=0 Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.060340 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn9wb" event={"ID":"b806ad42-5c69-4ea6-8d36-fb54595132bf","Type":"ContainerDied","Data":"108d1ece4e32d9094b71e3763d5974cb6d0e5ca2efaf4cbdc0115196dac90d30"} Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.060403 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sn9wb" event={"ID":"b806ad42-5c69-4ea6-8d36-fb54595132bf","Type":"ContainerDied","Data":"997daa925690d08c1123ea85d5ec4a5586d0ac8810f0e88fbdd1a3adc5aaa02a"} Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.060535 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sn9wb" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.061438 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ql9kq"] Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.077341 4702 scope.go:117] "RemoveContainer" containerID="2c53fd0a05b13f0849fdc4ed2ea82be2481b51eba2842c0bf986f27120eba5f9" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.081153 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wv7dh"] Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.091538 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wv7dh"] Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.116266 4702 scope.go:117] "RemoveContainer" containerID="eaba7e4210f58e873b63b66d9792fa6a77ebc20733ed7101fe6c3b1c1a89b510" Dec 03 11:09:39 crc kubenswrapper[4702]: E1203 11:09:39.121931 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaba7e4210f58e873b63b66d9792fa6a77ebc20733ed7101fe6c3b1c1a89b510\": container with ID starting with eaba7e4210f58e873b63b66d9792fa6a77ebc20733ed7101fe6c3b1c1a89b510 not found: ID does not exist" containerID="eaba7e4210f58e873b63b66d9792fa6a77ebc20733ed7101fe6c3b1c1a89b510" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.121989 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaba7e4210f58e873b63b66d9792fa6a77ebc20733ed7101fe6c3b1c1a89b510"} err="failed to get container status \"eaba7e4210f58e873b63b66d9792fa6a77ebc20733ed7101fe6c3b1c1a89b510\": rpc error: code = NotFound desc = could not find container \"eaba7e4210f58e873b63b66d9792fa6a77ebc20733ed7101fe6c3b1c1a89b510\": container with ID starting with eaba7e4210f58e873b63b66d9792fa6a77ebc20733ed7101fe6c3b1c1a89b510 not found: ID does not exist" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.122027 4702 scope.go:117] "RemoveContainer" containerID="c4232908694f4e254e44e70b159c95a30f2836d5bf3c2e19c1011053a2b4047e" Dec 03 11:09:39 crc kubenswrapper[4702]: E1203 11:09:39.122354 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4232908694f4e254e44e70b159c95a30f2836d5bf3c2e19c1011053a2b4047e\": container with ID starting with c4232908694f4e254e44e70b159c95a30f2836d5bf3c2e19c1011053a2b4047e not found: ID does not exist" containerID="c4232908694f4e254e44e70b159c95a30f2836d5bf3c2e19c1011053a2b4047e" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.122393 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4232908694f4e254e44e70b159c95a30f2836d5bf3c2e19c1011053a2b4047e"} err="failed to get container status \"c4232908694f4e254e44e70b159c95a30f2836d5bf3c2e19c1011053a2b4047e\": rpc error: code = NotFound desc = could not find container \"c4232908694f4e254e44e70b159c95a30f2836d5bf3c2e19c1011053a2b4047e\": container with ID starting with c4232908694f4e254e44e70b159c95a30f2836d5bf3c2e19c1011053a2b4047e not found: ID does not exist" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.122412 4702 scope.go:117] "RemoveContainer" containerID="2c53fd0a05b13f0849fdc4ed2ea82be2481b51eba2842c0bf986f27120eba5f9" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.122925 4702 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-hbhmz\" (UniqueName: \"kubernetes.io/projected/aefd671c-4583-4057-aebd-1c8c7931771f-kube-api-access-hbhmz\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.122953 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aefd671c-4583-4057-aebd-1c8c7931771f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:39 crc kubenswrapper[4702]: E1203 11:09:39.123194 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c53fd0a05b13f0849fdc4ed2ea82be2481b51eba2842c0bf986f27120eba5f9\": container with ID starting with 2c53fd0a05b13f0849fdc4ed2ea82be2481b51eba2842c0bf986f27120eba5f9 not found: ID does not exist" containerID="2c53fd0a05b13f0849fdc4ed2ea82be2481b51eba2842c0bf986f27120eba5f9" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.123216 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c53fd0a05b13f0849fdc4ed2ea82be2481b51eba2842c0bf986f27120eba5f9"} err="failed to get container status \"2c53fd0a05b13f0849fdc4ed2ea82be2481b51eba2842c0bf986f27120eba5f9\": rpc error: code = NotFound desc = could not find container \"2c53fd0a05b13f0849fdc4ed2ea82be2481b51eba2842c0bf986f27120eba5f9\": container with ID starting with 2c53fd0a05b13f0849fdc4ed2ea82be2481b51eba2842c0bf986f27120eba5f9 not found: ID does not exist" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.123232 4702 scope.go:117] "RemoveContainer" containerID="90c8456ff399e325f9bb6e64ab7303e1dc7be432295e3e50672811b246d12397" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.123735 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b806ad42-5c69-4ea6-8d36-fb54595132bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b806ad42-5c69-4ea6-8d36-fb54595132bf" (UID: "b806ad42-5c69-4ea6-8d36-fb54595132bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.133040 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tgvqz"] Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.138302 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tgvqz"] Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.141935 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aefd671c-4583-4057-aebd-1c8c7931771f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aefd671c-4583-4057-aebd-1c8c7931771f" (UID: "aefd671c-4583-4057-aebd-1c8c7931771f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.146868 4702 scope.go:117] "RemoveContainer" containerID="8f3dc2c05062b2a0158389a759a07a63299d91fc5551494ccf9902ede78aa436" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.166061 4702 scope.go:117] "RemoveContainer" containerID="90c8456ff399e325f9bb6e64ab7303e1dc7be432295e3e50672811b246d12397" Dec 03 11:09:39 crc kubenswrapper[4702]: E1203 11:09:39.167002 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90c8456ff399e325f9bb6e64ab7303e1dc7be432295e3e50672811b246d12397\": container with ID starting with 90c8456ff399e325f9bb6e64ab7303e1dc7be432295e3e50672811b246d12397 not found: ID does not exist" containerID="90c8456ff399e325f9bb6e64ab7303e1dc7be432295e3e50672811b246d12397" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.167089 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90c8456ff399e325f9bb6e64ab7303e1dc7be432295e3e50672811b246d12397"} err="failed to get container status \"90c8456ff399e325f9bb6e64ab7303e1dc7be432295e3e50672811b246d12397\": rpc error: code = NotFound desc = could not find container \"90c8456ff399e325f9bb6e64ab7303e1dc7be432295e3e50672811b246d12397\": container with ID starting with 90c8456ff399e325f9bb6e64ab7303e1dc7be432295e3e50672811b246d12397 not found: ID does not exist" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.167142 4702 scope.go:117] "RemoveContainer" containerID="8f3dc2c05062b2a0158389a759a07a63299d91fc5551494ccf9902ede78aa436" Dec 03 11:09:39 crc kubenswrapper[4702]: E1203 11:09:39.167705 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f3dc2c05062b2a0158389a759a07a63299d91fc5551494ccf9902ede78aa436\": container with ID starting with 8f3dc2c05062b2a0158389a759a07a63299d91fc5551494ccf9902ede78aa436 not found: ID does not exist" containerID="8f3dc2c05062b2a0158389a759a07a63299d91fc5551494ccf9902ede78aa436" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.167730 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f3dc2c05062b2a0158389a759a07a63299d91fc5551494ccf9902ede78aa436"} err="failed to get container status \"8f3dc2c05062b2a0158389a759a07a63299d91fc5551494ccf9902ede78aa436\": rpc error: code = NotFound desc = could not find container \"8f3dc2c05062b2a0158389a759a07a63299d91fc5551494ccf9902ede78aa436\": container with ID starting with 8f3dc2c05062b2a0158389a759a07a63299d91fc5551494ccf9902ede78aa436 not found: ID does not exist" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.167747 4702 scope.go:117] "RemoveContainer" containerID="cf9764f585379a0ca8d1e33531d673196cd28a77ff8f9a5cac941629860b06dd" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.212507 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-79n82"] Dec 03 11:09:39 crc kubenswrapper[4702]: W1203 11:09:39.220401 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2c1609d_33a3_444f_9370_24495b15b3e0.slice/crio-121c5ca2c6809282c63b073cf3203c0a20b8e3b10fcbc64ed13d8d7ceea15593 WatchSource:0}: Error finding container 121c5ca2c6809282c63b073cf3203c0a20b8e3b10fcbc64ed13d8d7ceea15593: Status 404 returned error can't find the container with id 
121c5ca2c6809282c63b073cf3203c0a20b8e3b10fcbc64ed13d8d7ceea15593 Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.224472 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aefd671c-4583-4057-aebd-1c8c7931771f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.224522 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b806ad42-5c69-4ea6-8d36-fb54595132bf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.258865 4702 scope.go:117] "RemoveContainer" containerID="15ab0124bfebc849851d03d8d94ebab6a9f37b3c4243ec8d7fe78ef09df344e1" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.290317 4702 scope.go:117] "RemoveContainer" containerID="5ccb3d123d5182bfc407c4e3bf17f905283c03620d33c2cb4cdc811a9adcdbf7" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.315544 4702 scope.go:117] "RemoveContainer" containerID="cf9764f585379a0ca8d1e33531d673196cd28a77ff8f9a5cac941629860b06dd" Dec 03 11:09:39 crc kubenswrapper[4702]: E1203 11:09:39.316365 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf9764f585379a0ca8d1e33531d673196cd28a77ff8f9a5cac941629860b06dd\": container with ID starting with cf9764f585379a0ca8d1e33531d673196cd28a77ff8f9a5cac941629860b06dd not found: ID does not exist" containerID="cf9764f585379a0ca8d1e33531d673196cd28a77ff8f9a5cac941629860b06dd" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.316419 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf9764f585379a0ca8d1e33531d673196cd28a77ff8f9a5cac941629860b06dd"} err="failed to get container status \"cf9764f585379a0ca8d1e33531d673196cd28a77ff8f9a5cac941629860b06dd\": rpc error: code = NotFound desc = could not find container \"cf9764f585379a0ca8d1e33531d673196cd28a77ff8f9a5cac941629860b06dd\": container with ID starting with cf9764f585379a0ca8d1e33531d673196cd28a77ff8f9a5cac941629860b06dd not found: ID does not exist" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.316462 4702 scope.go:117] "RemoveContainer" containerID="15ab0124bfebc849851d03d8d94ebab6a9f37b3c4243ec8d7fe78ef09df344e1" Dec 03 11:09:39 crc kubenswrapper[4702]: E1203 11:09:39.317129 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15ab0124bfebc849851d03d8d94ebab6a9f37b3c4243ec8d7fe78ef09df344e1\": container with ID starting with 15ab0124bfebc849851d03d8d94ebab6a9f37b3c4243ec8d7fe78ef09df344e1 not found: ID does not exist" containerID="15ab0124bfebc849851d03d8d94ebab6a9f37b3c4243ec8d7fe78ef09df344e1" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.317164 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15ab0124bfebc849851d03d8d94ebab6a9f37b3c4243ec8d7fe78ef09df344e1"} err="failed to get container status \"15ab0124bfebc849851d03d8d94ebab6a9f37b3c4243ec8d7fe78ef09df344e1\": rpc error: code = NotFound desc = could not find container \"15ab0124bfebc849851d03d8d94ebab6a9f37b3c4243ec8d7fe78ef09df344e1\": container with ID starting with 15ab0124bfebc849851d03d8d94ebab6a9f37b3c4243ec8d7fe78ef09df344e1 not found: ID does not exist" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.317183 4702 scope.go:117] "RemoveContainer" 
containerID="5ccb3d123d5182bfc407c4e3bf17f905283c03620d33c2cb4cdc811a9adcdbf7" Dec 03 11:09:39 crc kubenswrapper[4702]: E1203 11:09:39.317665 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ccb3d123d5182bfc407c4e3bf17f905283c03620d33c2cb4cdc811a9adcdbf7\": container with ID starting with 5ccb3d123d5182bfc407c4e3bf17f905283c03620d33c2cb4cdc811a9adcdbf7 not found: ID does not exist" containerID="5ccb3d123d5182bfc407c4e3bf17f905283c03620d33c2cb4cdc811a9adcdbf7" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.317774 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ccb3d123d5182bfc407c4e3bf17f905283c03620d33c2cb4cdc811a9adcdbf7"} err="failed to get container status \"5ccb3d123d5182bfc407c4e3bf17f905283c03620d33c2cb4cdc811a9adcdbf7\": rpc error: code = NotFound desc = could not find container \"5ccb3d123d5182bfc407c4e3bf17f905283c03620d33c2cb4cdc811a9adcdbf7\": container with ID starting with 5ccb3d123d5182bfc407c4e3bf17f905283c03620d33c2cb4cdc811a9adcdbf7 not found: ID does not exist" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.317827 4702 scope.go:117] "RemoveContainer" containerID="1ad807b9e0da9076b57731a76e4692aac411f464302750b70ab80c0fa3ea8840" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.365782 4702 scope.go:117] "RemoveContainer" containerID="08088883276a26c81a101615c4ffffdc93de4c43ea048e2a6fa4acbf4e1ba040" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.394279 4702 scope.go:117] "RemoveContainer" containerID="b2fe2b4a7ffdd3847233ca0f238e40c4fdc3d26b490a36eeb334a60b03668fc4" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.402353 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jsndz"] Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.405935 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jsndz"] Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.419644 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sn9wb"] Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.425091 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sn9wb"] Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.430974 4702 scope.go:117] "RemoveContainer" containerID="1ad807b9e0da9076b57731a76e4692aac411f464302750b70ab80c0fa3ea8840" Dec 03 11:09:39 crc kubenswrapper[4702]: E1203 11:09:39.432541 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ad807b9e0da9076b57731a76e4692aac411f464302750b70ab80c0fa3ea8840\": container with ID starting with 1ad807b9e0da9076b57731a76e4692aac411f464302750b70ab80c0fa3ea8840 not found: ID does not exist" containerID="1ad807b9e0da9076b57731a76e4692aac411f464302750b70ab80c0fa3ea8840" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.432662 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ad807b9e0da9076b57731a76e4692aac411f464302750b70ab80c0fa3ea8840"} err="failed to get container status \"1ad807b9e0da9076b57731a76e4692aac411f464302750b70ab80c0fa3ea8840\": rpc error: code = NotFound desc = could not find container \"1ad807b9e0da9076b57731a76e4692aac411f464302750b70ab80c0fa3ea8840\": container with ID starting with 
1ad807b9e0da9076b57731a76e4692aac411f464302750b70ab80c0fa3ea8840 not found: ID does not exist" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.432714 4702 scope.go:117] "RemoveContainer" containerID="08088883276a26c81a101615c4ffffdc93de4c43ea048e2a6fa4acbf4e1ba040" Dec 03 11:09:39 crc kubenswrapper[4702]: E1203 11:09:39.433456 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08088883276a26c81a101615c4ffffdc93de4c43ea048e2a6fa4acbf4e1ba040\": container with ID starting with 08088883276a26c81a101615c4ffffdc93de4c43ea048e2a6fa4acbf4e1ba040 not found: ID does not exist" containerID="08088883276a26c81a101615c4ffffdc93de4c43ea048e2a6fa4acbf4e1ba040" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.433484 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08088883276a26c81a101615c4ffffdc93de4c43ea048e2a6fa4acbf4e1ba040"} err="failed to get container status \"08088883276a26c81a101615c4ffffdc93de4c43ea048e2a6fa4acbf4e1ba040\": rpc error: code = NotFound desc = could not find container \"08088883276a26c81a101615c4ffffdc93de4c43ea048e2a6fa4acbf4e1ba040\": container with ID starting with 08088883276a26c81a101615c4ffffdc93de4c43ea048e2a6fa4acbf4e1ba040 not found: ID does not exist" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.433501 4702 scope.go:117] "RemoveContainer" containerID="b2fe2b4a7ffdd3847233ca0f238e40c4fdc3d26b490a36eeb334a60b03668fc4" Dec 03 11:09:39 crc kubenswrapper[4702]: E1203 11:09:39.434044 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2fe2b4a7ffdd3847233ca0f238e40c4fdc3d26b490a36eeb334a60b03668fc4\": container with ID starting with b2fe2b4a7ffdd3847233ca0f238e40c4fdc3d26b490a36eeb334a60b03668fc4 not found: ID does not exist" containerID="b2fe2b4a7ffdd3847233ca0f238e40c4fdc3d26b490a36eeb334a60b03668fc4" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.434074 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2fe2b4a7ffdd3847233ca0f238e40c4fdc3d26b490a36eeb334a60b03668fc4"} err="failed to get container status \"b2fe2b4a7ffdd3847233ca0f238e40c4fdc3d26b490a36eeb334a60b03668fc4\": rpc error: code = NotFound desc = could not find container \"b2fe2b4a7ffdd3847233ca0f238e40c4fdc3d26b490a36eeb334a60b03668fc4\": container with ID starting with b2fe2b4a7ffdd3847233ca0f238e40c4fdc3d26b490a36eeb334a60b03668fc4 not found: ID does not exist" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.434088 4702 scope.go:117] "RemoveContainer" containerID="108d1ece4e32d9094b71e3763d5974cb6d0e5ca2efaf4cbdc0115196dac90d30" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.452513 4702 scope.go:117] "RemoveContainer" containerID="6bd5e1d1b00758d7acf2f8a94b744021ab7d8dc294935b0f56084827eb25ea73" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.474293 4702 scope.go:117] "RemoveContainer" containerID="0ea81f1d0dda428556c666b77bc3c1ce1487d4ff4f3243de8b1884792b835695" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.490361 4702 scope.go:117] "RemoveContainer" containerID="108d1ece4e32d9094b71e3763d5974cb6d0e5ca2efaf4cbdc0115196dac90d30" Dec 03 11:09:39 crc kubenswrapper[4702]: E1203 11:09:39.490959 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"108d1ece4e32d9094b71e3763d5974cb6d0e5ca2efaf4cbdc0115196dac90d30\": container 
with ID starting with 108d1ece4e32d9094b71e3763d5974cb6d0e5ca2efaf4cbdc0115196dac90d30 not found: ID does not exist" containerID="108d1ece4e32d9094b71e3763d5974cb6d0e5ca2efaf4cbdc0115196dac90d30" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.490996 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"108d1ece4e32d9094b71e3763d5974cb6d0e5ca2efaf4cbdc0115196dac90d30"} err="failed to get container status \"108d1ece4e32d9094b71e3763d5974cb6d0e5ca2efaf4cbdc0115196dac90d30\": rpc error: code = NotFound desc = could not find container \"108d1ece4e32d9094b71e3763d5974cb6d0e5ca2efaf4cbdc0115196dac90d30\": container with ID starting with 108d1ece4e32d9094b71e3763d5974cb6d0e5ca2efaf4cbdc0115196dac90d30 not found: ID does not exist" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.491027 4702 scope.go:117] "RemoveContainer" containerID="6bd5e1d1b00758d7acf2f8a94b744021ab7d8dc294935b0f56084827eb25ea73" Dec 03 11:09:39 crc kubenswrapper[4702]: E1203 11:09:39.491540 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bd5e1d1b00758d7acf2f8a94b744021ab7d8dc294935b0f56084827eb25ea73\": container with ID starting with 6bd5e1d1b00758d7acf2f8a94b744021ab7d8dc294935b0f56084827eb25ea73 not found: ID does not exist" containerID="6bd5e1d1b00758d7acf2f8a94b744021ab7d8dc294935b0f56084827eb25ea73" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.491572 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bd5e1d1b00758d7acf2f8a94b744021ab7d8dc294935b0f56084827eb25ea73"} err="failed to get container status \"6bd5e1d1b00758d7acf2f8a94b744021ab7d8dc294935b0f56084827eb25ea73\": rpc error: code = NotFound desc = could not find container \"6bd5e1d1b00758d7acf2f8a94b744021ab7d8dc294935b0f56084827eb25ea73\": container with ID starting with 6bd5e1d1b00758d7acf2f8a94b744021ab7d8dc294935b0f56084827eb25ea73 not found: ID does not exist" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.491588 4702 scope.go:117] "RemoveContainer" containerID="0ea81f1d0dda428556c666b77bc3c1ce1487d4ff4f3243de8b1884792b835695" Dec 03 11:09:39 crc kubenswrapper[4702]: E1203 11:09:39.492133 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ea81f1d0dda428556c666b77bc3c1ce1487d4ff4f3243de8b1884792b835695\": container with ID starting with 0ea81f1d0dda428556c666b77bc3c1ce1487d4ff4f3243de8b1884792b835695 not found: ID does not exist" containerID="0ea81f1d0dda428556c666b77bc3c1ce1487d4ff4f3243de8b1884792b835695" Dec 03 11:09:39 crc kubenswrapper[4702]: I1203 11:09:39.492165 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ea81f1d0dda428556c666b77bc3c1ce1487d4ff4f3243de8b1884792b835695"} err="failed to get container status \"0ea81f1d0dda428556c666b77bc3c1ce1487d4ff4f3243de8b1884792b835695\": rpc error: code = NotFound desc = could not find container \"0ea81f1d0dda428556c666b77bc3c1ce1487d4ff4f3243de8b1884792b835695\": container with ID starting with 0ea81f1d0dda428556c666b77bc3c1ce1487d4ff4f3243de8b1884792b835695 not found: ID does not exist" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.077740 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-79n82" 
event={"ID":"f2c1609d-33a3-444f-9370-24495b15b3e0","Type":"ContainerStarted","Data":"a81f1063f7fadaf10230df233d66acfed0f64fdf8e3e581acf2698167e70cbdb"} Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.077857 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-79n82" event={"ID":"f2c1609d-33a3-444f-9370-24495b15b3e0","Type":"ContainerStarted","Data":"121c5ca2c6809282c63b073cf3203c0a20b8e3b10fcbc64ed13d8d7ceea15593"} Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.078158 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-79n82" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.081275 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-79n82" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.100502 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-79n82" podStartSLOduration=2.100462413 podStartE2EDuration="2.100462413s" podCreationTimestamp="2025-12-03 11:09:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:09:40.095197414 +0000 UTC m=+363.931125878" watchObservedRunningTime="2025-12-03 11:09:40.100462413 +0000 UTC m=+363.936390877" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.238622 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6sd9l"] Dec 03 11:09:40 crc kubenswrapper[4702]: E1203 11:09:40.238965 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aefd671c-4583-4057-aebd-1c8c7931771f" containerName="registry-server" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.238980 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="aefd671c-4583-4057-aebd-1c8c7931771f" containerName="registry-server" Dec 03 11:09:40 crc kubenswrapper[4702]: E1203 11:09:40.238991 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29" containerName="extract-content" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.239003 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29" containerName="extract-content" Dec 03 11:09:40 crc kubenswrapper[4702]: E1203 11:09:40.239022 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf" containerName="extract-content" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.239028 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf" containerName="extract-content" Dec 03 11:09:40 crc kubenswrapper[4702]: E1203 11:09:40.239040 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29" containerName="registry-server" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.239046 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29" containerName="registry-server" Dec 03 11:09:40 crc kubenswrapper[4702]: E1203 11:09:40.239060 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aefd671c-4583-4057-aebd-1c8c7931771f" containerName="extract-utilities" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.239067 4702 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="aefd671c-4583-4057-aebd-1c8c7931771f" containerName="extract-utilities" Dec 03 11:09:40 crc kubenswrapper[4702]: E1203 11:09:40.239075 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aefd671c-4583-4057-aebd-1c8c7931771f" containerName="extract-content" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.239082 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="aefd671c-4583-4057-aebd-1c8c7931771f" containerName="extract-content" Dec 03 11:09:40 crc kubenswrapper[4702]: E1203 11:09:40.239091 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf" containerName="extract-utilities" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.239099 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf" containerName="extract-utilities" Dec 03 11:09:40 crc kubenswrapper[4702]: E1203 11:09:40.239113 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf" containerName="registry-server" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.239122 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf" containerName="registry-server" Dec 03 11:09:40 crc kubenswrapper[4702]: E1203 11:09:40.239130 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab09df47-f81e-4b91-aee1-89e919c149ee" containerName="registry-server" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.239136 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab09df47-f81e-4b91-aee1-89e919c149ee" containerName="registry-server" Dec 03 11:09:40 crc kubenswrapper[4702]: E1203 11:09:40.239144 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab09df47-f81e-4b91-aee1-89e919c149ee" containerName="extract-utilities" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.239151 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab09df47-f81e-4b91-aee1-89e919c149ee" containerName="extract-utilities" Dec 03 11:09:40 crc kubenswrapper[4702]: E1203 11:09:40.239162 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac" containerName="marketplace-operator" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.239169 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac" containerName="marketplace-operator" Dec 03 11:09:40 crc kubenswrapper[4702]: E1203 11:09:40.239177 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab09df47-f81e-4b91-aee1-89e919c149ee" containerName="extract-content" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.239185 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab09df47-f81e-4b91-aee1-89e919c149ee" containerName="extract-content" Dec 03 11:09:40 crc kubenswrapper[4702]: E1203 11:09:40.239193 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac" containerName="marketplace-operator" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.239199 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac" containerName="marketplace-operator" Dec 03 11:09:40 crc kubenswrapper[4702]: E1203 11:09:40.239208 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29" containerName="extract-utilities" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.239215 
4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29" containerName="extract-utilities" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.239314 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab09df47-f81e-4b91-aee1-89e919c149ee" containerName="registry-server" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.239324 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="aefd671c-4583-4057-aebd-1c8c7931771f" containerName="registry-server" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.239335 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac" containerName="marketplace-operator" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.239342 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac" containerName="marketplace-operator" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.239351 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29" containerName="registry-server" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.239359 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf" containerName="registry-server" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.240251 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6sd9l" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.242366 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.254032 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6sd9l"] Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.343210 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea8c3262-d494-4427-8228-df9584c00ca1-catalog-content\") pod \"redhat-marketplace-6sd9l\" (UID: \"ea8c3262-d494-4427-8228-df9584c00ca1\") " pod="openshift-marketplace/redhat-marketplace-6sd9l" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.344088 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7fm6\" (UniqueName: \"kubernetes.io/projected/ea8c3262-d494-4427-8228-df9584c00ca1-kube-api-access-c7fm6\") pod \"redhat-marketplace-6sd9l\" (UID: \"ea8c3262-d494-4427-8228-df9584c00ca1\") " pod="openshift-marketplace/redhat-marketplace-6sd9l" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.344161 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea8c3262-d494-4427-8228-df9584c00ca1-utilities\") pod \"redhat-marketplace-6sd9l\" (UID: \"ea8c3262-d494-4427-8228-df9584c00ca1\") " pod="openshift-marketplace/redhat-marketplace-6sd9l" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.436202 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qhss9"] Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.438362 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qhss9" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.441131 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.445988 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7fm6\" (UniqueName: \"kubernetes.io/projected/ea8c3262-d494-4427-8228-df9584c00ca1-kube-api-access-c7fm6\") pod \"redhat-marketplace-6sd9l\" (UID: \"ea8c3262-d494-4427-8228-df9584c00ca1\") " pod="openshift-marketplace/redhat-marketplace-6sd9l" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.446039 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea8c3262-d494-4427-8228-df9584c00ca1-utilities\") pod \"redhat-marketplace-6sd9l\" (UID: \"ea8c3262-d494-4427-8228-df9584c00ca1\") " pod="openshift-marketplace/redhat-marketplace-6sd9l" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.446077 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/480aa817-7d43-4ea8-9099-06bcb431e578-utilities\") pod \"redhat-operators-qhss9\" (UID: \"480aa817-7d43-4ea8-9099-06bcb431e578\") " pod="openshift-marketplace/redhat-operators-qhss9" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.446106 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/480aa817-7d43-4ea8-9099-06bcb431e578-catalog-content\") pod \"redhat-operators-qhss9\" (UID: \"480aa817-7d43-4ea8-9099-06bcb431e578\") " pod="openshift-marketplace/redhat-operators-qhss9" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.446128 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz6dh\" (UniqueName: \"kubernetes.io/projected/480aa817-7d43-4ea8-9099-06bcb431e578-kube-api-access-tz6dh\") pod \"redhat-operators-qhss9\" (UID: \"480aa817-7d43-4ea8-9099-06bcb431e578\") " pod="openshift-marketplace/redhat-operators-qhss9" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.446159 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea8c3262-d494-4427-8228-df9584c00ca1-catalog-content\") pod \"redhat-marketplace-6sd9l\" (UID: \"ea8c3262-d494-4427-8228-df9584c00ca1\") " pod="openshift-marketplace/redhat-marketplace-6sd9l" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.446772 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea8c3262-d494-4427-8228-df9584c00ca1-catalog-content\") pod \"redhat-marketplace-6sd9l\" (UID: \"ea8c3262-d494-4427-8228-df9584c00ca1\") " pod="openshift-marketplace/redhat-marketplace-6sd9l" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.447371 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea8c3262-d494-4427-8228-df9584c00ca1-utilities\") pod \"redhat-marketplace-6sd9l\" (UID: \"ea8c3262-d494-4427-8228-df9584c00ca1\") " pod="openshift-marketplace/redhat-marketplace-6sd9l" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.452215 4702 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/redhat-operators-qhss9"] Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.472168 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7fm6\" (UniqueName: \"kubernetes.io/projected/ea8c3262-d494-4427-8228-df9584c00ca1-kube-api-access-c7fm6\") pod \"redhat-marketplace-6sd9l\" (UID: \"ea8c3262-d494-4427-8228-df9584c00ca1\") " pod="openshift-marketplace/redhat-marketplace-6sd9l" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.548252 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/480aa817-7d43-4ea8-9099-06bcb431e578-utilities\") pod \"redhat-operators-qhss9\" (UID: \"480aa817-7d43-4ea8-9099-06bcb431e578\") " pod="openshift-marketplace/redhat-operators-qhss9" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.548342 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/480aa817-7d43-4ea8-9099-06bcb431e578-catalog-content\") pod \"redhat-operators-qhss9\" (UID: \"480aa817-7d43-4ea8-9099-06bcb431e578\") " pod="openshift-marketplace/redhat-operators-qhss9" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.548381 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz6dh\" (UniqueName: \"kubernetes.io/projected/480aa817-7d43-4ea8-9099-06bcb431e578-kube-api-access-tz6dh\") pod \"redhat-operators-qhss9\" (UID: \"480aa817-7d43-4ea8-9099-06bcb431e578\") " pod="openshift-marketplace/redhat-operators-qhss9" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.549157 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/480aa817-7d43-4ea8-9099-06bcb431e578-utilities\") pod \"redhat-operators-qhss9\" (UID: \"480aa817-7d43-4ea8-9099-06bcb431e578\") " pod="openshift-marketplace/redhat-operators-qhss9" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.549208 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/480aa817-7d43-4ea8-9099-06bcb431e578-catalog-content\") pod \"redhat-operators-qhss9\" (UID: \"480aa817-7d43-4ea8-9099-06bcb431e578\") " pod="openshift-marketplace/redhat-operators-qhss9" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.560121 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6sd9l" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.570120 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz6dh\" (UniqueName: \"kubernetes.io/projected/480aa817-7d43-4ea8-9099-06bcb431e578-kube-api-access-tz6dh\") pod \"redhat-operators-qhss9\" (UID: \"480aa817-7d43-4ea8-9099-06bcb431e578\") " pod="openshift-marketplace/redhat-operators-qhss9" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.757320 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qhss9" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.938104 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="085dd40d-8d1f-40ce-903b-fbed55010a29" path="/var/lib/kubelet/pods/085dd40d-8d1f-40ce-903b-fbed55010a29/volumes" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.939356 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac" path="/var/lib/kubelet/pods/5d820d7c-b3ef-4baf-8dc3-af7ca9c2afac/volumes" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.939898 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab09df47-f81e-4b91-aee1-89e919c149ee" path="/var/lib/kubelet/pods/ab09df47-f81e-4b91-aee1-89e919c149ee/volumes" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.941048 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aefd671c-4583-4057-aebd-1c8c7931771f" path="/var/lib/kubelet/pods/aefd671c-4583-4057-aebd-1c8c7931771f/volumes" Dec 03 11:09:40 crc kubenswrapper[4702]: I1203 11:09:40.941657 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b806ad42-5c69-4ea6-8d36-fb54595132bf" path="/var/lib/kubelet/pods/b806ad42-5c69-4ea6-8d36-fb54595132bf/volumes" Dec 03 11:09:41 crc kubenswrapper[4702]: I1203 11:09:41.004769 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6sd9l"] Dec 03 11:09:41 crc kubenswrapper[4702]: W1203 11:09:41.007588 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea8c3262_d494_4427_8228_df9584c00ca1.slice/crio-27b3bed165b2d85cdb9bf67201929a9ed4a449fd5b540164b8798231f8f8ccd4 WatchSource:0}: Error finding container 27b3bed165b2d85cdb9bf67201929a9ed4a449fd5b540164b8798231f8f8ccd4: Status 404 returned error can't find the container with id 27b3bed165b2d85cdb9bf67201929a9ed4a449fd5b540164b8798231f8f8ccd4 Dec 03 11:09:41 crc kubenswrapper[4702]: I1203 11:09:41.110527 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sd9l" event={"ID":"ea8c3262-d494-4427-8228-df9584c00ca1","Type":"ContainerStarted","Data":"27b3bed165b2d85cdb9bf67201929a9ed4a449fd5b540164b8798231f8f8ccd4"} Dec 03 11:09:41 crc kubenswrapper[4702]: I1203 11:09:41.172588 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qhss9"] Dec 03 11:09:41 crc kubenswrapper[4702]: W1203 11:09:41.236586 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod480aa817_7d43_4ea8_9099_06bcb431e578.slice/crio-e7dd884b45e4660e89b06a275dc7126fef6f1499ba9d8a4e1a3b36c9eb34bf0f WatchSource:0}: Error finding container e7dd884b45e4660e89b06a275dc7126fef6f1499ba9d8a4e1a3b36c9eb34bf0f: Status 404 returned error can't find the container with id e7dd884b45e4660e89b06a275dc7126fef6f1499ba9d8a4e1a3b36c9eb34bf0f Dec 03 11:09:41 crc kubenswrapper[4702]: E1203 11:09:41.480130 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod480aa817_7d43_4ea8_9099_06bcb431e578.slice/crio-conmon-75dafac6fd4722e7b13a91c7bab9204d3d7f88377f0c4310a2d0638de3bc7b6f.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod480aa817_7d43_4ea8_9099_06bcb431e578.slice/crio-75dafac6fd4722e7b13a91c7bab9204d3d7f88377f0c4310a2d0638de3bc7b6f.scope\": RecentStats: unable to find data in memory cache]" Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.121341 4702 generic.go:334] "Generic (PLEG): container finished" podID="ea8c3262-d494-4427-8228-df9584c00ca1" containerID="7a46ce8b8d3a888602dba9e23c9547f4b36456244aa9ecd2df85ad13758fc3f4" exitCode=0 Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.121438 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sd9l" event={"ID":"ea8c3262-d494-4427-8228-df9584c00ca1","Type":"ContainerDied","Data":"7a46ce8b8d3a888602dba9e23c9547f4b36456244aa9ecd2df85ad13758fc3f4"} Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.125027 4702 generic.go:334] "Generic (PLEG): container finished" podID="480aa817-7d43-4ea8-9099-06bcb431e578" containerID="75dafac6fd4722e7b13a91c7bab9204d3d7f88377f0c4310a2d0638de3bc7b6f" exitCode=0 Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.125079 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhss9" event={"ID":"480aa817-7d43-4ea8-9099-06bcb431e578","Type":"ContainerDied","Data":"75dafac6fd4722e7b13a91c7bab9204d3d7f88377f0c4310a2d0638de3bc7b6f"} Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.125175 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhss9" event={"ID":"480aa817-7d43-4ea8-9099-06bcb431e578","Type":"ContainerStarted","Data":"e7dd884b45e4660e89b06a275dc7126fef6f1499ba9d8a4e1a3b36c9eb34bf0f"} Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.643164 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g44ws"] Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.644779 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g44ws" Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.648638 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.656108 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g44ws"] Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.681912 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eb7b34c-d8e9-4188-85c1-7be8ec5afa18-catalog-content\") pod \"certified-operators-g44ws\" (UID: \"5eb7b34c-d8e9-4188-85c1-7be8ec5afa18\") " pod="openshift-marketplace/certified-operators-g44ws" Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.681992 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eb7b34c-d8e9-4188-85c1-7be8ec5afa18-utilities\") pod \"certified-operators-g44ws\" (UID: \"5eb7b34c-d8e9-4188-85c1-7be8ec5afa18\") " pod="openshift-marketplace/certified-operators-g44ws" Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.682023 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qm6h\" (UniqueName: \"kubernetes.io/projected/5eb7b34c-d8e9-4188-85c1-7be8ec5afa18-kube-api-access-9qm6h\") pod \"certified-operators-g44ws\" (UID: \"5eb7b34c-d8e9-4188-85c1-7be8ec5afa18\") " pod="openshift-marketplace/certified-operators-g44ws" Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.783387 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eb7b34c-d8e9-4188-85c1-7be8ec5afa18-utilities\") pod \"certified-operators-g44ws\" (UID: \"5eb7b34c-d8e9-4188-85c1-7be8ec5afa18\") " pod="openshift-marketplace/certified-operators-g44ws" Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.783460 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qm6h\" (UniqueName: \"kubernetes.io/projected/5eb7b34c-d8e9-4188-85c1-7be8ec5afa18-kube-api-access-9qm6h\") pod \"certified-operators-g44ws\" (UID: \"5eb7b34c-d8e9-4188-85c1-7be8ec5afa18\") " pod="openshift-marketplace/certified-operators-g44ws" Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.783533 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eb7b34c-d8e9-4188-85c1-7be8ec5afa18-catalog-content\") pod \"certified-operators-g44ws\" (UID: \"5eb7b34c-d8e9-4188-85c1-7be8ec5afa18\") " pod="openshift-marketplace/certified-operators-g44ws" Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.784156 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eb7b34c-d8e9-4188-85c1-7be8ec5afa18-utilities\") pod \"certified-operators-g44ws\" (UID: \"5eb7b34c-d8e9-4188-85c1-7be8ec5afa18\") " pod="openshift-marketplace/certified-operators-g44ws" Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.784204 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eb7b34c-d8e9-4188-85c1-7be8ec5afa18-catalog-content\") pod \"certified-operators-g44ws\" (UID: 
\"5eb7b34c-d8e9-4188-85c1-7be8ec5afa18\") " pod="openshift-marketplace/certified-operators-g44ws" Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.810917 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qm6h\" (UniqueName: \"kubernetes.io/projected/5eb7b34c-d8e9-4188-85c1-7be8ec5afa18-kube-api-access-9qm6h\") pod \"certified-operators-g44ws\" (UID: \"5eb7b34c-d8e9-4188-85c1-7be8ec5afa18\") " pod="openshift-marketplace/certified-operators-g44ws" Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.842529 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m6cqt"] Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.844827 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m6cqt" Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.848026 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.856495 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m6cqt"] Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.974176 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g44ws" Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.986930 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f2dd872-6ac4-4527-9a91-218b1de5ed5e-utilities\") pod \"community-operators-m6cqt\" (UID: \"0f2dd872-6ac4-4527-9a91-218b1de5ed5e\") " pod="openshift-marketplace/community-operators-m6cqt" Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.986992 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnrsg\" (UniqueName: \"kubernetes.io/projected/0f2dd872-6ac4-4527-9a91-218b1de5ed5e-kube-api-access-bnrsg\") pod \"community-operators-m6cqt\" (UID: \"0f2dd872-6ac4-4527-9a91-218b1de5ed5e\") " pod="openshift-marketplace/community-operators-m6cqt" Dec 03 11:09:42 crc kubenswrapper[4702]: I1203 11:09:42.987036 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f2dd872-6ac4-4527-9a91-218b1de5ed5e-catalog-content\") pod \"community-operators-m6cqt\" (UID: \"0f2dd872-6ac4-4527-9a91-218b1de5ed5e\") " pod="openshift-marketplace/community-operators-m6cqt" Dec 03 11:09:43 crc kubenswrapper[4702]: I1203 11:09:43.089475 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f2dd872-6ac4-4527-9a91-218b1de5ed5e-utilities\") pod \"community-operators-m6cqt\" (UID: \"0f2dd872-6ac4-4527-9a91-218b1de5ed5e\") " pod="openshift-marketplace/community-operators-m6cqt" Dec 03 11:09:43 crc kubenswrapper[4702]: I1203 11:09:43.089567 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnrsg\" (UniqueName: \"kubernetes.io/projected/0f2dd872-6ac4-4527-9a91-218b1de5ed5e-kube-api-access-bnrsg\") pod \"community-operators-m6cqt\" (UID: \"0f2dd872-6ac4-4527-9a91-218b1de5ed5e\") " pod="openshift-marketplace/community-operators-m6cqt" Dec 03 11:09:43 crc kubenswrapper[4702]: I1203 11:09:43.089625 4702 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f2dd872-6ac4-4527-9a91-218b1de5ed5e-catalog-content\") pod \"community-operators-m6cqt\" (UID: \"0f2dd872-6ac4-4527-9a91-218b1de5ed5e\") " pod="openshift-marketplace/community-operators-m6cqt" Dec 03 11:09:43 crc kubenswrapper[4702]: I1203 11:09:43.090465 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f2dd872-6ac4-4527-9a91-218b1de5ed5e-catalog-content\") pod \"community-operators-m6cqt\" (UID: \"0f2dd872-6ac4-4527-9a91-218b1de5ed5e\") " pod="openshift-marketplace/community-operators-m6cqt" Dec 03 11:09:43 crc kubenswrapper[4702]: I1203 11:09:43.090906 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f2dd872-6ac4-4527-9a91-218b1de5ed5e-utilities\") pod \"community-operators-m6cqt\" (UID: \"0f2dd872-6ac4-4527-9a91-218b1de5ed5e\") " pod="openshift-marketplace/community-operators-m6cqt" Dec 03 11:09:43 crc kubenswrapper[4702]: I1203 11:09:43.113509 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnrsg\" (UniqueName: \"kubernetes.io/projected/0f2dd872-6ac4-4527-9a91-218b1de5ed5e-kube-api-access-bnrsg\") pod \"community-operators-m6cqt\" (UID: \"0f2dd872-6ac4-4527-9a91-218b1de5ed5e\") " pod="openshift-marketplace/community-operators-m6cqt" Dec 03 11:09:43 crc kubenswrapper[4702]: I1203 11:09:43.136385 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sd9l" event={"ID":"ea8c3262-d494-4427-8228-df9584c00ca1","Type":"ContainerStarted","Data":"fdcc11d7bd938ed05614cce6994e7e43a888c19f1e1c627d706ce43718de5602"} Dec 03 11:09:43 crc kubenswrapper[4702]: I1203 11:09:43.214246 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m6cqt" Dec 03 11:09:43 crc kubenswrapper[4702]: I1203 11:09:43.414145 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g44ws"] Dec 03 11:09:43 crc kubenswrapper[4702]: W1203 11:09:43.421556 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5eb7b34c_d8e9_4188_85c1_7be8ec5afa18.slice/crio-3650711afae3a4263770a6a7995b2b1750703591953c96bef15781c8fb156939 WatchSource:0}: Error finding container 3650711afae3a4263770a6a7995b2b1750703591953c96bef15781c8fb156939: Status 404 returned error can't find the container with id 3650711afae3a4263770a6a7995b2b1750703591953c96bef15781c8fb156939 Dec 03 11:09:43 crc kubenswrapper[4702]: I1203 11:09:43.640907 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m6cqt"] Dec 03 11:09:43 crc kubenswrapper[4702]: W1203 11:09:43.673249 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f2dd872_6ac4_4527_9a91_218b1de5ed5e.slice/crio-4efe61e5f6f2a0d69f5c3fccc95306ea7a8272c0f13dbb8945d7b67b37dbd296 WatchSource:0}: Error finding container 4efe61e5f6f2a0d69f5c3fccc95306ea7a8272c0f13dbb8945d7b67b37dbd296: Status 404 returned error can't find the container with id 4efe61e5f6f2a0d69f5c3fccc95306ea7a8272c0f13dbb8945d7b67b37dbd296 Dec 03 11:09:44 crc kubenswrapper[4702]: I1203 11:09:44.148067 4702 generic.go:334] "Generic (PLEG): container finished" podID="480aa817-7d43-4ea8-9099-06bcb431e578" containerID="63e3311849127b8d85c69eb96c55a12ca9821c926a8cc88633b14b8a17956c13" exitCode=0 Dec 03 11:09:44 crc kubenswrapper[4702]: I1203 11:09:44.148197 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhss9" event={"ID":"480aa817-7d43-4ea8-9099-06bcb431e578","Type":"ContainerDied","Data":"63e3311849127b8d85c69eb96c55a12ca9821c926a8cc88633b14b8a17956c13"} Dec 03 11:09:44 crc kubenswrapper[4702]: I1203 11:09:44.151962 4702 generic.go:334] "Generic (PLEG): container finished" podID="0f2dd872-6ac4-4527-9a91-218b1de5ed5e" containerID="8db2de7cd02f7a158f7b0f58f347c1760ad4f0d7461464c3617c33a6f8ee2451" exitCode=0 Dec 03 11:09:44 crc kubenswrapper[4702]: I1203 11:09:44.152403 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6cqt" event={"ID":"0f2dd872-6ac4-4527-9a91-218b1de5ed5e","Type":"ContainerDied","Data":"8db2de7cd02f7a158f7b0f58f347c1760ad4f0d7461464c3617c33a6f8ee2451"} Dec 03 11:09:44 crc kubenswrapper[4702]: I1203 11:09:44.152466 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6cqt" event={"ID":"0f2dd872-6ac4-4527-9a91-218b1de5ed5e","Type":"ContainerStarted","Data":"4efe61e5f6f2a0d69f5c3fccc95306ea7a8272c0f13dbb8945d7b67b37dbd296"} Dec 03 11:09:44 crc kubenswrapper[4702]: I1203 11:09:44.163844 4702 generic.go:334] "Generic (PLEG): container finished" podID="5eb7b34c-d8e9-4188-85c1-7be8ec5afa18" containerID="16e65501f63f5817e71772f665ef84eb6aa007ed546e43c236aecad31555b0be" exitCode=0 Dec 03 11:09:44 crc kubenswrapper[4702]: I1203 11:09:44.164009 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g44ws" event={"ID":"5eb7b34c-d8e9-4188-85c1-7be8ec5afa18","Type":"ContainerDied","Data":"16e65501f63f5817e71772f665ef84eb6aa007ed546e43c236aecad31555b0be"} 
Dec 03 11:09:44 crc kubenswrapper[4702]: I1203 11:09:44.164062 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g44ws" event={"ID":"5eb7b34c-d8e9-4188-85c1-7be8ec5afa18","Type":"ContainerStarted","Data":"3650711afae3a4263770a6a7995b2b1750703591953c96bef15781c8fb156939"} Dec 03 11:09:44 crc kubenswrapper[4702]: I1203 11:09:44.176517 4702 generic.go:334] "Generic (PLEG): container finished" podID="ea8c3262-d494-4427-8228-df9584c00ca1" containerID="fdcc11d7bd938ed05614cce6994e7e43a888c19f1e1c627d706ce43718de5602" exitCode=0 Dec 03 11:09:44 crc kubenswrapper[4702]: I1203 11:09:44.176604 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sd9l" event={"ID":"ea8c3262-d494-4427-8228-df9584c00ca1","Type":"ContainerDied","Data":"fdcc11d7bd938ed05614cce6994e7e43a888c19f1e1c627d706ce43718de5602"} Dec 03 11:09:45 crc kubenswrapper[4702]: I1203 11:09:45.185551 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6cqt" event={"ID":"0f2dd872-6ac4-4527-9a91-218b1de5ed5e","Type":"ContainerStarted","Data":"b7bbb68ae7ee903d201fa4da1c748bd347bdd7e4614d30c8959ef4133229bb0f"} Dec 03 11:09:45 crc kubenswrapper[4702]: I1203 11:09:45.188496 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sd9l" event={"ID":"ea8c3262-d494-4427-8228-df9584c00ca1","Type":"ContainerStarted","Data":"c5520925fdcb74e51555de4760801d54227957a1ec428e730058e63e89436c28"} Dec 03 11:09:45 crc kubenswrapper[4702]: I1203 11:09:45.191820 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhss9" event={"ID":"480aa817-7d43-4ea8-9099-06bcb431e578","Type":"ContainerStarted","Data":"18ced4db46fb40fb222275c19c0f3d89fe0a63205b4c74d2c87e8c6f83308bbc"} Dec 03 11:09:45 crc kubenswrapper[4702]: I1203 11:09:45.251352 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qhss9" podStartSLOduration=2.74210399 podStartE2EDuration="5.251317627s" podCreationTimestamp="2025-12-03 11:09:40 +0000 UTC" firstStartedPulling="2025-12-03 11:09:42.126905421 +0000 UTC m=+365.962833875" lastFinishedPulling="2025-12-03 11:09:44.636119048 +0000 UTC m=+368.472047512" observedRunningTime="2025-12-03 11:09:45.248555188 +0000 UTC m=+369.084483652" watchObservedRunningTime="2025-12-03 11:09:45.251317627 +0000 UTC m=+369.087246091" Dec 03 11:09:45 crc kubenswrapper[4702]: I1203 11:09:45.270984 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6sd9l" podStartSLOduration=2.794421521 podStartE2EDuration="5.270951703s" podCreationTimestamp="2025-12-03 11:09:40 +0000 UTC" firstStartedPulling="2025-12-03 11:09:42.124154463 +0000 UTC m=+365.960082927" lastFinishedPulling="2025-12-03 11:09:44.600684655 +0000 UTC m=+368.436613109" observedRunningTime="2025-12-03 11:09:45.268136653 +0000 UTC m=+369.104065117" watchObservedRunningTime="2025-12-03 11:09:45.270951703 +0000 UTC m=+369.106880167" Dec 03 11:09:46 crc kubenswrapper[4702]: I1203 11:09:46.202031 4702 generic.go:334] "Generic (PLEG): container finished" podID="0f2dd872-6ac4-4527-9a91-218b1de5ed5e" containerID="b7bbb68ae7ee903d201fa4da1c748bd347bdd7e4614d30c8959ef4133229bb0f" exitCode=0 Dec 03 11:09:46 crc kubenswrapper[4702]: I1203 11:09:46.202148 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6cqt" 
event={"ID":"0f2dd872-6ac4-4527-9a91-218b1de5ed5e","Type":"ContainerDied","Data":"b7bbb68ae7ee903d201fa4da1c748bd347bdd7e4614d30c8959ef4133229bb0f"} Dec 03 11:09:46 crc kubenswrapper[4702]: I1203 11:09:46.204277 4702 generic.go:334] "Generic (PLEG): container finished" podID="5eb7b34c-d8e9-4188-85c1-7be8ec5afa18" containerID="6e699f09c4c54f6d39b332e1b9ce69deed741709d10f80fb30fcddc88fa57012" exitCode=0 Dec 03 11:09:46 crc kubenswrapper[4702]: I1203 11:09:46.204478 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g44ws" event={"ID":"5eb7b34c-d8e9-4188-85c1-7be8ec5afa18","Type":"ContainerDied","Data":"6e699f09c4c54f6d39b332e1b9ce69deed741709d10f80fb30fcddc88fa57012"} Dec 03 11:09:47 crc kubenswrapper[4702]: I1203 11:09:47.223355 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6cqt" event={"ID":"0f2dd872-6ac4-4527-9a91-218b1de5ed5e","Type":"ContainerStarted","Data":"fb463bed56004e399bdf91e3fb92de23dca363e6d64d930042f8f75beb45ec4d"} Dec 03 11:09:47 crc kubenswrapper[4702]: I1203 11:09:47.244857 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m6cqt" podStartSLOduration=2.42216745 podStartE2EDuration="5.244823962s" podCreationTimestamp="2025-12-03 11:09:42 +0000 UTC" firstStartedPulling="2025-12-03 11:09:44.154242114 +0000 UTC m=+367.990170578" lastFinishedPulling="2025-12-03 11:09:46.976898626 +0000 UTC m=+370.812827090" observedRunningTime="2025-12-03 11:09:47.243128934 +0000 UTC m=+371.079057398" watchObservedRunningTime="2025-12-03 11:09:47.244823962 +0000 UTC m=+371.080752426" Dec 03 11:09:48 crc kubenswrapper[4702]: I1203 11:09:48.237126 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g44ws" event={"ID":"5eb7b34c-d8e9-4188-85c1-7be8ec5afa18","Type":"ContainerStarted","Data":"8b9d8fb21555ff19eb15d50c1ac9a1109be73c7ce5cf6df4ecf1d98fec30f657"} Dec 03 11:09:50 crc kubenswrapper[4702]: I1203 11:09:50.561325 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6sd9l" Dec 03 11:09:50 crc kubenswrapper[4702]: I1203 11:09:50.562152 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6sd9l" Dec 03 11:09:50 crc kubenswrapper[4702]: I1203 11:09:50.610090 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6sd9l" Dec 03 11:09:50 crc kubenswrapper[4702]: I1203 11:09:50.638345 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g44ws" podStartSLOduration=5.750781699 podStartE2EDuration="8.638317527s" podCreationTimestamp="2025-12-03 11:09:42 +0000 UTC" firstStartedPulling="2025-12-03 11:09:44.166472321 +0000 UTC m=+368.002400785" lastFinishedPulling="2025-12-03 11:09:47.054008149 +0000 UTC m=+370.889936613" observedRunningTime="2025-12-03 11:09:47.275633575 +0000 UTC m=+371.111562069" watchObservedRunningTime="2025-12-03 11:09:50.638317527 +0000 UTC m=+374.474245991" Dec 03 11:09:50 crc kubenswrapper[4702]: I1203 11:09:50.758509 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qhss9" Dec 03 11:09:50 crc kubenswrapper[4702]: I1203 11:09:50.758595 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-qhss9" Dec 03 11:09:50 crc kubenswrapper[4702]: I1203 11:09:50.803226 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qhss9" Dec 03 11:09:51 crc kubenswrapper[4702]: I1203 11:09:51.306821 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6sd9l" Dec 03 11:09:51 crc kubenswrapper[4702]: I1203 11:09:51.315104 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qhss9" Dec 03 11:09:52 crc kubenswrapper[4702]: I1203 11:09:52.974622 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g44ws" Dec 03 11:09:52 crc kubenswrapper[4702]: I1203 11:09:52.975547 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g44ws" Dec 03 11:09:53 crc kubenswrapper[4702]: I1203 11:09:53.026255 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g44ws" Dec 03 11:09:53 crc kubenswrapper[4702]: I1203 11:09:53.214910 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m6cqt" Dec 03 11:09:53 crc kubenswrapper[4702]: I1203 11:09:53.214987 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m6cqt" Dec 03 11:09:53 crc kubenswrapper[4702]: I1203 11:09:53.262738 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m6cqt" Dec 03 11:09:53 crc kubenswrapper[4702]: I1203 11:09:53.328064 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m6cqt" Dec 03 11:09:53 crc kubenswrapper[4702]: I1203 11:09:53.332210 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g44ws" Dec 03 11:09:55 crc kubenswrapper[4702]: I1203 11:09:55.908636 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:09:55 crc kubenswrapper[4702]: I1203 11:09:55.908735 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:09:57 crc kubenswrapper[4702]: I1203 11:09:57.148311 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" podUID="fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b" containerName="registry" containerID="cri-o://9cc460beecb02e23233dafa6af01c5deecb7be8f23e85f7b31f51207d3014214" gracePeriod=30 Dec 03 11:09:59 crc kubenswrapper[4702]: I1203 11:09:59.311158 4702 generic.go:334] "Generic (PLEG): container finished" podID="fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b" containerID="9cc460beecb02e23233dafa6af01c5deecb7be8f23e85f7b31f51207d3014214" exitCode=0 Dec 03 11:09:59 crc 
kubenswrapper[4702]: I1203 11:09:59.311269 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" event={"ID":"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b","Type":"ContainerDied","Data":"9cc460beecb02e23233dafa6af01c5deecb7be8f23e85f7b31f51207d3014214"} Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.250356 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.319070 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" event={"ID":"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b","Type":"ContainerDied","Data":"e99a811080cdd3677dc89f6831b1136fca6f5214a9099b8babc6b9b78d63a369"} Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.319119 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-lnnbr" Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.319159 4702 scope.go:117] "RemoveContainer" containerID="9cc460beecb02e23233dafa6af01c5deecb7be8f23e85f7b31f51207d3014214" Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.393963 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-ca-trust-extracted\") pod \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.394020 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-registry-certificates\") pod \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.394112 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-installation-pull-secrets\") pod \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.394177 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86dwp\" (UniqueName: \"kubernetes.io/projected/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-kube-api-access-86dwp\") pod \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.394423 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.394493 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-trusted-ca\") pod \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.394533 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-bound-sa-token\") pod \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.395183 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.396387 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.396524 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-registry-tls\") pod \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\" (UID: \"fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b\") " Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.396850 4702 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.396875 4702 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.404613 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.416736 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.417392 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.418557 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-kube-api-access-86dwp" (OuterVolumeSpecName: "kube-api-access-86dwp") pod "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b"). InnerVolumeSpecName "kube-api-access-86dwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.431472 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.438646 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b" (UID: "fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.497358 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86dwp\" (UniqueName: \"kubernetes.io/projected/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-kube-api-access-86dwp\") on node \"crc\" DevicePath \"\"" Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.497856 4702 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.497949 4702 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.498025 4702 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.498090 4702 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.651246 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lnnbr"] Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.659432 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-lnnbr"] Dec 03 11:10:00 crc kubenswrapper[4702]: I1203 11:10:00.939360 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b" path="/var/lib/kubelet/pods/fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b/volumes" Dec 03 11:10:08 crc kubenswrapper[4702]: I1203 11:10:08.533932 4702 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-h2hvt"] Dec 03 11:10:08 crc kubenswrapper[4702]: E1203 11:10:08.534821 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b" containerName="registry" Dec 03 11:10:08 crc kubenswrapper[4702]: I1203 11:10:08.534840 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b" containerName="registry" Dec 03 11:10:08 crc kubenswrapper[4702]: I1203 11:10:08.534965 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdbd7f19-99d5-4efc-a0aa-07adf75e0f6b" containerName="registry" Dec 03 11:10:08 crc kubenswrapper[4702]: I1203 11:10:08.535533 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-h2hvt" Dec 03 11:10:08 crc kubenswrapper[4702]: I1203 11:10:08.545407 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Dec 03 11:10:08 crc kubenswrapper[4702]: I1203 11:10:08.547259 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Dec 03 11:10:08 crc kubenswrapper[4702]: I1203 11:10:08.547978 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Dec 03 11:10:08 crc kubenswrapper[4702]: I1203 11:10:08.547963 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Dec 03 11:10:08 crc kubenswrapper[4702]: I1203 11:10:08.561296 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Dec 03 11:10:08 crc kubenswrapper[4702]: I1203 11:10:08.574507 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-h2hvt"] Dec 03 11:10:08 crc kubenswrapper[4702]: I1203 11:10:08.716176 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt9ld\" (UniqueName: \"kubernetes.io/projected/981b85d3-383b-4fe8-9e30-d4f6875fc223-kube-api-access-wt9ld\") pod \"cluster-monitoring-operator-6d5b84845-h2hvt\" (UID: \"981b85d3-383b-4fe8-9e30-d4f6875fc223\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-h2hvt" Dec 03 11:10:08 crc kubenswrapper[4702]: I1203 11:10:08.716273 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/981b85d3-383b-4fe8-9e30-d4f6875fc223-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-h2hvt\" (UID: \"981b85d3-383b-4fe8-9e30-d4f6875fc223\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-h2hvt" Dec 03 11:10:08 crc kubenswrapper[4702]: I1203 11:10:08.716325 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/981b85d3-383b-4fe8-9e30-d4f6875fc223-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-h2hvt\" (UID: \"981b85d3-383b-4fe8-9e30-d4f6875fc223\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-h2hvt" Dec 03 11:10:08 crc kubenswrapper[4702]: I1203 11:10:08.817796 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt9ld\" 
(UniqueName: \"kubernetes.io/projected/981b85d3-383b-4fe8-9e30-d4f6875fc223-kube-api-access-wt9ld\") pod \"cluster-monitoring-operator-6d5b84845-h2hvt\" (UID: \"981b85d3-383b-4fe8-9e30-d4f6875fc223\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-h2hvt" Dec 03 11:10:08 crc kubenswrapper[4702]: I1203 11:10:08.817877 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/981b85d3-383b-4fe8-9e30-d4f6875fc223-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-h2hvt\" (UID: \"981b85d3-383b-4fe8-9e30-d4f6875fc223\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-h2hvt" Dec 03 11:10:08 crc kubenswrapper[4702]: I1203 11:10:08.817924 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/981b85d3-383b-4fe8-9e30-d4f6875fc223-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-h2hvt\" (UID: \"981b85d3-383b-4fe8-9e30-d4f6875fc223\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-h2hvt" Dec 03 11:10:08 crc kubenswrapper[4702]: I1203 11:10:08.819577 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/981b85d3-383b-4fe8-9e30-d4f6875fc223-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-h2hvt\" (UID: \"981b85d3-383b-4fe8-9e30-d4f6875fc223\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-h2hvt" Dec 03 11:10:08 crc kubenswrapper[4702]: I1203 11:10:08.827441 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/981b85d3-383b-4fe8-9e30-d4f6875fc223-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-h2hvt\" (UID: \"981b85d3-383b-4fe8-9e30-d4f6875fc223\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-h2hvt" Dec 03 11:10:08 crc kubenswrapper[4702]: I1203 11:10:08.836240 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt9ld\" (UniqueName: \"kubernetes.io/projected/981b85d3-383b-4fe8-9e30-d4f6875fc223-kube-api-access-wt9ld\") pod \"cluster-monitoring-operator-6d5b84845-h2hvt\" (UID: \"981b85d3-383b-4fe8-9e30-d4f6875fc223\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-h2hvt" Dec 03 11:10:08 crc kubenswrapper[4702]: I1203 11:10:08.861223 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-h2hvt" Dec 03 11:10:09 crc kubenswrapper[4702]: I1203 11:10:09.272152 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-h2hvt"] Dec 03 11:10:09 crc kubenswrapper[4702]: I1203 11:10:09.382726 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-h2hvt" event={"ID":"981b85d3-383b-4fe8-9e30-d4f6875fc223","Type":"ContainerStarted","Data":"60f515675db277fff7a9f11f1d68712efa00ca4e29957ef68d5dda131c46b37e"} Dec 03 11:10:11 crc kubenswrapper[4702]: I1203 11:10:11.399153 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-h2hvt" event={"ID":"981b85d3-383b-4fe8-9e30-d4f6875fc223","Type":"ContainerStarted","Data":"018447c3807efe17dc73409cc98c7722aa73f359b6e74251be78cd53a3b8a024"} Dec 03 11:10:11 crc kubenswrapper[4702]: I1203 11:10:11.417865 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-h2hvt" podStartSLOduration=1.493134591 podStartE2EDuration="3.417839716s" podCreationTimestamp="2025-12-03 11:10:08 +0000 UTC" firstStartedPulling="2025-12-03 11:10:09.279477101 +0000 UTC m=+393.115405575" lastFinishedPulling="2025-12-03 11:10:11.204182236 +0000 UTC m=+395.040110700" observedRunningTime="2025-12-03 11:10:11.415690793 +0000 UTC m=+395.251619287" watchObservedRunningTime="2025-12-03 11:10:11.417839716 +0000 UTC m=+395.253768180" Dec 03 11:10:11 crc kubenswrapper[4702]: I1203 11:10:11.799673 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl"] Dec 03 11:10:11 crc kubenswrapper[4702]: I1203 11:10:11.800749 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" Dec 03 11:10:11 crc kubenswrapper[4702]: I1203 11:10:11.803201 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Dec 03 11:10:11 crc kubenswrapper[4702]: I1203 11:10:11.807164 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-6lrkw" Dec 03 11:10:11 crc kubenswrapper[4702]: I1203 11:10:11.818989 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl"] Dec 03 11:10:11 crc kubenswrapper[4702]: I1203 11:10:11.960978 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/44af00fd-b9f6-4e74-ad67-581e4ca7527c-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-q4rbl\" (UID: \"44af00fd-b9f6-4e74-ad67-581e4ca7527c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" Dec 03 11:10:12 crc kubenswrapper[4702]: I1203 11:10:12.063187 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/44af00fd-b9f6-4e74-ad67-581e4ca7527c-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-q4rbl\" (UID: \"44af00fd-b9f6-4e74-ad67-581e4ca7527c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" Dec 03 11:10:12 crc kubenswrapper[4702]: I1203 11:10:12.071163 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/44af00fd-b9f6-4e74-ad67-581e4ca7527c-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-q4rbl\" (UID: \"44af00fd-b9f6-4e74-ad67-581e4ca7527c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" Dec 03 11:10:12 crc kubenswrapper[4702]: I1203 11:10:12.130069 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" Dec 03 11:10:12 crc kubenswrapper[4702]: I1203 11:10:12.549806 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl"] Dec 03 11:10:13 crc kubenswrapper[4702]: I1203 11:10:13.411924 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" event={"ID":"44af00fd-b9f6-4e74-ad67-581e4ca7527c","Type":"ContainerStarted","Data":"6ee86e134a1f94fcffc215155c73902a1c67014ce16b8d0668558bc105786cc1"} Dec 03 11:10:14 crc kubenswrapper[4702]: I1203 11:10:14.422436 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" event={"ID":"44af00fd-b9f6-4e74-ad67-581e4ca7527c","Type":"ContainerStarted","Data":"8c83b33dbc7dd119d0b2ca7b2933d6a165caba4682749655a9e5a2534e77d2e0"} Dec 03 11:10:14 crc kubenswrapper[4702]: I1203 11:10:14.422893 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" Dec 03 11:10:14 crc kubenswrapper[4702]: I1203 11:10:14.431026 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" Dec 03 11:10:14 crc kubenswrapper[4702]: I1203 11:10:14.442087 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" podStartSLOduration=2.193887446 podStartE2EDuration="3.442056504s" podCreationTimestamp="2025-12-03 11:10:11 +0000 UTC" firstStartedPulling="2025-12-03 11:10:12.562852275 +0000 UTC m=+396.398780739" lastFinishedPulling="2025-12-03 11:10:13.811021333 +0000 UTC m=+397.646949797" observedRunningTime="2025-12-03 11:10:14.439653564 +0000 UTC m=+398.275582078" watchObservedRunningTime="2025-12-03 11:10:14.442056504 +0000 UTC m=+398.277985008" Dec 03 11:10:14 crc kubenswrapper[4702]: I1203 11:10:14.852601 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-pkl6l"] Dec 03 11:10:14 crc kubenswrapper[4702]: I1203 11:10:14.854017 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-pkl6l" Dec 03 11:10:14 crc kubenswrapper[4702]: I1203 11:10:14.858671 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Dec 03 11:10:14 crc kubenswrapper[4702]: I1203 11:10:14.858700 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Dec 03 11:10:14 crc kubenswrapper[4702]: I1203 11:10:14.859120 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-64dnt" Dec 03 11:10:14 crc kubenswrapper[4702]: I1203 11:10:14.859521 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Dec 03 11:10:14 crc kubenswrapper[4702]: I1203 11:10:14.863785 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-pkl6l"] Dec 03 11:10:15 crc kubenswrapper[4702]: I1203 11:10:15.018870 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1210630f-a824-4a50-b21e-4a9ae455b879-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-pkl6l\" (UID: \"1210630f-a824-4a50-b21e-4a9ae455b879\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pkl6l" Dec 03 11:10:15 crc kubenswrapper[4702]: I1203 11:10:15.019109 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1210630f-a824-4a50-b21e-4a9ae455b879-metrics-client-ca\") pod \"prometheus-operator-db54df47d-pkl6l\" (UID: \"1210630f-a824-4a50-b21e-4a9ae455b879\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pkl6l" Dec 03 11:10:15 crc kubenswrapper[4702]: I1203 11:10:15.019255 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52x2q\" (UniqueName: \"kubernetes.io/projected/1210630f-a824-4a50-b21e-4a9ae455b879-kube-api-access-52x2q\") pod \"prometheus-operator-db54df47d-pkl6l\" (UID: \"1210630f-a824-4a50-b21e-4a9ae455b879\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pkl6l" Dec 03 11:10:15 crc kubenswrapper[4702]: I1203 11:10:15.019321 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/1210630f-a824-4a50-b21e-4a9ae455b879-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-pkl6l\" (UID: \"1210630f-a824-4a50-b21e-4a9ae455b879\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pkl6l" Dec 03 11:10:15 crc kubenswrapper[4702]: I1203 11:10:15.121584 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1210630f-a824-4a50-b21e-4a9ae455b879-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-pkl6l\" (UID: \"1210630f-a824-4a50-b21e-4a9ae455b879\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pkl6l" Dec 03 11:10:15 crc kubenswrapper[4702]: I1203 11:10:15.121729 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1210630f-a824-4a50-b21e-4a9ae455b879-metrics-client-ca\") pod \"prometheus-operator-db54df47d-pkl6l\" (UID: \"1210630f-a824-4a50-b21e-4a9ae455b879\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pkl6l" Dec 03 11:10:15 crc kubenswrapper[4702]: I1203 11:10:15.121895 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52x2q\" (UniqueName: \"kubernetes.io/projected/1210630f-a824-4a50-b21e-4a9ae455b879-kube-api-access-52x2q\") pod \"prometheus-operator-db54df47d-pkl6l\" (UID: \"1210630f-a824-4a50-b21e-4a9ae455b879\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pkl6l" Dec 03 11:10:15 crc kubenswrapper[4702]: I1203 11:10:15.121940 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/1210630f-a824-4a50-b21e-4a9ae455b879-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-pkl6l\" (UID: \"1210630f-a824-4a50-b21e-4a9ae455b879\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pkl6l" Dec 03 11:10:15 crc kubenswrapper[4702]: I1203 11:10:15.123323 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1210630f-a824-4a50-b21e-4a9ae455b879-metrics-client-ca\") pod \"prometheus-operator-db54df47d-pkl6l\" (UID: \"1210630f-a824-4a50-b21e-4a9ae455b879\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pkl6l" Dec 03 11:10:15 crc kubenswrapper[4702]: I1203 11:10:15.128806 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1210630f-a824-4a50-b21e-4a9ae455b879-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-pkl6l\" (UID: \"1210630f-a824-4a50-b21e-4a9ae455b879\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pkl6l" Dec 03 11:10:15 crc kubenswrapper[4702]: I1203 11:10:15.129145 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/1210630f-a824-4a50-b21e-4a9ae455b879-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-pkl6l\" (UID: \"1210630f-a824-4a50-b21e-4a9ae455b879\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pkl6l" Dec 03 11:10:15 crc kubenswrapper[4702]: I1203 11:10:15.145533 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52x2q\" (UniqueName: \"kubernetes.io/projected/1210630f-a824-4a50-b21e-4a9ae455b879-kube-api-access-52x2q\") pod \"prometheus-operator-db54df47d-pkl6l\" (UID: \"1210630f-a824-4a50-b21e-4a9ae455b879\") " pod="openshift-monitoring/prometheus-operator-db54df47d-pkl6l" Dec 03 11:10:15 crc kubenswrapper[4702]: I1203 11:10:15.171524 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-pkl6l" Dec 03 11:10:15 crc kubenswrapper[4702]: I1203 11:10:15.575277 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-pkl6l"] Dec 03 11:10:15 crc kubenswrapper[4702]: W1203 11:10:15.600593 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1210630f_a824_4a50_b21e_4a9ae455b879.slice/crio-11939f831e6b75bcd11ddfd836d8f7d83e7985da7afb40c51a420f6cd0194c9b WatchSource:0}: Error finding container 11939f831e6b75bcd11ddfd836d8f7d83e7985da7afb40c51a420f6cd0194c9b: Status 404 returned error can't find the container with id 11939f831e6b75bcd11ddfd836d8f7d83e7985da7afb40c51a420f6cd0194c9b Dec 03 11:10:15 crc kubenswrapper[4702]: I1203 11:10:15.604324 4702 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 11:10:16 crc kubenswrapper[4702]: I1203 11:10:16.436003 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-pkl6l" event={"ID":"1210630f-a824-4a50-b21e-4a9ae455b879","Type":"ContainerStarted","Data":"11939f831e6b75bcd11ddfd836d8f7d83e7985da7afb40c51a420f6cd0194c9b"} Dec 03 11:10:18 crc kubenswrapper[4702]: I1203 11:10:18.450131 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-pkl6l" event={"ID":"1210630f-a824-4a50-b21e-4a9ae455b879","Type":"ContainerStarted","Data":"475ce82b575955775e0d2a885e019831b21abb7ef2078b9de1d86bf8d44b6827"} Dec 03 11:10:18 crc kubenswrapper[4702]: I1203 11:10:18.450538 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-pkl6l" event={"ID":"1210630f-a824-4a50-b21e-4a9ae455b879","Type":"ContainerStarted","Data":"a00c9e584f62f49e2caa2be70bf4ec0a1f517804f085b2822e08933b689660be"} Dec 03 11:10:18 crc kubenswrapper[4702]: I1203 11:10:18.468324 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-pkl6l" podStartSLOduration=2.445536175 podStartE2EDuration="4.468301686s" podCreationTimestamp="2025-12-03 11:10:14 +0000 UTC" firstStartedPulling="2025-12-03 11:10:15.604075008 +0000 UTC m=+399.440003472" lastFinishedPulling="2025-12-03 11:10:17.626840509 +0000 UTC m=+401.462768983" observedRunningTime="2025-12-03 11:10:18.464391463 +0000 UTC m=+402.300319947" watchObservedRunningTime="2025-12-03 11:10:18.468301686 +0000 UTC m=+402.304230150" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.219081 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-8xbcf"] Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.220537 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-8xbcf" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.223457 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.223648 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-b4mzg" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.223396 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.231057 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ffz7k"] Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.232588 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.234164 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.234568 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.236983 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-w9rpd" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.238598 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-8xbcf"] Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.295486 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7e7c2c8-5bc3-44f8-940e-cbde0e505e87-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-8xbcf\" (UID: \"c7e7c2c8-5bc3-44f8-940e-cbde0e505e87\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-8xbcf" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.295533 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/355ba2ac-2687-4315-85e9-6602fd79cfb2-node-exporter-textfile\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.295564 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7e7c2c8-5bc3-44f8-940e-cbde0e505e87-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-8xbcf\" (UID: \"c7e7c2c8-5bc3-44f8-940e-cbde0e505e87\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-8xbcf" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.295583 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/355ba2ac-2687-4315-85e9-6602fd79cfb2-metrics-client-ca\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 
Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.295602 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/355ba2ac-2687-4315-85e9-6602fd79cfb2-root\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.295619 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slw8q\" (UniqueName: \"kubernetes.io/projected/c7e7c2c8-5bc3-44f8-940e-cbde0e505e87-kube-api-access-slw8q\") pod \"openshift-state-metrics-566fddb674-8xbcf\" (UID: \"c7e7c2c8-5bc3-44f8-940e-cbde0e505e87\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-8xbcf" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.295673 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/355ba2ac-2687-4315-85e9-6602fd79cfb2-node-exporter-tls\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.295689 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv2h8\" (UniqueName: \"kubernetes.io/projected/355ba2ac-2687-4315-85e9-6602fd79cfb2-kube-api-access-bv2h8\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.295710 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/355ba2ac-2687-4315-85e9-6602fd79cfb2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.295742 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7e7c2c8-5bc3-44f8-940e-cbde0e505e87-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-8xbcf\" (UID: \"c7e7c2c8-5bc3-44f8-940e-cbde0e505e87\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-8xbcf" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.295772 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/355ba2ac-2687-4315-85e9-6602fd79cfb2-sys\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.295797 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/355ba2ac-2687-4315-85e9-6602fd79cfb2-node-exporter-wtmp\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.300742 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq"]
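
Analysis: the reconciler_common.go:245 records above are phase one of the volume manager's per-volume pipeline; the same volumes reappear below at reconciler_common.go:218 (MountVolume started) and operation_generator.go:637 (MountVolume.SetUp succeeded). When a pod hangs in ContainerCreating, the quickest diagnosis is usually to find the last phase each of its volumes reached. A throwaway scanner for journal text like this (the record markers are lifted from these lines; the tool itself is an illustrative sketch, not part of kubelet):

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
        "strings"
    )

    // Tracks the last volume-manager phase seen per UniqueName in kubelet
    // journal text fed on stdin; volumes stuck before SetUp stand out.
    func main() {
        // Matches both the klog-escaped form (UniqueName: \"...\") and the
        // plain form used in error records (UniqueName: "...").
        uniq := regexp.MustCompile(`UniqueName: \\?"([^"\\]+)\\?"`)
        phases := []string{
            "VerifyControllerAttachedVolume started",
            "MountVolume started",
            "MountVolume.SetUp succeeded",
            "MountVolume.SetUp failed",
        }
        last := map[string]string{}
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 0, 1<<20), 1<<20) // journal lines can be long
        for sc.Scan() {
            line := sc.Text()
            m := uniq.FindStringSubmatch(line)
            if m == nil {
                continue
            }
            for _, p := range phases {
                if strings.Contains(line, p) {
                    last[m[1]] = p
                }
            }
        }
        for vol, phase := range last {
            fmt.Printf("%s\t%s\n", vol, phase)
        }
    }

On a normal one-record-per-line journal this prints each volume's last observed phase; every node-exporter-ffz7k volume here reaches SetUp succeeded except node-exporter-tls, whose failure and eventual success show up below.
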
Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.302422 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.304904 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.304982 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.305296 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-qqxm4" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.308572 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.322822 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq"] Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.396944 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7e7c2c8-5bc3-44f8-940e-cbde0e505e87-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-8xbcf\" (UID: \"c7e7c2c8-5bc3-44f8-940e-cbde0e505e87\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-8xbcf" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.397162 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/355ba2ac-2687-4315-85e9-6602fd79cfb2-node-exporter-textfile\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.397241 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7e7c2c8-5bc3-44f8-940e-cbde0e505e87-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-8xbcf\" (UID: \"c7e7c2c8-5bc3-44f8-940e-cbde0e505e87\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-8xbcf" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.397282 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/355ba2ac-2687-4315-85e9-6602fd79cfb2-metrics-client-ca\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.397312 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/355ba2ac-2687-4315-85e9-6602fd79cfb2-root\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.397335 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slw8q\" (UniqueName: \"kubernetes.io/projected/c7e7c2c8-5bc3-44f8-940e-cbde0e505e87-kube-api-access-slw8q\") pod \"openshift-state-metrics-566fddb674-8xbcf\" (UID: 
\"c7e7c2c8-5bc3-44f8-940e-cbde0e505e87\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-8xbcf" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.397452 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/355ba2ac-2687-4315-85e9-6602fd79cfb2-node-exporter-tls\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.397474 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv2h8\" (UniqueName: \"kubernetes.io/projected/355ba2ac-2687-4315-85e9-6602fd79cfb2-kube-api-access-bv2h8\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.397503 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/355ba2ac-2687-4315-85e9-6602fd79cfb2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.397534 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7e7c2c8-5bc3-44f8-940e-cbde0e505e87-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-8xbcf\" (UID: \"c7e7c2c8-5bc3-44f8-940e-cbde0e505e87\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-8xbcf" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.397554 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/355ba2ac-2687-4315-85e9-6602fd79cfb2-sys\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.397591 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/355ba2ac-2687-4315-85e9-6602fd79cfb2-node-exporter-wtmp\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.397806 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/355ba2ac-2687-4315-85e9-6602fd79cfb2-node-exporter-wtmp\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.398156 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/355ba2ac-2687-4315-85e9-6602fd79cfb2-node-exporter-textfile\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.398148 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/355ba2ac-2687-4315-85e9-6602fd79cfb2-root\") pod \"node-exporter-ffz7k\" (UID: 
\"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.398184 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/355ba2ac-2687-4315-85e9-6602fd79cfb2-sys\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: E1203 11:10:20.398662 4702 secret.go:188] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Dec 03 11:10:20 crc kubenswrapper[4702]: E1203 11:10:20.398733 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/355ba2ac-2687-4315-85e9-6602fd79cfb2-node-exporter-tls podName:355ba2ac-2687-4315-85e9-6602fd79cfb2 nodeName:}" failed. No retries permitted until 2025-12-03 11:10:20.898716954 +0000 UTC m=+404.734645488 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/355ba2ac-2687-4315-85e9-6602fd79cfb2-node-exporter-tls") pod "node-exporter-ffz7k" (UID: "355ba2ac-2687-4315-85e9-6602fd79cfb2") : secret "node-exporter-tls" not found Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.398964 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7e7c2c8-5bc3-44f8-940e-cbde0e505e87-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-8xbcf\" (UID: \"c7e7c2c8-5bc3-44f8-940e-cbde0e505e87\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-8xbcf" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.398989 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/355ba2ac-2687-4315-85e9-6602fd79cfb2-metrics-client-ca\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.419322 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7e7c2c8-5bc3-44f8-940e-cbde0e505e87-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-8xbcf\" (UID: \"c7e7c2c8-5bc3-44f8-940e-cbde0e505e87\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-8xbcf" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.420358 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7e7c2c8-5bc3-44f8-940e-cbde0e505e87-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-8xbcf\" (UID: \"c7e7c2c8-5bc3-44f8-940e-cbde0e505e87\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-8xbcf" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.421249 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/355ba2ac-2687-4315-85e9-6602fd79cfb2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.421381 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-slw8q\" (UniqueName: \"kubernetes.io/projected/c7e7c2c8-5bc3-44f8-940e-cbde0e505e87-kube-api-access-slw8q\") pod \"openshift-state-metrics-566fddb674-8xbcf\" (UID: \"c7e7c2c8-5bc3-44f8-940e-cbde0e505e87\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-8xbcf" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.435559 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv2h8\" (UniqueName: \"kubernetes.io/projected/355ba2ac-2687-4315-85e9-6602fd79cfb2-kube-api-access-bv2h8\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.498644 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2244316-61b9-4b19-afe9-514c0b989989-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-mtldq\" (UID: \"a2244316-61b9-4b19-afe9-514c0b989989\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.498725 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a2244316-61b9-4b19-afe9-514c0b989989-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-mtldq\" (UID: \"a2244316-61b9-4b19-afe9-514c0b989989\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.498834 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a2244316-61b9-4b19-afe9-514c0b989989-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-mtldq\" (UID: \"a2244316-61b9-4b19-afe9-514c0b989989\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.498861 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2244316-61b9-4b19-afe9-514c0b989989-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-mtldq\" (UID: \"a2244316-61b9-4b19-afe9-514c0b989989\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.498889 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a2244316-61b9-4b19-afe9-514c0b989989-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-mtldq\" (UID: \"a2244316-61b9-4b19-afe9-514c0b989989\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.498964 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2td76\" (UniqueName: \"kubernetes.io/projected/a2244316-61b9-4b19-afe9-514c0b989989-kube-api-access-2td76\") pod \"kube-state-metrics-777cb5bd5d-mtldq\" (UID: \"a2244316-61b9-4b19-afe9-514c0b989989\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.542119 4702 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-8xbcf" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.600360 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2td76\" (UniqueName: \"kubernetes.io/projected/a2244316-61b9-4b19-afe9-514c0b989989-kube-api-access-2td76\") pod \"kube-state-metrics-777cb5bd5d-mtldq\" (UID: \"a2244316-61b9-4b19-afe9-514c0b989989\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.600425 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2244316-61b9-4b19-afe9-514c0b989989-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-mtldq\" (UID: \"a2244316-61b9-4b19-afe9-514c0b989989\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.600462 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a2244316-61b9-4b19-afe9-514c0b989989-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-mtldq\" (UID: \"a2244316-61b9-4b19-afe9-514c0b989989\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.600510 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a2244316-61b9-4b19-afe9-514c0b989989-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-mtldq\" (UID: \"a2244316-61b9-4b19-afe9-514c0b989989\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.600534 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2244316-61b9-4b19-afe9-514c0b989989-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-mtldq\" (UID: \"a2244316-61b9-4b19-afe9-514c0b989989\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.600559 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a2244316-61b9-4b19-afe9-514c0b989989-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-mtldq\" (UID: \"a2244316-61b9-4b19-afe9-514c0b989989\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.601217 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/a2244316-61b9-4b19-afe9-514c0b989989-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-mtldq\" (UID: \"a2244316-61b9-4b19-afe9-514c0b989989\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" Dec 03 11:10:20 crc kubenswrapper[4702]: E1203 11:10:20.601884 4702 secret.go:188] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Dec 03 11:10:20 crc kubenswrapper[4702]: E1203 11:10:20.601982 4702 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a2244316-61b9-4b19-afe9-514c0b989989-kube-state-metrics-tls podName:a2244316-61b9-4b19-afe9-514c0b989989 nodeName:}" failed. No retries permitted until 2025-12-03 11:10:21.101962261 +0000 UTC m=+404.937890725 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/a2244316-61b9-4b19-afe9-514c0b989989-kube-state-metrics-tls") pod "kube-state-metrics-777cb5bd5d-mtldq" (UID: "a2244316-61b9-4b19-afe9-514c0b989989") : secret "kube-state-metrics-tls" not found Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.602186 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/a2244316-61b9-4b19-afe9-514c0b989989-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-mtldq\" (UID: \"a2244316-61b9-4b19-afe9-514c0b989989\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.602470 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a2244316-61b9-4b19-afe9-514c0b989989-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-mtldq\" (UID: \"a2244316-61b9-4b19-afe9-514c0b989989\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.619116 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2td76\" (UniqueName: \"kubernetes.io/projected/a2244316-61b9-4b19-afe9-514c0b989989-kube-api-access-2td76\") pod \"kube-state-metrics-777cb5bd5d-mtldq\" (UID: \"a2244316-61b9-4b19-afe9-514c0b989989\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.619223 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a2244316-61b9-4b19-afe9-514c0b989989-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-mtldq\" (UID: \"a2244316-61b9-4b19-afe9-514c0b989989\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.905438 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/355ba2ac-2687-4315-85e9-6602fd79cfb2-node-exporter-tls\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.909620 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/355ba2ac-2687-4315-85e9-6602fd79cfb2-node-exporter-tls\") pod \"node-exporter-ffz7k\" (UID: \"355ba2ac-2687-4315-85e9-6602fd79cfb2\") " pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:20 crc kubenswrapper[4702]: I1203 11:10:20.991952 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-8xbcf"] Dec 03 11:10:21 crc kubenswrapper[4702]: W1203 11:10:21.001599 4702 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7e7c2c8_5bc3_44f8_940e_cbde0e505e87.slice/crio-d4c37a7d4570752df7bd9a2a05022b8b80bdee89e6c6877926434108a3079831 WatchSource:0}: Error finding container d4c37a7d4570752df7bd9a2a05022b8b80bdee89e6c6877926434108a3079831: Status 404 returned error can't find the container with id d4c37a7d4570752df7bd9a2a05022b8b80bdee89e6c6877926434108a3079831 Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.107713 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2244316-61b9-4b19-afe9-514c0b989989-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-mtldq\" (UID: \"a2244316-61b9-4b19-afe9-514c0b989989\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.112853 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/a2244316-61b9-4b19-afe9-514c0b989989-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-mtldq\" (UID: \"a2244316-61b9-4b19-afe9-514c0b989989\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.150836 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ffz7k" Dec 03 11:10:21 crc kubenswrapper[4702]: W1203 11:10:21.176277 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod355ba2ac_2687_4315_85e9_6602fd79cfb2.slice/crio-4639f1b3664f76567aa881a622a13c576dcc8c7fbb5422c2f75bae7c403f4837 WatchSource:0}: Error finding container 4639f1b3664f76567aa881a622a13c576dcc8c7fbb5422c2f75bae7c403f4837: Status 404 returned error can't find the container with id 4639f1b3664f76567aa881a622a13c576dcc8c7fbb5422c2f75bae7c403f4837 Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.222860 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.308195 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
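
Analysis: the W-level manager.go:1169 records come from cAdvisor's cgroup watcher, not the sync loop. The watch event for a freshly created crio-<id> scope can arrive before the runtime has registered the container, so the lookup 404s; the warning is benign whenever a PLEG ContainerStarted event for the same 64-hex ID follows, as it does here (the d4c37a7d... scope warned about at 11:10:21.001599 is reported ContainerStarted at 11:10:21.480816 below). A tiny sketch of that correlation, with both strings copied from these records:

    package main

    import (
        "fmt"
        "regexp"
    )

    // Pull the container ID out of the warned-about cgroup path and compare
    // it with the ID later reported by the PLEG ContainerStarted event.
    // Illustrative check, not kubelet code.
    func main() {
        id := regexp.MustCompile(`crio-([0-9a-f]{64})`)
        cgroup := "/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7e7c2c8_5bc3_44f8_940e_cbde0e505e87.slice/crio-d4c37a7d4570752df7bd9a2a05022b8b80bdee89e6c6877926434108a3079831"
        started := "d4c37a7d4570752df7bd9a2a05022b8b80bdee89e6c6877926434108a3079831"
        fmt.Println(id.FindStringSubmatch(cgroup)[1] == started) // true: the race resolved
    }
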
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.313462 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.313524 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-p4ml9" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.314074 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.314369 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.314553 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.314731 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.314940 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.316371 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.327281 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.334908 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.413471 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a5cd2e99-2280-4e03-8889-ab105522d3b5-config-out\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.413545 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a5cd2e99-2280-4e03-8889-ab105522d3b5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.413579 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a5cd2e99-2280-4e03-8889-ab105522d3b5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.413635 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn846\" (UniqueName: \"kubernetes.io/projected/a5cd2e99-2280-4e03-8889-ab105522d3b5-kube-api-access-nn846\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " 
pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.413673 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a5cd2e99-2280-4e03-8889-ab105522d3b5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.413710 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a5cd2e99-2280-4e03-8889-ab105522d3b5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.413775 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a5cd2e99-2280-4e03-8889-ab105522d3b5-config-volume\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.413807 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a5cd2e99-2280-4e03-8889-ab105522d3b5-web-config\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.413832 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5cd2e99-2280-4e03-8889-ab105522d3b5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.413862 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a5cd2e99-2280-4e03-8889-ab105522d3b5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.413892 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a5cd2e99-2280-4e03-8889-ab105522d3b5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.413918 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a5cd2e99-2280-4e03-8889-ab105522d3b5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.479606 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ffz7k" 
event={"ID":"355ba2ac-2687-4315-85e9-6602fd79cfb2","Type":"ContainerStarted","Data":"4639f1b3664f76567aa881a622a13c576dcc8c7fbb5422c2f75bae7c403f4837"} Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.480816 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-8xbcf" event={"ID":"c7e7c2c8-5bc3-44f8-940e-cbde0e505e87","Type":"ContainerStarted","Data":"d4c37a7d4570752df7bd9a2a05022b8b80bdee89e6c6877926434108a3079831"} Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.515675 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a5cd2e99-2280-4e03-8889-ab105522d3b5-config-volume\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.515773 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a5cd2e99-2280-4e03-8889-ab105522d3b5-web-config\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.515807 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5cd2e99-2280-4e03-8889-ab105522d3b5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.515867 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a5cd2e99-2280-4e03-8889-ab105522d3b5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.516040 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a5cd2e99-2280-4e03-8889-ab105522d3b5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.517609 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a5cd2e99-2280-4e03-8889-ab105522d3b5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.517658 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a5cd2e99-2280-4e03-8889-ab105522d3b5-config-out\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.517685 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/a5cd2e99-2280-4e03-8889-ab105522d3b5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.517718 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a5cd2e99-2280-4e03-8889-ab105522d3b5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.517788 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn846\" (UniqueName: \"kubernetes.io/projected/a5cd2e99-2280-4e03-8889-ab105522d3b5-kube-api-access-nn846\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.517835 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a5cd2e99-2280-4e03-8889-ab105522d3b5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.517874 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a5cd2e99-2280-4e03-8889-ab105522d3b5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.517547 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a5cd2e99-2280-4e03-8889-ab105522d3b5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: E1203 11:10:21.518395 4702 secret.go:188] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Dec 03 11:10:21 crc kubenswrapper[4702]: E1203 11:10:21.518449 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a5cd2e99-2280-4e03-8889-ab105522d3b5-secret-alertmanager-main-tls podName:a5cd2e99-2280-4e03-8889-ab105522d3b5 nodeName:}" failed. No retries permitted until 2025-12-03 11:10:22.018433927 +0000 UTC m=+405.854362391 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/a5cd2e99-2280-4e03-8889-ab105522d3b5-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "a5cd2e99-2280-4e03-8889-ab105522d3b5") : secret "alertmanager-main-tls" not found Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.518865 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a5cd2e99-2280-4e03-8889-ab105522d3b5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.520327 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a5cd2e99-2280-4e03-8889-ab105522d3b5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.521976 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a5cd2e99-2280-4e03-8889-ab105522d3b5-web-config\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.523515 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a5cd2e99-2280-4e03-8889-ab105522d3b5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.535437 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a5cd2e99-2280-4e03-8889-ab105522d3b5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.535526 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a5cd2e99-2280-4e03-8889-ab105522d3b5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.535637 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a5cd2e99-2280-4e03-8889-ab105522d3b5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.536167 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a5cd2e99-2280-4e03-8889-ab105522d3b5-config-volume\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.541747 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nn846\" (UniqueName: \"kubernetes.io/projected/a5cd2e99-2280-4e03-8889-ab105522d3b5-kube-api-access-nn846\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.541808 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a5cd2e99-2280-4e03-8889-ab105522d3b5-config-out\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:21 crc kubenswrapper[4702]: I1203 11:10:21.748858 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq"] Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.026228 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a5cd2e99-2280-4e03-8889-ab105522d3b5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.033837 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a5cd2e99-2280-4e03-8889-ab105522d3b5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a5cd2e99-2280-4e03-8889-ab105522d3b5\") " pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.230596 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.308498 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-58bc8556f9-kwpsl"] Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.310126 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.313843 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.313886 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.314082 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-874snlcd4oi05" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.313845 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-q6gbz" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.314282 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.314441 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.314622 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.326438 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-58bc8556f9-kwpsl"] Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.337159 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9fdn\" (UniqueName: \"kubernetes.io/projected/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-kube-api-access-d9fdn\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.337425 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-metrics-client-ca\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.337637 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.337737 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.337836 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.337912 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-secret-grpc-tls\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.337977 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.338106 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-secret-thanos-querier-tls\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.439490 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-secret-grpc-tls\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.440079 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.440124 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-secret-thanos-querier-tls\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.440155 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9fdn\" (UniqueName: \"kubernetes.io/projected/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-kube-api-access-d9fdn\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.440230 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-metrics-client-ca\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.440290 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.440325 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.440360 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.446067 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.446576 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-secret-grpc-tls\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.446665 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.449436 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-secret-thanos-querier-tls\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.451371 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-metrics-client-ca\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.452370 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.453441 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.456574 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9fdn\" (UniqueName: \"kubernetes.io/projected/b2347d45-1235-4ed0-9f48-6e0eb4c781f8-kube-api-access-d9fdn\") pod \"thanos-querier-58bc8556f9-kwpsl\" (UID: \"b2347d45-1235-4ed0-9f48-6e0eb4c781f8\") " pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.538805 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" event={"ID":"a2244316-61b9-4b19-afe9-514c0b989989","Type":"ContainerStarted","Data":"2e22463a6786274222542198492ebdb3d4f68feff46f4132c5f21d39ecc4c770"} Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.542789 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-8xbcf" event={"ID":"c7e7c2c8-5bc3-44f8-940e-cbde0e505e87","Type":"ContainerStarted","Data":"40e1977583266f1d40e2f5fbc9875a9d16c4e6ecdc26fddc47f555c3f1118913"} Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.542883 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-8xbcf" event={"ID":"c7e7c2c8-5bc3-44f8-940e-cbde0e505e87","Type":"ContainerStarted","Data":"3a3ae1a0c795885b6d961dc558adb6750b42622e0561da3bf91acc6122493dd7"} Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.635998 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:22 crc kubenswrapper[4702]: I1203 11:10:22.803488 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Dec 03 11:10:22 crc kubenswrapper[4702]: W1203 11:10:22.829090 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5cd2e99_2280_4e03_8889_ab105522d3b5.slice/crio-a9d4034f6d757b4d06bab4ca660ca96ca9f7433efb39cbb8a084d044732f05f3 WatchSource:0}: Error finding container a9d4034f6d757b4d06bab4ca660ca96ca9f7433efb39cbb8a084d044732f05f3: Status 404 returned error can't find the container with id a9d4034f6d757b4d06bab4ca660ca96ca9f7433efb39cbb8a084d044732f05f3 Dec 03 11:10:23 crc kubenswrapper[4702]: I1203 11:10:23.389285 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-58bc8556f9-kwpsl"] Dec 03 11:10:23 crc kubenswrapper[4702]: I1203 11:10:23.555680 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a5cd2e99-2280-4e03-8889-ab105522d3b5","Type":"ContainerStarted","Data":"a9d4034f6d757b4d06bab4ca660ca96ca9f7433efb39cbb8a084d044732f05f3"} Dec 03 11:10:23 crc kubenswrapper[4702]: I1203 11:10:23.557697 4702 generic.go:334] "Generic (PLEG): container finished" podID="355ba2ac-2687-4315-85e9-6602fd79cfb2" containerID="3eb6a6c7aa7c8c02d428c9a646c53acc62e420d8d3d21b5acd452a17e45398b7" exitCode=0 Dec 03 11:10:23 crc kubenswrapper[4702]: I1203 11:10:23.557840 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ffz7k" event={"ID":"355ba2ac-2687-4315-85e9-6602fd79cfb2","Type":"ContainerDied","Data":"3eb6a6c7aa7c8c02d428c9a646c53acc62e420d8d3d21b5acd452a17e45398b7"} Dec 03 11:10:24 crc kubenswrapper[4702]: I1203 11:10:24.564288 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" event={"ID":"a2244316-61b9-4b19-afe9-514c0b989989","Type":"ContainerStarted","Data":"adb88995d068eed511c695e43d93fd20e3b880eb8aa226b2df3e41723f0a3a2b"} Dec 03 11:10:24 crc kubenswrapper[4702]: I1203 11:10:24.566586 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-8xbcf" event={"ID":"c7e7c2c8-5bc3-44f8-940e-cbde0e505e87","Type":"ContainerStarted","Data":"6723a5ea514645f7e3f287a23817568951e4f2514b97f189aa0f7ecaa5f46ef6"} Dec 03 11:10:24 crc kubenswrapper[4702]: I1203 11:10:24.568146 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ffz7k" event={"ID":"355ba2ac-2687-4315-85e9-6602fd79cfb2","Type":"ContainerStarted","Data":"26da62732cc69567502e2314ef6d7470f3f35dedf3754fde5029555cba5f8ed5"} Dec 03 11:10:24 crc kubenswrapper[4702]: I1203 11:10:24.569017 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" event={"ID":"b2347d45-1235-4ed0-9f48-6e0eb4c781f8","Type":"ContainerStarted","Data":"5ae64580f3879a65e77d0acc545bd393c65700cd1b6afc4962ea8519a0fecaf6"} Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.060606 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-8xbcf" podStartSLOduration=2.7533259389999998 podStartE2EDuration="5.060570889s" podCreationTimestamp="2025-12-03 11:10:20 +0000 UTC" firstStartedPulling="2025-12-03 11:10:21.71119266 
+0000 UTC m=+405.547121124" lastFinishedPulling="2025-12-03 11:10:24.01843761 +0000 UTC m=+407.854366074" observedRunningTime="2025-12-03 11:10:24.793161757 +0000 UTC m=+408.629090221" watchObservedRunningTime="2025-12-03 11:10:25.060570889 +0000 UTC m=+408.896499353" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.067166 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-c8c89d49c-qrx84"] Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.068519 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.155015 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c8c89d49c-qrx84"] Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.178897 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-service-ca\") pod \"console-c8c89d49c-qrx84\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.179064 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-trusted-ca-bundle\") pod \"console-c8c89d49c-qrx84\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.179098 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-console-serving-cert\") pod \"console-c8c89d49c-qrx84\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.179138 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-oauth-serving-cert\") pod \"console-c8c89d49c-qrx84\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.179178 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-console-oauth-config\") pod \"console-c8c89d49c-qrx84\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.179335 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vczgp\" (UniqueName: \"kubernetes.io/projected/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-kube-api-access-vczgp\") pod \"console-c8c89d49c-qrx84\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.179506 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-console-config\") pod 
\"console-c8c89d49c-qrx84\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.281216 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-trusted-ca-bundle\") pod \"console-c8c89d49c-qrx84\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.281318 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-console-serving-cert\") pod \"console-c8c89d49c-qrx84\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.281358 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-oauth-serving-cert\") pod \"console-c8c89d49c-qrx84\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.281395 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-console-oauth-config\") pod \"console-c8c89d49c-qrx84\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.281461 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vczgp\" (UniqueName: \"kubernetes.io/projected/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-kube-api-access-vczgp\") pod \"console-c8c89d49c-qrx84\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.281512 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-console-config\") pod \"console-c8c89d49c-qrx84\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.281566 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-service-ca\") pod \"console-c8c89d49c-qrx84\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.282547 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-oauth-serving-cert\") pod \"console-c8c89d49c-qrx84\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.282617 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-service-ca\") pod \"console-c8c89d49c-qrx84\" (UID: 
\"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.283375 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-console-config\") pod \"console-c8c89d49c-qrx84\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.283857 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-trusted-ca-bundle\") pod \"console-c8c89d49c-qrx84\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.300006 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-console-oauth-config\") pod \"console-c8c89d49c-qrx84\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.310045 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vczgp\" (UniqueName: \"kubernetes.io/projected/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-kube-api-access-vczgp\") pod \"console-c8c89d49c-qrx84\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.311330 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-console-serving-cert\") pod \"console-c8c89d49c-qrx84\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.392474 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.530871 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6975dd785d-5bvc2"] Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.531646 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.535170 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-4lmbq" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.535364 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.535800 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.535953 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.536039 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-8jb0c1t8end0" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.536523 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.546678 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6975dd785d-5bvc2"] Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.583315 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" event={"ID":"a2244316-61b9-4b19-afe9-514c0b989989","Type":"ContainerStarted","Data":"c8a6fa4a8f7a211863e5b776a9b7148bd5ddf09fd3bd7d689eaed90760bb8823"} Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.584617 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" event={"ID":"a2244316-61b9-4b19-afe9-514c0b989989","Type":"ContainerStarted","Data":"db8ca5576640ce208bfbb691e89b43e3980f7eab2da12fdeff6fb4f509993d3d"} Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.585313 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff-metrics-server-audit-profiles\") pod \"metrics-server-6975dd785d-5bvc2\" (UID: \"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff\") " pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.585386 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6975dd785d-5bvc2\" (UID: \"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff\") " pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.585429 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff-secret-metrics-client-certs\") pod \"metrics-server-6975dd785d-5bvc2\" (UID: \"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff\") " pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.585463 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff-client-ca-bundle\") pod \"metrics-server-6975dd785d-5bvc2\" (UID: \"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff\") " pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.585531 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff-audit-log\") pod \"metrics-server-6975dd785d-5bvc2\" (UID: \"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff\") " pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.585572 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmvdh\" (UniqueName: \"kubernetes.io/projected/56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff-kube-api-access-zmvdh\") pod \"metrics-server-6975dd785d-5bvc2\" (UID: \"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff\") " pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.585595 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff-secret-metrics-server-tls\") pod \"metrics-server-6975dd785d-5bvc2\" (UID: \"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff\") " pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.586296 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ffz7k" event={"ID":"355ba2ac-2687-4315-85e9-6602fd79cfb2","Type":"ContainerStarted","Data":"3f017e28849d95432c77b4d97c464fe78a65bc9e9df1851b892b4a50c59d0037"} Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.608858 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-mtldq" podStartSLOduration=3.350146045 podStartE2EDuration="5.608830334s" podCreationTimestamp="2025-12-03 11:10:20 +0000 UTC" firstStartedPulling="2025-12-03 11:10:21.755748475 +0000 UTC m=+405.591676939" lastFinishedPulling="2025-12-03 11:10:24.014432764 +0000 UTC m=+407.850361228" observedRunningTime="2025-12-03 11:10:25.60388856 +0000 UTC m=+409.439817034" watchObservedRunningTime="2025-12-03 11:10:25.608830334 +0000 UTC m=+409.444758798" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.628318 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ffz7k" podStartSLOduration=4.391464951 podStartE2EDuration="5.62828917s" podCreationTimestamp="2025-12-03 11:10:20 +0000 UTC" firstStartedPulling="2025-12-03 11:10:21.183566744 +0000 UTC m=+405.019495208" lastFinishedPulling="2025-12-03 11:10:22.420390963 +0000 UTC m=+406.256319427" observedRunningTime="2025-12-03 11:10:25.623999505 +0000 UTC m=+409.459927969" watchObservedRunningTime="2025-12-03 11:10:25.62828917 +0000 UTC m=+409.464217634" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.687616 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6975dd785d-5bvc2\" (UID: \"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff\") " 
pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.687718 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff-secret-metrics-client-certs\") pod \"metrics-server-6975dd785d-5bvc2\" (UID: \"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff\") " pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.687811 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff-client-ca-bundle\") pod \"metrics-server-6975dd785d-5bvc2\" (UID: \"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff\") " pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.688055 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff-audit-log\") pod \"metrics-server-6975dd785d-5bvc2\" (UID: \"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff\") " pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.688137 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmvdh\" (UniqueName: \"kubernetes.io/projected/56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff-kube-api-access-zmvdh\") pod \"metrics-server-6975dd785d-5bvc2\" (UID: \"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff\") " pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.688175 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff-secret-metrics-server-tls\") pod \"metrics-server-6975dd785d-5bvc2\" (UID: \"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff\") " pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.688275 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff-metrics-server-audit-profiles\") pod \"metrics-server-6975dd785d-5bvc2\" (UID: \"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff\") " pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.689965 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff-audit-log\") pod \"metrics-server-6975dd785d-5bvc2\" (UID: \"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff\") " pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.690825 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6975dd785d-5bvc2\" (UID: \"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff\") " pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.691996 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff-metrics-server-audit-profiles\") pod \"metrics-server-6975dd785d-5bvc2\" (UID: \"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff\") " pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.694679 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff-secret-metrics-server-tls\") pod \"metrics-server-6975dd785d-5bvc2\" (UID: \"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff\") " pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.694979 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff-secret-metrics-client-certs\") pod \"metrics-server-6975dd785d-5bvc2\" (UID: \"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff\") " pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.697423 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff-client-ca-bundle\") pod \"metrics-server-6975dd785d-5bvc2\" (UID: \"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff\") " pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.719449 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmvdh\" (UniqueName: \"kubernetes.io/projected/56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff-kube-api-access-zmvdh\") pod \"metrics-server-6975dd785d-5bvc2\" (UID: \"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff\") " pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.865445 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.973227 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:10:25 crc kubenswrapper[4702]: I1203 11:10:25.973300 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:10:26 crc kubenswrapper[4702]: I1203 11:10:26.012670 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-74f4cdd6c8-9czrj"] Dec 03 11:10:26 crc kubenswrapper[4702]: I1203 11:10:26.013871 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-74f4cdd6c8-9czrj" Dec 03 11:10:26 crc kubenswrapper[4702]: I1203 11:10:26.020306 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp" Dec 03 11:10:26 crc kubenswrapper[4702]: I1203 11:10:26.020724 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Dec 03 11:10:26 crc kubenswrapper[4702]: I1203 11:10:26.035370 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-74f4cdd6c8-9czrj"] Dec 03 11:10:26 crc kubenswrapper[4702]: I1203 11:10:26.212918 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7ff9bb87-9ede-4d63-a2f5-5d1c49b30c29-monitoring-plugin-cert\") pod \"monitoring-plugin-74f4cdd6c8-9czrj\" (UID: \"7ff9bb87-9ede-4d63-a2f5-5d1c49b30c29\") " pod="openshift-monitoring/monitoring-plugin-74f4cdd6c8-9czrj" Dec 03 11:10:26 crc kubenswrapper[4702]: I1203 11:10:26.314599 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7ff9bb87-9ede-4d63-a2f5-5d1c49b30c29-monitoring-plugin-cert\") pod \"monitoring-plugin-74f4cdd6c8-9czrj\" (UID: \"7ff9bb87-9ede-4d63-a2f5-5d1c49b30c29\") " pod="openshift-monitoring/monitoring-plugin-74f4cdd6c8-9czrj" Dec 03 11:10:26 crc kubenswrapper[4702]: I1203 11:10:26.319005 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/7ff9bb87-9ede-4d63-a2f5-5d1c49b30c29-monitoring-plugin-cert\") pod \"monitoring-plugin-74f4cdd6c8-9czrj\" (UID: \"7ff9bb87-9ede-4d63-a2f5-5d1c49b30c29\") " pod="openshift-monitoring/monitoring-plugin-74f4cdd6c8-9czrj" Dec 03 11:10:26 crc kubenswrapper[4702]: I1203 11:10:26.343785 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-74f4cdd6c8-9czrj" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.028193 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c8c89d49c-qrx84"] Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.051290 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6975dd785d-5bvc2"] Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.063214 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-74f4cdd6c8-9czrj"] Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.098890 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.101475 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.104892 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.105070 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.105698 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-b8i8cf7bl6cod" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.105874 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.106002 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.106148 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.106249 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-k9xd4" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.106480 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.106654 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.106776 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.108150 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.113198 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.118058 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.120580 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.287547 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.287617 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.287638 4702 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.287656 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-config\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.287677 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.287703 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.287724 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.287743 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.287781 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-web-config\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.287820 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqhkc\" (UniqueName: \"kubernetes.io/projected/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-kube-api-access-fqhkc\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.287898 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " 
pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.287923 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.287941 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.287957 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.287989 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.288009 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.288028 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.288045 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-config-out\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.389805 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.389875 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.389915 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.389944 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-config-out\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.389989 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.390035 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.390054 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.390071 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-config\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.390088 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.390112 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.390135 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"tls-assets\" (UniqueName: \"kubernetes.io/projected/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.390155 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.390200 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-web-config\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.390220 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqhkc\" (UniqueName: \"kubernetes.io/projected/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-kube-api-access-fqhkc\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.390246 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.390273 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.390288 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.390310 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.397451 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.399558 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.402482 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.403801 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.404655 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-web-config\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.409884 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.413347 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.413583 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.415025 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.415214 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.416152 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.417218 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-config-out\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.417553 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.417846 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.418629 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqhkc\" (UniqueName: \"kubernetes.io/projected/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-kube-api-access-fqhkc\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.419344 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-config\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.420047 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.429166 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7ad9fb7a-5481-4bb8-9a9b-99fda2021704-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"7ad9fb7a-5481-4bb8-9a9b-99fda2021704\") " pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.614244 4702 generic.go:334] "Generic (PLEG): container finished" podID="a5cd2e99-2280-4e03-8889-ab105522d3b5" containerID="e795b0a0f91b8f8d5a6716018bf68e29eba250ce834192c4e810a4278ef64a68" exitCode=0 Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.614343 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a5cd2e99-2280-4e03-8889-ab105522d3b5","Type":"ContainerDied","Data":"e795b0a0f91b8f8d5a6716018bf68e29eba250ce834192c4e810a4278ef64a68"} Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.724548 4702 util.go:30] "No 
Dec 03 11:10:27 crc kubenswrapper[4702]: I1203 11:10:27.724548 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Dec 03 11:10:28 crc kubenswrapper[4702]: W1203 11:10:28.161733 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf68ee0b8_cdfe_4aa0_9734_d910cab46b7a.slice/crio-f57948a1fefb1104e596ac466ed189802e9df9d622df774c3cf105fcbb713bf5 WatchSource:0}: Error finding container f57948a1fefb1104e596ac466ed189802e9df9d622df774c3cf105fcbb713bf5: Status 404 returned error can't find the container with id f57948a1fefb1104e596ac466ed189802e9df9d622df774c3cf105fcbb713bf5
Dec 03 11:10:28 crc kubenswrapper[4702]: W1203 11:10:28.163566 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56b7fbc7_795b_4a66_b427_6d4fd0cdf0ff.slice/crio-b7bff97906bf8f37a1e3caf53ec93eb33d9a9887023adb2cbcdbe00f397b8540 WatchSource:0}: Error finding container b7bff97906bf8f37a1e3caf53ec93eb33d9a9887023adb2cbcdbe00f397b8540: Status 404 returned error can't find the container with id b7bff97906bf8f37a1e3caf53ec93eb33d9a9887023adb2cbcdbe00f397b8540
Dec 03 11:10:28 crc kubenswrapper[4702]: W1203 11:10:28.163913 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ff9bb87_9ede_4d63_a2f5_5d1c49b30c29.slice/crio-1e3742f206a1dcd9f98e3afac98739c641909ecd07efa42d00421c619a22acbe WatchSource:0}: Error finding container 1e3742f206a1dcd9f98e3afac98739c641909ecd07efa42d00421c619a22acbe: Status 404 returned error can't find the container with id 1e3742f206a1dcd9f98e3afac98739c641909ecd07efa42d00421c619a22acbe
Dec 03 11:10:28 crc kubenswrapper[4702]: I1203 11:10:28.616004 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Dec 03 11:10:28 crc kubenswrapper[4702]: I1203 11:10:28.649931 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" event={"ID":"b2347d45-1235-4ed0-9f48-6e0eb4c781f8","Type":"ContainerStarted","Data":"0dbd2e8143cf78f6af52b15f3e7f7fdc6d4cb2b39bf5d756fec2473f92f07088"}
Dec 03 11:10:28 crc kubenswrapper[4702]: I1203 11:10:28.657117 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" event={"ID":"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff","Type":"ContainerStarted","Data":"b7bff97906bf8f37a1e3caf53ec93eb33d9a9887023adb2cbcdbe00f397b8540"}
Dec 03 11:10:28 crc kubenswrapper[4702]: I1203 11:10:28.657152 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c8c89d49c-qrx84" event={"ID":"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a","Type":"ContainerStarted","Data":"089aafbb59b346f54d2794308f8fb00a8fd248eea4340808a0c8eab008b8e4b0"}
Dec 03 11:10:28 crc kubenswrapper[4702]: I1203 11:10:28.657171 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c8c89d49c-qrx84" event={"ID":"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a","Type":"ContainerStarted","Data":"f57948a1fefb1104e596ac466ed189802e9df9d622df774c3cf105fcbb713bf5"}
Dec 03 11:10:28 crc kubenswrapper[4702]: I1203 11:10:28.665058 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-74f4cdd6c8-9czrj" event={"ID":"7ff9bb87-9ede-4d63-a2f5-5d1c49b30c29","Type":"ContainerStarted","Data":"1e3742f206a1dcd9f98e3afac98739c641909ecd07efa42d00421c619a22acbe"}
Dec 03 11:10:28 crc kubenswrapper[4702]: I1203 11:10:28.678380 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c8c89d49c-qrx84" podStartSLOduration=3.6783571889999997 podStartE2EDuration="3.678357189s" podCreationTimestamp="2025-12-03 11:10:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:10:28.678076461 +0000 UTC m=+412.514004945" watchObservedRunningTime="2025-12-03 11:10:28.678357189 +0000 UTC m=+412.514285653"
Dec 03 11:10:29 crc kubenswrapper[4702]: I1203 11:10:29.701619 4702 generic.go:334] "Generic (PLEG): container finished" podID="7ad9fb7a-5481-4bb8-9a9b-99fda2021704" containerID="021e6ba326b93e52765dfbf26209e5029d7ba75021e5e75eabf59a8a49465c39" exitCode=0
Dec 03 11:10:29 crc kubenswrapper[4702]: I1203 11:10:29.701725 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7ad9fb7a-5481-4bb8-9a9b-99fda2021704","Type":"ContainerDied","Data":"021e6ba326b93e52765dfbf26209e5029d7ba75021e5e75eabf59a8a49465c39"}
Dec 03 11:10:29 crc kubenswrapper[4702]: I1203 11:10:29.701800 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7ad9fb7a-5481-4bb8-9a9b-99fda2021704","Type":"ContainerStarted","Data":"8f1868786afd641ce19827d0aaea89b638da0e6be7bd5369b1fdcb8d39654053"}
Dec 03 11:10:29 crc kubenswrapper[4702]: I1203 11:10:29.711429 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" event={"ID":"b2347d45-1235-4ed0-9f48-6e0eb4c781f8","Type":"ContainerStarted","Data":"c89e1a4c9001c0424f139b835b6611bb249470c8d0b1cf51a80bea9e10f25ab4"}
Dec 03 11:10:29 crc kubenswrapper[4702]: I1203 11:10:29.711491 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" event={"ID":"b2347d45-1235-4ed0-9f48-6e0eb4c781f8","Type":"ContainerStarted","Data":"2c38c81a749375ecd3e023c9687ab98a9ab95f08bbfab5949e14bdc39dd1e2a7"}
Dec 03 11:10:32 crc kubenswrapper[4702]: I1203 11:10:32.739930 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" event={"ID":"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff","Type":"ContainerStarted","Data":"e36df22670a6d359cb2ddc47ec0abc22c2b0001877dbfb6610c77f76e758779e"}
Dec 03 11:10:32 crc kubenswrapper[4702]: I1203 11:10:32.751148 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a5cd2e99-2280-4e03-8889-ab105522d3b5","Type":"ContainerStarted","Data":"c787565523dea1255f064ff0308649c686311460f74cba76dd1a251f47c16635"}
Dec 03 11:10:32 crc kubenswrapper[4702]: I1203 11:10:32.751242 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a5cd2e99-2280-4e03-8889-ab105522d3b5","Type":"ContainerStarted","Data":"216627de9a1de0741042058287596a353ba927dd0cc659b2b5e257b701fdf6b4"}
Dec 03 11:10:32 crc kubenswrapper[4702]: I1203 11:10:32.751262 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a5cd2e99-2280-4e03-8889-ab105522d3b5","Type":"ContainerStarted","Data":"c7d7ebad3533787eea049305d23e30890e67273acc7dc5ed9e673edadfb44895"}
Dec 03 11:10:32 crc kubenswrapper[4702]: I1203 11:10:32.751305 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a5cd2e99-2280-4e03-8889-ab105522d3b5","Type":"ContainerStarted","Data":"e9f3fd065b3a6f1f57d5df2462f4c34b2e1c346ce43b460123aa3ce58592086b"}
Dec 03 11:10:32 crc kubenswrapper[4702]: I1203 11:10:32.754145 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-74f4cdd6c8-9czrj" event={"ID":"7ff9bb87-9ede-4d63-a2f5-5d1c49b30c29","Type":"ContainerStarted","Data":"532f31ed8eca608c60c8359982544897cf7039324daf92e0d7b059675685a5a9"}
Dec 03 11:10:32 crc kubenswrapper[4702]: I1203 11:10:32.754349 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-74f4cdd6c8-9czrj"
Dec 03 11:10:32 crc kubenswrapper[4702]: I1203 11:10:32.761959 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" event={"ID":"b2347d45-1235-4ed0-9f48-6e0eb4c781f8","Type":"ContainerStarted","Data":"67f4b18a3d32ecf37a12c91819d6459c1915f253cc5cb9fa7859e583d83cd0eb"}
Dec 03 11:10:32 crc kubenswrapper[4702]: I1203 11:10:32.762030 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" event={"ID":"b2347d45-1235-4ed0-9f48-6e0eb4c781f8","Type":"ContainerStarted","Data":"4bbc27a6332d35544074b6fda1a341c9534de3f1f5419e6f0203236899472dc4"}
Dec 03 11:10:32 crc kubenswrapper[4702]: I1203 11:10:32.762045 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" event={"ID":"b2347d45-1235-4ed0-9f48-6e0eb4c781f8","Type":"ContainerStarted","Data":"2286d34fa86ea4d74759d508c974f837eb523c6022a623e1afffd10845437cfb"}
Dec 03 11:10:32 crc kubenswrapper[4702]: I1203 11:10:32.762536 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl"
Dec 03 11:10:32 crc kubenswrapper[4702]: I1203 11:10:32.766293 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-74f4cdd6c8-9czrj"
Dec 03 11:10:32 crc kubenswrapper[4702]: I1203 11:10:32.770197 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" podStartSLOduration=4.168317059 podStartE2EDuration="7.770160897s" podCreationTimestamp="2025-12-03 11:10:25 +0000 UTC" firstStartedPulling="2025-12-03 11:10:28.168583752 +0000 UTC m=+412.004512216" lastFinishedPulling="2025-12-03 11:10:31.77042759 +0000 UTC m=+415.606356054" observedRunningTime="2025-12-03 11:10:32.766189951 +0000 UTC m=+416.602118435" watchObservedRunningTime="2025-12-03 11:10:32.770160897 +0000 UTC m=+416.606089361"
Dec 03 11:10:32 crc kubenswrapper[4702]: I1203 11:10:32.801858 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-74f4cdd6c8-9czrj" podStartSLOduration=4.19996211 podStartE2EDuration="7.801813977s" podCreationTimestamp="2025-12-03 11:10:25 +0000 UTC" firstStartedPulling="2025-12-03 11:10:28.168700336 +0000 UTC m=+412.004628800" lastFinishedPulling="2025-12-03 11:10:31.770552193 +0000 UTC m=+415.606480667" observedRunningTime="2025-12-03 11:10:32.793260898 +0000 UTC m=+416.629189372" watchObservedRunningTime="2025-12-03 11:10:32.801813977 +0000 UTC m=+416.637742441"
Dec 03 11:10:32 crc kubenswrapper[4702]: I1203 11:10:32.844923 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" podStartSLOduration=3.005336024
podStartE2EDuration="10.844895019s" podCreationTimestamp="2025-12-03 11:10:22 +0000 UTC" firstStartedPulling="2025-12-03 11:10:24.001884909 +0000 UTC m=+407.837813373" lastFinishedPulling="2025-12-03 11:10:31.841443904 +0000 UTC m=+415.677372368" observedRunningTime="2025-12-03 11:10:32.834326082 +0000 UTC m=+416.670254546" watchObservedRunningTime="2025-12-03 11:10:32.844895019 +0000 UTC m=+416.680823483" Dec 03 11:10:33 crc kubenswrapper[4702]: I1203 11:10:33.776797 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a5cd2e99-2280-4e03-8889-ab105522d3b5","Type":"ContainerStarted","Data":"b9cca03d41b745c3fc72a3723220f6bbc713eec5d5c53bcde9ec7bb7636e86f5"} Dec 03 11:10:33 crc kubenswrapper[4702]: I1203 11:10:33.776878 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a5cd2e99-2280-4e03-8889-ab105522d3b5","Type":"ContainerStarted","Data":"3dc03a4cfb9987004417158b4303ff64c78bd525a8527392304ab4604beacb76"} Dec 03 11:10:33 crc kubenswrapper[4702]: I1203 11:10:33.789609 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" Dec 03 11:10:33 crc kubenswrapper[4702]: I1203 11:10:33.816301 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.880800824 podStartE2EDuration="12.816258131s" podCreationTimestamp="2025-12-03 11:10:21 +0000 UTC" firstStartedPulling="2025-12-03 11:10:22.835030194 +0000 UTC m=+406.670958658" lastFinishedPulling="2025-12-03 11:10:31.770487501 +0000 UTC m=+415.606415965" observedRunningTime="2025-12-03 11:10:33.813086769 +0000 UTC m=+417.649015243" watchObservedRunningTime="2025-12-03 11:10:33.816258131 +0000 UTC m=+417.652186595" Dec 03 11:10:35 crc kubenswrapper[4702]: I1203 11:10:35.394274 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:35 crc kubenswrapper[4702]: I1203 11:10:35.396932 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:35 crc kubenswrapper[4702]: I1203 11:10:35.407577 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:35 crc kubenswrapper[4702]: I1203 11:10:35.801233 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:10:35 crc kubenswrapper[4702]: I1203 11:10:35.863130 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ccdtg"] Dec 03 11:10:36 crc kubenswrapper[4702]: I1203 11:10:36.833811 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7ad9fb7a-5481-4bb8-9a9b-99fda2021704","Type":"ContainerStarted","Data":"c44e18cce3f9eb37bc30bd66b99a5ca70e480154d92c0b5a04cd3a76d807386e"} Dec 03 11:10:36 crc kubenswrapper[4702]: I1203 11:10:36.834332 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7ad9fb7a-5481-4bb8-9a9b-99fda2021704","Type":"ContainerStarted","Data":"bf23a920921565d9dfebc529cd5102cf11aa7638cef0ed31acd73f2b0286300d"} Dec 03 11:10:36 crc kubenswrapper[4702]: I1203 11:10:36.834346 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7ad9fb7a-5481-4bb8-9a9b-99fda2021704","Type":"ContainerStarted","Data":"e291f25adbf5dcd53038c22043006bc14a0984b96ee6d736e32a4c32db3a826a"} Dec 03 11:10:36 crc kubenswrapper[4702]: I1203 11:10:36.834358 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7ad9fb7a-5481-4bb8-9a9b-99fda2021704","Type":"ContainerStarted","Data":"90ed8476f363350dba5acc13732c8116320255eb71bde5a9586fd2e35d06f833"} Dec 03 11:10:36 crc kubenswrapper[4702]: I1203 11:10:36.834368 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7ad9fb7a-5481-4bb8-9a9b-99fda2021704","Type":"ContainerStarted","Data":"57eed73f6a1602742b53d92bb463e26afa819648023f2c4b19c401aeaeb93e4d"} Dec 03 11:10:37 crc kubenswrapper[4702]: I1203 11:10:37.845154 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7ad9fb7a-5481-4bb8-9a9b-99fda2021704","Type":"ContainerStarted","Data":"1f5b258f0fb5fc6637e7c110d103d2cb1da239d3fc25ac813ac72d7330418d5d"} Dec 03 11:10:37 crc kubenswrapper[4702]: I1203 11:10:37.884219 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.836663236 podStartE2EDuration="10.884193035s" podCreationTimestamp="2025-12-03 11:10:27 +0000 UTC" firstStartedPulling="2025-12-03 11:10:29.706403679 +0000 UTC m=+413.542332143" lastFinishedPulling="2025-12-03 11:10:35.753933478 +0000 UTC m=+419.589861942" observedRunningTime="2025-12-03 11:10:37.879804567 +0000 UTC m=+421.715733031" watchObservedRunningTime="2025-12-03 11:10:37.884193035 +0000 UTC m=+421.720121499" Dec 03 11:10:42 crc kubenswrapper[4702]: I1203 11:10:42.725868 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:10:45 crc kubenswrapper[4702]: I1203 11:10:45.865634 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:45 crc kubenswrapper[4702]: I1203 11:10:45.866193 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:10:55 crc kubenswrapper[4702]: I1203 11:10:55.907799 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:10:55 crc kubenswrapper[4702]: I1203 11:10:55.908785 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:10:55 crc kubenswrapper[4702]: I1203 11:10:55.908882 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:10:55 crc kubenswrapper[4702]: I1203 11:10:55.910783 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1fc7c6ca3be1bf16313736990d6f512ba61818b84fe4574e5b246fb1305c4999"} 
pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 11:10:55 crc kubenswrapper[4702]: I1203 11:10:55.910894 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" containerID="cri-o://1fc7c6ca3be1bf16313736990d6f512ba61818b84fe4574e5b246fb1305c4999" gracePeriod=600 Dec 03 11:10:58 crc kubenswrapper[4702]: I1203 11:10:58.011367 4702 generic.go:334] "Generic (PLEG): container finished" podID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerID="1fc7c6ca3be1bf16313736990d6f512ba61818b84fe4574e5b246fb1305c4999" exitCode=0 Dec 03 11:10:58 crc kubenswrapper[4702]: I1203 11:10:58.011927 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerDied","Data":"1fc7c6ca3be1bf16313736990d6f512ba61818b84fe4574e5b246fb1305c4999"} Dec 03 11:10:58 crc kubenswrapper[4702]: I1203 11:10:58.011991 4702 scope.go:117] "RemoveContainer" containerID="31e120ed1aa40598cd350fd2870e06bfea7350d701f9519839c58fa7482f41a4" Dec 03 11:11:00 crc kubenswrapper[4702]: I1203 11:11:00.918053 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-ccdtg" podUID="761a509d-5cb8-4506-901c-614a7d633d39" containerName="console" containerID="cri-o://f6c837c7d8947d2404a52599d5b716ddf7436fae795ff90907d36691abed95d2" gracePeriod=15 Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.047553 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"58c8f87f05ee64e1d869e097387b492d719b507df8e2586f56cddcf068af149d"} Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.050315 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ccdtg_761a509d-5cb8-4506-901c-614a7d633d39/console/0.log" Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.050394 4702 generic.go:334] "Generic (PLEG): container finished" podID="761a509d-5cb8-4506-901c-614a7d633d39" containerID="f6c837c7d8947d2404a52599d5b716ddf7436fae795ff90907d36691abed95d2" exitCode=2 Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.050449 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ccdtg" event={"ID":"761a509d-5cb8-4506-901c-614a7d633d39","Type":"ContainerDied","Data":"f6c837c7d8947d2404a52599d5b716ddf7436fae795ff90907d36691abed95d2"} Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.314229 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ccdtg_761a509d-5cb8-4506-901c-614a7d633d39/console/0.log" Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.314335 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.493055 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/761a509d-5cb8-4506-901c-614a7d633d39-console-serving-cert\") pod \"761a509d-5cb8-4506-901c-614a7d633d39\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.493154 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-trusted-ca-bundle\") pod \"761a509d-5cb8-4506-901c-614a7d633d39\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.493180 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-oauth-serving-cert\") pod \"761a509d-5cb8-4506-901c-614a7d633d39\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.493273 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-console-config\") pod \"761a509d-5cb8-4506-901c-614a7d633d39\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.493311 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rb9v\" (UniqueName: \"kubernetes.io/projected/761a509d-5cb8-4506-901c-614a7d633d39-kube-api-access-5rb9v\") pod \"761a509d-5cb8-4506-901c-614a7d633d39\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.493340 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/761a509d-5cb8-4506-901c-614a7d633d39-console-oauth-config\") pod \"761a509d-5cb8-4506-901c-614a7d633d39\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.493405 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-service-ca\") pod \"761a509d-5cb8-4506-901c-614a7d633d39\" (UID: \"761a509d-5cb8-4506-901c-614a7d633d39\") " Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.494216 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "761a509d-5cb8-4506-901c-614a7d633d39" (UID: "761a509d-5cb8-4506-901c-614a7d633d39"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.494256 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-service-ca" (OuterVolumeSpecName: "service-ca") pod "761a509d-5cb8-4506-901c-614a7d633d39" (UID: "761a509d-5cb8-4506-901c-614a7d633d39"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.494238 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "761a509d-5cb8-4506-901c-614a7d633d39" (UID: "761a509d-5cb8-4506-901c-614a7d633d39"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.494418 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-console-config" (OuterVolumeSpecName: "console-config") pod "761a509d-5cb8-4506-901c-614a7d633d39" (UID: "761a509d-5cb8-4506-901c-614a7d633d39"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.500117 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761a509d-5cb8-4506-901c-614a7d633d39-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "761a509d-5cb8-4506-901c-614a7d633d39" (UID: "761a509d-5cb8-4506-901c-614a7d633d39"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.500259 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/761a509d-5cb8-4506-901c-614a7d633d39-kube-api-access-5rb9v" (OuterVolumeSpecName: "kube-api-access-5rb9v") pod "761a509d-5cb8-4506-901c-614a7d633d39" (UID: "761a509d-5cb8-4506-901c-614a7d633d39"). InnerVolumeSpecName "kube-api-access-5rb9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.500539 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/761a509d-5cb8-4506-901c-614a7d633d39-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "761a509d-5cb8-4506-901c-614a7d633d39" (UID: "761a509d-5cb8-4506-901c-614a7d633d39"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.595504 4702 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/761a509d-5cb8-4506-901c-614a7d633d39-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.595610 4702 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.595627 4702 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/761a509d-5cb8-4506-901c-614a7d633d39-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.595643 4702 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.595658 4702 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.595676 4702 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/761a509d-5cb8-4506-901c-614a7d633d39-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:11:01 crc kubenswrapper[4702]: I1203 11:11:01.595688 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rb9v\" (UniqueName: \"kubernetes.io/projected/761a509d-5cb8-4506-901c-614a7d633d39-kube-api-access-5rb9v\") on node \"crc\" DevicePath \"\"" Dec 03 11:11:02 crc kubenswrapper[4702]: I1203 11:11:02.060076 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ccdtg_761a509d-5cb8-4506-901c-614a7d633d39/console/0.log" Dec 03 11:11:02 crc kubenswrapper[4702]: I1203 11:11:02.060567 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ccdtg" event={"ID":"761a509d-5cb8-4506-901c-614a7d633d39","Type":"ContainerDied","Data":"81b49c145d916548d1b7a2037d78495ddc381bc2e200abb5a89d121d0f50caf2"} Dec 03 11:11:02 crc kubenswrapper[4702]: I1203 11:11:02.060627 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ccdtg" Dec 03 11:11:02 crc kubenswrapper[4702]: I1203 11:11:02.060704 4702 scope.go:117] "RemoveContainer" containerID="f6c837c7d8947d2404a52599d5b716ddf7436fae795ff90907d36691abed95d2" Dec 03 11:11:02 crc kubenswrapper[4702]: I1203 11:11:02.103875 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ccdtg"] Dec 03 11:11:02 crc kubenswrapper[4702]: I1203 11:11:02.109181 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-ccdtg"] Dec 03 11:11:02 crc kubenswrapper[4702]: I1203 11:11:02.937847 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="761a509d-5cb8-4506-901c-614a7d633d39" path="/var/lib/kubelet/pods/761a509d-5cb8-4506-901c-614a7d633d39/volumes" Dec 03 11:11:05 crc kubenswrapper[4702]: I1203 11:11:05.873528 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:11:05 crc kubenswrapper[4702]: I1203 11:11:05.882050 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 11:11:27 crc kubenswrapper[4702]: I1203 11:11:27.726481 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:11:27 crc kubenswrapper[4702]: I1203 11:11:27.771323 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:11:28 crc kubenswrapper[4702]: I1203 11:11:28.355033 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.575070 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-64d5fd7569-jhhw8"] Dec 03 11:12:19 crc kubenswrapper[4702]: E1203 11:12:19.575983 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761a509d-5cb8-4506-901c-614a7d633d39" containerName="console" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.575998 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="761a509d-5cb8-4506-901c-614a7d633d39" containerName="console" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.576136 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="761a509d-5cb8-4506-901c-614a7d633d39" containerName="console" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.576667 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.598497 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64d5fd7569-jhhw8"] Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.727423 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-trusted-ca-bundle\") pod \"console-64d5fd7569-jhhw8\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.728320 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-console-config\") pod \"console-64d5fd7569-jhhw8\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.728397 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-service-ca\") pod \"console-64d5fd7569-jhhw8\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.728449 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f940bfbe-1e27-433f-836b-7b542814b39d-console-serving-cert\") pod \"console-64d5fd7569-jhhw8\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.728481 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8ffq\" (UniqueName: \"kubernetes.io/projected/f940bfbe-1e27-433f-836b-7b542814b39d-kube-api-access-f8ffq\") pod \"console-64d5fd7569-jhhw8\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.728783 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f940bfbe-1e27-433f-836b-7b542814b39d-console-oauth-config\") pod \"console-64d5fd7569-jhhw8\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.728928 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-oauth-serving-cert\") pod \"console-64d5fd7569-jhhw8\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.831378 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-trusted-ca-bundle\") pod \"console-64d5fd7569-jhhw8\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:19 crc 
kubenswrapper[4702]: I1203 11:12:19.831491 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-console-config\") pod \"console-64d5fd7569-jhhw8\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.831554 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-service-ca\") pod \"console-64d5fd7569-jhhw8\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.831597 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f940bfbe-1e27-433f-836b-7b542814b39d-console-serving-cert\") pod \"console-64d5fd7569-jhhw8\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.831679 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8ffq\" (UniqueName: \"kubernetes.io/projected/f940bfbe-1e27-433f-836b-7b542814b39d-kube-api-access-f8ffq\") pod \"console-64d5fd7569-jhhw8\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.831727 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f940bfbe-1e27-433f-836b-7b542814b39d-console-oauth-config\") pod \"console-64d5fd7569-jhhw8\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.831780 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-oauth-serving-cert\") pod \"console-64d5fd7569-jhhw8\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.833252 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-oauth-serving-cert\") pod \"console-64d5fd7569-jhhw8\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.833966 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-trusted-ca-bundle\") pod \"console-64d5fd7569-jhhw8\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.833979 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-service-ca\") pod \"console-64d5fd7569-jhhw8\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 
11:12:19.834486 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-console-config\") pod \"console-64d5fd7569-jhhw8\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.841885 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f940bfbe-1e27-433f-836b-7b542814b39d-console-serving-cert\") pod \"console-64d5fd7569-jhhw8\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.848256 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f940bfbe-1e27-433f-836b-7b542814b39d-console-oauth-config\") pod \"console-64d5fd7569-jhhw8\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.855864 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8ffq\" (UniqueName: \"kubernetes.io/projected/f940bfbe-1e27-433f-836b-7b542814b39d-kube-api-access-f8ffq\") pod \"console-64d5fd7569-jhhw8\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:19 crc kubenswrapper[4702]: I1203 11:12:19.903060 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:20 crc kubenswrapper[4702]: I1203 11:12:20.182454 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64d5fd7569-jhhw8"] Dec 03 11:12:20 crc kubenswrapper[4702]: I1203 11:12:20.806865 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d5fd7569-jhhw8" event={"ID":"f940bfbe-1e27-433f-836b-7b542814b39d","Type":"ContainerStarted","Data":"f6fd35002b7ef3f6fe805635190adc561b3c2692c9a3aae9c1f52177e206dda6"} Dec 03 11:12:20 crc kubenswrapper[4702]: I1203 11:12:20.806954 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d5fd7569-jhhw8" event={"ID":"f940bfbe-1e27-433f-836b-7b542814b39d","Type":"ContainerStarted","Data":"5da2e7d2ef24badc1d4011e3eac9c7e3516f7b909434426cc0ac867740089c96"} Dec 03 11:12:20 crc kubenswrapper[4702]: I1203 11:12:20.840529 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64d5fd7569-jhhw8" podStartSLOduration=1.840488254 podStartE2EDuration="1.840488254s" podCreationTimestamp="2025-12-03 11:12:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:12:20.831563468 +0000 UTC m=+524.667491942" watchObservedRunningTime="2025-12-03 11:12:20.840488254 +0000 UTC m=+524.676416718" Dec 03 11:12:29 crc kubenswrapper[4702]: I1203 11:12:29.904696 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:29 crc kubenswrapper[4702]: I1203 11:12:29.905265 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:29 crc kubenswrapper[4702]: I1203 11:12:29.909977 4702 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:30 crc kubenswrapper[4702]: I1203 11:12:30.884959 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:12:30 crc kubenswrapper[4702]: I1203 11:12:30.974019 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c8c89d49c-qrx84"] Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.044681 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-c8c89d49c-qrx84" podUID="f68ee0b8-cdfe-4aa0-9734-d910cab46b7a" containerName="console" containerID="cri-o://089aafbb59b346f54d2794308f8fb00a8fd248eea4340808a0c8eab008b8e4b0" gracePeriod=15 Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.401223 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c8c89d49c-qrx84_f68ee0b8-cdfe-4aa0-9734-d910cab46b7a/console/0.log" Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.401326 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.431625 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-console-oauth-config\") pod \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.431739 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-console-config\") pod \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.431788 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-console-serving-cert\") pod \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.431940 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vczgp\" (UniqueName: \"kubernetes.io/projected/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-kube-api-access-vczgp\") pod \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.432003 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-service-ca\") pod \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.432044 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-oauth-serving-cert\") pod \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.432085 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-trusted-ca-bundle\") pod \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\" (UID: \"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a\") " Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.433512 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f68ee0b8-cdfe-4aa0-9734-d910cab46b7a" (UID: "f68ee0b8-cdfe-4aa0-9734-d910cab46b7a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.433921 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-console-config" (OuterVolumeSpecName: "console-config") pod "f68ee0b8-cdfe-4aa0-9734-d910cab46b7a" (UID: "f68ee0b8-cdfe-4aa0-9734-d910cab46b7a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.433954 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-service-ca" (OuterVolumeSpecName: "service-ca") pod "f68ee0b8-cdfe-4aa0-9734-d910cab46b7a" (UID: "f68ee0b8-cdfe-4aa0-9734-d910cab46b7a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.435304 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f68ee0b8-cdfe-4aa0-9734-d910cab46b7a" (UID: "f68ee0b8-cdfe-4aa0-9734-d910cab46b7a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.439256 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-kube-api-access-vczgp" (OuterVolumeSpecName: "kube-api-access-vczgp") pod "f68ee0b8-cdfe-4aa0-9734-d910cab46b7a" (UID: "f68ee0b8-cdfe-4aa0-9734-d910cab46b7a"). InnerVolumeSpecName "kube-api-access-vczgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.440944 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f68ee0b8-cdfe-4aa0-9734-d910cab46b7a" (UID: "f68ee0b8-cdfe-4aa0-9734-d910cab46b7a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.443982 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f68ee0b8-cdfe-4aa0-9734-d910cab46b7a" (UID: "f68ee0b8-cdfe-4aa0-9734-d910cab46b7a"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.534585 4702 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.534668 4702 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.534680 4702 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.534689 4702 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.534700 4702 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.534714 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vczgp\" (UniqueName: \"kubernetes.io/projected/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-kube-api-access-vczgp\") on node \"crc\" DevicePath \"\"" Dec 03 11:12:56 crc kubenswrapper[4702]: I1203 11:12:56.534727 4702 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 11:12:57 crc kubenswrapper[4702]: I1203 11:12:57.109563 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c8c89d49c-qrx84_f68ee0b8-cdfe-4aa0-9734-d910cab46b7a/console/0.log" Dec 03 11:12:57 crc kubenswrapper[4702]: I1203 11:12:57.109635 4702 generic.go:334] "Generic (PLEG): container finished" podID="f68ee0b8-cdfe-4aa0-9734-d910cab46b7a" containerID="089aafbb59b346f54d2794308f8fb00a8fd248eea4340808a0c8eab008b8e4b0" exitCode=2 Dec 03 11:12:57 crc kubenswrapper[4702]: I1203 11:12:57.109680 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c8c89d49c-qrx84" event={"ID":"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a","Type":"ContainerDied","Data":"089aafbb59b346f54d2794308f8fb00a8fd248eea4340808a0c8eab008b8e4b0"} Dec 03 11:12:57 crc kubenswrapper[4702]: I1203 11:12:57.109726 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c8c89d49c-qrx84" event={"ID":"f68ee0b8-cdfe-4aa0-9734-d910cab46b7a","Type":"ContainerDied","Data":"f57948a1fefb1104e596ac466ed189802e9df9d622df774c3cf105fcbb713bf5"} Dec 03 11:12:57 crc kubenswrapper[4702]: I1203 11:12:57.109772 4702 scope.go:117] "RemoveContainer" containerID="089aafbb59b346f54d2794308f8fb00a8fd248eea4340808a0c8eab008b8e4b0" Dec 03 11:12:57 crc kubenswrapper[4702]: I1203 11:12:57.109792 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c8c89d49c-qrx84" Dec 03 11:12:57 crc kubenswrapper[4702]: I1203 11:12:57.136539 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c8c89d49c-qrx84"] Dec 03 11:12:57 crc kubenswrapper[4702]: I1203 11:12:57.141028 4702 scope.go:117] "RemoveContainer" containerID="089aafbb59b346f54d2794308f8fb00a8fd248eea4340808a0c8eab008b8e4b0" Dec 03 11:12:57 crc kubenswrapper[4702]: E1203 11:12:57.141821 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"089aafbb59b346f54d2794308f8fb00a8fd248eea4340808a0c8eab008b8e4b0\": container with ID starting with 089aafbb59b346f54d2794308f8fb00a8fd248eea4340808a0c8eab008b8e4b0 not found: ID does not exist" containerID="089aafbb59b346f54d2794308f8fb00a8fd248eea4340808a0c8eab008b8e4b0" Dec 03 11:12:57 crc kubenswrapper[4702]: I1203 11:12:57.141876 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"089aafbb59b346f54d2794308f8fb00a8fd248eea4340808a0c8eab008b8e4b0"} err="failed to get container status \"089aafbb59b346f54d2794308f8fb00a8fd248eea4340808a0c8eab008b8e4b0\": rpc error: code = NotFound desc = could not find container \"089aafbb59b346f54d2794308f8fb00a8fd248eea4340808a0c8eab008b8e4b0\": container with ID starting with 089aafbb59b346f54d2794308f8fb00a8fd248eea4340808a0c8eab008b8e4b0 not found: ID does not exist" Dec 03 11:12:57 crc kubenswrapper[4702]: I1203 11:12:57.142118 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c8c89d49c-qrx84"] Dec 03 11:12:58 crc kubenswrapper[4702]: I1203 11:12:58.939408 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f68ee0b8-cdfe-4aa0-9734-d910cab46b7a" path="/var/lib/kubelet/pods/f68ee0b8-cdfe-4aa0-9734-d910cab46b7a/volumes" Dec 03 11:13:25 crc kubenswrapper[4702]: I1203 11:13:25.908918 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:13:25 crc kubenswrapper[4702]: I1203 11:13:25.909883 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:13:55 crc kubenswrapper[4702]: I1203 11:13:55.908337 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:13:55 crc kubenswrapper[4702]: I1203 11:13:55.909245 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:14:25 crc kubenswrapper[4702]: I1203 11:14:25.907974 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 11:14:25 crc kubenswrapper[4702]: I1203 11:14:25.908854 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 11:14:25 crc kubenswrapper[4702]: I1203 11:14:25.908924 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd"
Dec 03 11:14:25 crc kubenswrapper[4702]: I1203 11:14:25.909902 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"58c8f87f05ee64e1d869e097387b492d719b507df8e2586f56cddcf068af149d"} pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 03 11:14:25 crc kubenswrapper[4702]: I1203 11:14:25.909975 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" containerID="cri-o://58c8f87f05ee64e1d869e097387b492d719b507df8e2586f56cddcf068af149d" gracePeriod=600
Dec 03 11:14:26 crc kubenswrapper[4702]: I1203 11:14:26.912088 4702 generic.go:334] "Generic (PLEG): container finished" podID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerID="58c8f87f05ee64e1d869e097387b492d719b507df8e2586f56cddcf068af149d" exitCode=0
Dec 03 11:14:26 crc kubenswrapper[4702]: I1203 11:14:26.912282 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerDied","Data":"58c8f87f05ee64e1d869e097387b492d719b507df8e2586f56cddcf068af149d"}
Dec 03 11:14:26 crc kubenswrapper[4702]: I1203 11:14:26.912849 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"77511b1fe86d0b6a8fc57584220c3eed3f6f18f178f53eb56a078f9c63c86b1e"}
Dec 03 11:14:26 crc kubenswrapper[4702]: I1203 11:14:26.912882 4702 scope.go:117] "RemoveContainer" containerID="1fc7c6ca3be1bf16313736990d6f512ba61818b84fe4574e5b246fb1305c4999"
Dec 03 11:15:00 crc kubenswrapper[4702]: I1203 11:15:00.179274 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412675-hzxt6"]
Dec 03 11:15:00 crc kubenswrapper[4702]: E1203 11:15:00.180525 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f68ee0b8-cdfe-4aa0-9734-d910cab46b7a" containerName="console"
Dec 03 11:15:00 crc kubenswrapper[4702]: I1203 11:15:00.180545 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="f68ee0b8-cdfe-4aa0-9734-d910cab46b7a" containerName="console"
Dec 03 11:15:00 crc kubenswrapper[4702]: I1203 11:15:00.180670 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="f68ee0b8-cdfe-4aa0-9734-d910cab46b7a" containerName="console"
Dec 03 11:15:00 crc kubenswrapper[4702]: I1203 11:15:00.181547 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412675-hzxt6"
Dec 03 11:15:00 crc kubenswrapper[4702]: I1203 11:15:00.185613 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 03 11:15:00 crc kubenswrapper[4702]: I1203 11:15:00.185670 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 03 11:15:00 crc kubenswrapper[4702]: I1203 11:15:00.198307 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412675-hzxt6"]
Dec 03 11:15:00 crc kubenswrapper[4702]: I1203 11:15:00.281551 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da6b4602-6381-43c2-bc7b-df1f2ade0083-config-volume\") pod \"collect-profiles-29412675-hzxt6\" (UID: \"da6b4602-6381-43c2-bc7b-df1f2ade0083\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412675-hzxt6"
Dec 03 11:15:00 crc kubenswrapper[4702]: I1203 11:15:00.281719 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx8kd\" (UniqueName: \"kubernetes.io/projected/da6b4602-6381-43c2-bc7b-df1f2ade0083-kube-api-access-rx8kd\") pod \"collect-profiles-29412675-hzxt6\" (UID: \"da6b4602-6381-43c2-bc7b-df1f2ade0083\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412675-hzxt6"
Dec 03 11:15:00 crc kubenswrapper[4702]: I1203 11:15:00.281807 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da6b4602-6381-43c2-bc7b-df1f2ade0083-secret-volume\") pod \"collect-profiles-29412675-hzxt6\" (UID: \"da6b4602-6381-43c2-bc7b-df1f2ade0083\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412675-hzxt6"
Dec 03 11:15:00 crc kubenswrapper[4702]: I1203 11:15:00.383000 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da6b4602-6381-43c2-bc7b-df1f2ade0083-config-volume\") pod \"collect-profiles-29412675-hzxt6\" (UID: \"da6b4602-6381-43c2-bc7b-df1f2ade0083\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412675-hzxt6"
Dec 03 11:15:00 crc kubenswrapper[4702]: I1203 11:15:00.383119 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx8kd\" (UniqueName: \"kubernetes.io/projected/da6b4602-6381-43c2-bc7b-df1f2ade0083-kube-api-access-rx8kd\") pod \"collect-profiles-29412675-hzxt6\" (UID: \"da6b4602-6381-43c2-bc7b-df1f2ade0083\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412675-hzxt6"
Dec 03 11:15:00 crc kubenswrapper[4702]: I1203 11:15:00.383176 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da6b4602-6381-43c2-bc7b-df1f2ade0083-secret-volume\") pod \"collect-profiles-29412675-hzxt6\" (UID: \"da6b4602-6381-43c2-bc7b-df1f2ade0083\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412675-hzxt6"
Dec 03 11:15:00 crc kubenswrapper[4702]: I1203 11:15:00.384461 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da6b4602-6381-43c2-bc7b-df1f2ade0083-config-volume\") pod \"collect-profiles-29412675-hzxt6\" (UID: \"da6b4602-6381-43c2-bc7b-df1f2ade0083\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412675-hzxt6"
Dec 03 11:15:00 crc kubenswrapper[4702]: I1203 11:15:00.392857 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da6b4602-6381-43c2-bc7b-df1f2ade0083-secret-volume\") pod \"collect-profiles-29412675-hzxt6\" (UID: \"da6b4602-6381-43c2-bc7b-df1f2ade0083\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412675-hzxt6"
Dec 03 11:15:00 crc kubenswrapper[4702]: I1203 11:15:00.403376 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx8kd\" (UniqueName: \"kubernetes.io/projected/da6b4602-6381-43c2-bc7b-df1f2ade0083-kube-api-access-rx8kd\") pod \"collect-profiles-29412675-hzxt6\" (UID: \"da6b4602-6381-43c2-bc7b-df1f2ade0083\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412675-hzxt6"
Dec 03 11:15:00 crc kubenswrapper[4702]: I1203 11:15:00.510280 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412675-hzxt6"
Dec 03 11:15:00 crc kubenswrapper[4702]: I1203 11:15:00.744277 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412675-hzxt6"]
Dec 03 11:15:01 crc kubenswrapper[4702]: I1203 11:15:01.158564 4702 generic.go:334] "Generic (PLEG): container finished" podID="da6b4602-6381-43c2-bc7b-df1f2ade0083" containerID="7d50aa7b81f8a3da6bb3a35c53f74ba623b7253fe88bdba60e529c3182452457" exitCode=0
Dec 03 11:15:01 crc kubenswrapper[4702]: I1203 11:15:01.158811 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412675-hzxt6" event={"ID":"da6b4602-6381-43c2-bc7b-df1f2ade0083","Type":"ContainerDied","Data":"7d50aa7b81f8a3da6bb3a35c53f74ba623b7253fe88bdba60e529c3182452457"}
Dec 03 11:15:01 crc kubenswrapper[4702]: I1203 11:15:01.159429 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412675-hzxt6" event={"ID":"da6b4602-6381-43c2-bc7b-df1f2ade0083","Type":"ContainerStarted","Data":"26c951a0a2a6290f2c30456e938c1891afd8bcc4fb2fec041f8a3e9dad85b160"}
Dec 03 11:15:02 crc kubenswrapper[4702]: I1203 11:15:02.407327 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412675-hzxt6"
Dec 03 11:15:02 crc kubenswrapper[4702]: I1203 11:15:02.522430 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da6b4602-6381-43c2-bc7b-df1f2ade0083-secret-volume\") pod \"da6b4602-6381-43c2-bc7b-df1f2ade0083\" (UID: \"da6b4602-6381-43c2-bc7b-df1f2ade0083\") "
Dec 03 11:15:02 crc kubenswrapper[4702]: I1203 11:15:02.522570 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx8kd\" (UniqueName: \"kubernetes.io/projected/da6b4602-6381-43c2-bc7b-df1f2ade0083-kube-api-access-rx8kd\") pod \"da6b4602-6381-43c2-bc7b-df1f2ade0083\" (UID: \"da6b4602-6381-43c2-bc7b-df1f2ade0083\") "
Dec 03 11:15:02 crc kubenswrapper[4702]: I1203 11:15:02.522627 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da6b4602-6381-43c2-bc7b-df1f2ade0083-config-volume\") pod \"da6b4602-6381-43c2-bc7b-df1f2ade0083\" (UID: \"da6b4602-6381-43c2-bc7b-df1f2ade0083\") "
Dec 03 11:15:02 crc kubenswrapper[4702]: I1203 11:15:02.523362 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da6b4602-6381-43c2-bc7b-df1f2ade0083-config-volume" (OuterVolumeSpecName: "config-volume") pod "da6b4602-6381-43c2-bc7b-df1f2ade0083" (UID: "da6b4602-6381-43c2-bc7b-df1f2ade0083"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:15:02 crc kubenswrapper[4702]: I1203 11:15:02.523676 4702 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da6b4602-6381-43c2-bc7b-df1f2ade0083-config-volume\") on node \"crc\" DevicePath \"\""
Dec 03 11:15:02 crc kubenswrapper[4702]: I1203 11:15:02.530637 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da6b4602-6381-43c2-bc7b-df1f2ade0083-kube-api-access-rx8kd" (OuterVolumeSpecName: "kube-api-access-rx8kd") pod "da6b4602-6381-43c2-bc7b-df1f2ade0083" (UID: "da6b4602-6381-43c2-bc7b-df1f2ade0083"). InnerVolumeSpecName "kube-api-access-rx8kd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:15:02 crc kubenswrapper[4702]: I1203 11:15:02.534096 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da6b4602-6381-43c2-bc7b-df1f2ade0083-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "da6b4602-6381-43c2-bc7b-df1f2ade0083" (UID: "da6b4602-6381-43c2-bc7b-df1f2ade0083"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:15:02 crc kubenswrapper[4702]: I1203 11:15:02.624814 4702 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/da6b4602-6381-43c2-bc7b-df1f2ade0083-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 03 11:15:02 crc kubenswrapper[4702]: I1203 11:15:02.624886 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx8kd\" (UniqueName: \"kubernetes.io/projected/da6b4602-6381-43c2-bc7b-df1f2ade0083-kube-api-access-rx8kd\") on node \"crc\" DevicePath \"\""
Dec 03 11:15:03 crc kubenswrapper[4702]: I1203 11:15:03.176417 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412675-hzxt6" event={"ID":"da6b4602-6381-43c2-bc7b-df1f2ade0083","Type":"ContainerDied","Data":"26c951a0a2a6290f2c30456e938c1891afd8bcc4fb2fec041f8a3e9dad85b160"}
Dec 03 11:15:03 crc kubenswrapper[4702]: I1203 11:15:03.176480 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26c951a0a2a6290f2c30456e938c1891afd8bcc4fb2fec041f8a3e9dad85b160"
Dec 03 11:15:03 crc kubenswrapper[4702]: I1203 11:15:03.176605 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412675-hzxt6"
Dec 03 11:15:21 crc kubenswrapper[4702]: I1203 11:15:21.089911 4702 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 03 11:15:29 crc kubenswrapper[4702]: I1203 11:15:29.639158 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9"]
Dec 03 11:15:29 crc kubenswrapper[4702]: E1203 11:15:29.640285 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da6b4602-6381-43c2-bc7b-df1f2ade0083" containerName="collect-profiles"
Dec 03 11:15:29 crc kubenswrapper[4702]: I1203 11:15:29.640301 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="da6b4602-6381-43c2-bc7b-df1f2ade0083" containerName="collect-profiles"
Dec 03 11:15:29 crc kubenswrapper[4702]: I1203 11:15:29.640427 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="da6b4602-6381-43c2-bc7b-df1f2ade0083" containerName="collect-profiles"
Dec 03 11:15:29 crc kubenswrapper[4702]: I1203 11:15:29.641586 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9"
Dec 03 11:15:29 crc kubenswrapper[4702]: I1203 11:15:29.646354 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Dec 03 11:15:29 crc kubenswrapper[4702]: I1203 11:15:29.658116 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9"]
Dec 03 11:15:29 crc kubenswrapper[4702]: I1203 11:15:29.815184 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m99x5\" (UniqueName: \"kubernetes.io/projected/d3aa75a4-714d-43ca-9a0f-82bd64ae31cd-kube-api-access-m99x5\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9\" (UID: \"d3aa75a4-714d-43ca-9a0f-82bd64ae31cd\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9"
Dec 03 11:15:29 crc kubenswrapper[4702]: I1203 11:15:29.815673 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3aa75a4-714d-43ca-9a0f-82bd64ae31cd-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9\" (UID: \"d3aa75a4-714d-43ca-9a0f-82bd64ae31cd\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9"
Dec 03 11:15:29 crc kubenswrapper[4702]: I1203 11:15:29.815784 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3aa75a4-714d-43ca-9a0f-82bd64ae31cd-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9\" (UID: \"d3aa75a4-714d-43ca-9a0f-82bd64ae31cd\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9"
Dec 03 11:15:29 crc kubenswrapper[4702]: I1203 11:15:29.917138 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3aa75a4-714d-43ca-9a0f-82bd64ae31cd-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9\" (UID: \"d3aa75a4-714d-43ca-9a0f-82bd64ae31cd\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9"
Dec 03 11:15:29 crc kubenswrapper[4702]: I1203 11:15:29.917222 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m99x5\" (UniqueName: \"kubernetes.io/projected/d3aa75a4-714d-43ca-9a0f-82bd64ae31cd-kube-api-access-m99x5\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9\" (UID: \"d3aa75a4-714d-43ca-9a0f-82bd64ae31cd\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9"
Dec 03 11:15:29 crc kubenswrapper[4702]: I1203 11:15:29.917469 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3aa75a4-714d-43ca-9a0f-82bd64ae31cd-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9\" (UID: \"d3aa75a4-714d-43ca-9a0f-82bd64ae31cd\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9"
Dec 03 11:15:29 crc kubenswrapper[4702]: I1203 11:15:29.917948 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3aa75a4-714d-43ca-9a0f-82bd64ae31cd-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9\" (UID: \"d3aa75a4-714d-43ca-9a0f-82bd64ae31cd\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9"
Dec 03 11:15:29 crc kubenswrapper[4702]: I1203 11:15:29.918074 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3aa75a4-714d-43ca-9a0f-82bd64ae31cd-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9\" (UID: \"d3aa75a4-714d-43ca-9a0f-82bd64ae31cd\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9"
Dec 03 11:15:29 crc kubenswrapper[4702]: I1203 11:15:29.941299 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m99x5\" (UniqueName: \"kubernetes.io/projected/d3aa75a4-714d-43ca-9a0f-82bd64ae31cd-kube-api-access-m99x5\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9\" (UID: \"d3aa75a4-714d-43ca-9a0f-82bd64ae31cd\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9"
Dec 03 11:15:29 crc kubenswrapper[4702]: I1203 11:15:29.970160 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9"
Dec 03 11:15:30 crc kubenswrapper[4702]: I1203 11:15:30.230982 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9"]
Dec 03 11:15:30 crc kubenswrapper[4702]: I1203 11:15:30.362222 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9" event={"ID":"d3aa75a4-714d-43ca-9a0f-82bd64ae31cd","Type":"ContainerStarted","Data":"dc8433487d1e4b25761edb7991ad5010ebb5eacc2a3a822eeafdd6587d146227"}
Dec 03 11:15:31 crc kubenswrapper[4702]: I1203 11:15:31.371854 4702 generic.go:334] "Generic (PLEG): container finished" podID="d3aa75a4-714d-43ca-9a0f-82bd64ae31cd" containerID="0e9c4c8d3dd4b8cc3eb6375afee1b725396ee0da96d18c2a8fcba644efd8c489" exitCode=0
Dec 03 11:15:31 crc kubenswrapper[4702]: I1203 11:15:31.371918 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9" event={"ID":"d3aa75a4-714d-43ca-9a0f-82bd64ae31cd","Type":"ContainerDied","Data":"0e9c4c8d3dd4b8cc3eb6375afee1b725396ee0da96d18c2a8fcba644efd8c489"}
Dec 03 11:15:31 crc kubenswrapper[4702]: I1203 11:15:31.374567 4702 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 03 11:15:31 crc kubenswrapper[4702]: I1203 11:15:31.986550 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zggrh"]
Dec 03 11:15:31 crc kubenswrapper[4702]: I1203 11:15:31.988786 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zggrh"
Dec 03 11:15:32 crc kubenswrapper[4702]: I1203 11:15:32.001548 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zggrh"]
Dec 03 11:15:32 crc kubenswrapper[4702]: I1203 11:15:32.110704 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b502f4ab-a954-46f4-a130-e5205b820f09-catalog-content\") pod \"redhat-operators-zggrh\" (UID: \"b502f4ab-a954-46f4-a130-e5205b820f09\") " pod="openshift-marketplace/redhat-operators-zggrh"
Dec 03 11:15:32 crc kubenswrapper[4702]: I1203 11:15:32.111554 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxw5s\" (UniqueName: \"kubernetes.io/projected/b502f4ab-a954-46f4-a130-e5205b820f09-kube-api-access-qxw5s\") pod \"redhat-operators-zggrh\" (UID: \"b502f4ab-a954-46f4-a130-e5205b820f09\") " pod="openshift-marketplace/redhat-operators-zggrh"
Dec 03 11:15:32 crc kubenswrapper[4702]: I1203 11:15:32.111639 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b502f4ab-a954-46f4-a130-e5205b820f09-utilities\") pod \"redhat-operators-zggrh\" (UID: \"b502f4ab-a954-46f4-a130-e5205b820f09\") " pod="openshift-marketplace/redhat-operators-zggrh"
Dec 03 11:15:32 crc kubenswrapper[4702]: I1203 11:15:32.213332 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b502f4ab-a954-46f4-a130-e5205b820f09-catalog-content\") pod \"redhat-operators-zggrh\" (UID: \"b502f4ab-a954-46f4-a130-e5205b820f09\") " pod="openshift-marketplace/redhat-operators-zggrh"
Dec 03 11:15:32 crc kubenswrapper[4702]: I1203 11:15:32.213450 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxw5s\" (UniqueName: \"kubernetes.io/projected/b502f4ab-a954-46f4-a130-e5205b820f09-kube-api-access-qxw5s\") pod \"redhat-operators-zggrh\" (UID: \"b502f4ab-a954-46f4-a130-e5205b820f09\") " pod="openshift-marketplace/redhat-operators-zggrh"
Dec 03 11:15:32 crc kubenswrapper[4702]: I1203 11:15:32.213513 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b502f4ab-a954-46f4-a130-e5205b820f09-utilities\") pod \"redhat-operators-zggrh\" (UID: \"b502f4ab-a954-46f4-a130-e5205b820f09\") " pod="openshift-marketplace/redhat-operators-zggrh"
Dec 03 11:15:32 crc kubenswrapper[4702]: I1203 11:15:32.214077 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b502f4ab-a954-46f4-a130-e5205b820f09-utilities\") pod \"redhat-operators-zggrh\" (UID: \"b502f4ab-a954-46f4-a130-e5205b820f09\") " pod="openshift-marketplace/redhat-operators-zggrh"
Dec 03 11:15:32 crc kubenswrapper[4702]: I1203 11:15:32.214142 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b502f4ab-a954-46f4-a130-e5205b820f09-catalog-content\") pod \"redhat-operators-zggrh\" (UID: \"b502f4ab-a954-46f4-a130-e5205b820f09\") " pod="openshift-marketplace/redhat-operators-zggrh"
Dec 03 11:15:32 crc kubenswrapper[4702]: I1203 11:15:32.241254 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxw5s\" (UniqueName: \"kubernetes.io/projected/b502f4ab-a954-46f4-a130-e5205b820f09-kube-api-access-qxw5s\") pod \"redhat-operators-zggrh\" (UID: \"b502f4ab-a954-46f4-a130-e5205b820f09\") " pod="openshift-marketplace/redhat-operators-zggrh"
Dec 03 11:15:32 crc kubenswrapper[4702]: I1203 11:15:32.309410 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zggrh"
Dec 03 11:15:32 crc kubenswrapper[4702]: I1203 11:15:32.643306 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zggrh"]
Dec 03 11:15:33 crc kubenswrapper[4702]: E1203 11:15:33.060675 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb502f4ab_a954_46f4_a130_e5205b820f09.slice/crio-b5297805d76d2de1e5f85146cc215292b85a115768ce42f77af3994d5ba32476.scope\": RecentStats: unable to find data in memory cache]"
Dec 03 11:15:33 crc kubenswrapper[4702]: I1203 11:15:33.390350 4702 generic.go:334] "Generic (PLEG): container finished" podID="b502f4ab-a954-46f4-a130-e5205b820f09" containerID="b5297805d76d2de1e5f85146cc215292b85a115768ce42f77af3994d5ba32476" exitCode=0
Dec 03 11:15:33 crc kubenswrapper[4702]: I1203 11:15:33.390439 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zggrh" event={"ID":"b502f4ab-a954-46f4-a130-e5205b820f09","Type":"ContainerDied","Data":"b5297805d76d2de1e5f85146cc215292b85a115768ce42f77af3994d5ba32476"}
Dec 03 11:15:33 crc kubenswrapper[4702]: I1203 11:15:33.390870 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zggrh" event={"ID":"b502f4ab-a954-46f4-a130-e5205b820f09","Type":"ContainerStarted","Data":"72324fef7a60410683165b8fbfdac61ba37cf9ee3a8502db7a4c76a9673963c5"}
Dec 03 11:15:34 crc kubenswrapper[4702]: I1203 11:15:34.411349 4702 generic.go:334] "Generic (PLEG): container finished" podID="d3aa75a4-714d-43ca-9a0f-82bd64ae31cd" containerID="4bc544837271f2dd548c0452c7d8b1c445238aea17bec0b762c098ae990d9dbe" exitCode=0
Dec 03 11:15:34 crc kubenswrapper[4702]: I1203 11:15:34.411438 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9" event={"ID":"d3aa75a4-714d-43ca-9a0f-82bd64ae31cd","Type":"ContainerDied","Data":"4bc544837271f2dd548c0452c7d8b1c445238aea17bec0b762c098ae990d9dbe"}
Dec 03 11:15:35 crc kubenswrapper[4702]: I1203 11:15:35.422749 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zggrh" event={"ID":"b502f4ab-a954-46f4-a130-e5205b820f09","Type":"ContainerStarted","Data":"bd610c868eab0c82d1e5bc3fe4e06cebfca509285f0c1d6361a75f7b4e2203d4"}
Dec 03 11:15:35 crc kubenswrapper[4702]: I1203 11:15:35.427459 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9" event={"ID":"d3aa75a4-714d-43ca-9a0f-82bd64ae31cd","Type":"ContainerStarted","Data":"6a1e12b3710fe1a333a6312a0b74e60eb62e3134f9fd964e11f7ed80c6323e6d"}
Dec 03 11:15:35 crc kubenswrapper[4702]: I1203 11:15:35.500192 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9" podStartSLOduration=4.270944903 podStartE2EDuration="6.500161385s" podCreationTimestamp="2025-12-03 11:15:29 +0000 UTC" firstStartedPulling="2025-12-03 11:15:31.374234737 +0000 UTC m=+715.210163201" lastFinishedPulling="2025-12-03 11:15:33.603451219 +0000 UTC m=+717.439379683" observedRunningTime="2025-12-03 11:15:35.49469631 +0000 UTC m=+719.330624774" watchObservedRunningTime="2025-12-03 11:15:35.500161385 +0000 UTC m=+719.336089849"
Dec 03 11:15:36 crc kubenswrapper[4702]: I1203 11:15:36.440783 4702 generic.go:334] "Generic (PLEG): container finished" podID="d3aa75a4-714d-43ca-9a0f-82bd64ae31cd" containerID="6a1e12b3710fe1a333a6312a0b74e60eb62e3134f9fd964e11f7ed80c6323e6d" exitCode=0
Dec 03 11:15:36 crc kubenswrapper[4702]: I1203 11:15:36.440842 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9" event={"ID":"d3aa75a4-714d-43ca-9a0f-82bd64ae31cd","Type":"ContainerDied","Data":"6a1e12b3710fe1a333a6312a0b74e60eb62e3134f9fd964e11f7ed80c6323e6d"}
Dec 03 11:15:37 crc kubenswrapper[4702]: I1203 11:15:37.450195 4702 generic.go:334] "Generic (PLEG): container finished" podID="b502f4ab-a954-46f4-a130-e5205b820f09" containerID="bd610c868eab0c82d1e5bc3fe4e06cebfca509285f0c1d6361a75f7b4e2203d4" exitCode=0
Dec 03 11:15:37 crc kubenswrapper[4702]: I1203 11:15:37.450323 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zggrh" event={"ID":"b502f4ab-a954-46f4-a130-e5205b820f09","Type":"ContainerDied","Data":"bd610c868eab0c82d1e5bc3fe4e06cebfca509285f0c1d6361a75f7b4e2203d4"}
Dec 03 11:15:37 crc kubenswrapper[4702]: I1203 11:15:37.936436 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9"
Dec 03 11:15:38 crc kubenswrapper[4702]: I1203 11:15:38.028317 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3aa75a4-714d-43ca-9a0f-82bd64ae31cd-bundle\") pod \"d3aa75a4-714d-43ca-9a0f-82bd64ae31cd\" (UID: \"d3aa75a4-714d-43ca-9a0f-82bd64ae31cd\") "
Dec 03 11:15:38 crc kubenswrapper[4702]: I1203 11:15:38.028383 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m99x5\" (UniqueName: \"kubernetes.io/projected/d3aa75a4-714d-43ca-9a0f-82bd64ae31cd-kube-api-access-m99x5\") pod \"d3aa75a4-714d-43ca-9a0f-82bd64ae31cd\" (UID: \"d3aa75a4-714d-43ca-9a0f-82bd64ae31cd\") "
Dec 03 11:15:38 crc kubenswrapper[4702]: I1203 11:15:38.028430 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3aa75a4-714d-43ca-9a0f-82bd64ae31cd-util\") pod \"d3aa75a4-714d-43ca-9a0f-82bd64ae31cd\" (UID: \"d3aa75a4-714d-43ca-9a0f-82bd64ae31cd\") "
Dec 03 11:15:38 crc kubenswrapper[4702]: I1203 11:15:38.030868 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3aa75a4-714d-43ca-9a0f-82bd64ae31cd-bundle" (OuterVolumeSpecName: "bundle") pod "d3aa75a4-714d-43ca-9a0f-82bd64ae31cd" (UID: "d3aa75a4-714d-43ca-9a0f-82bd64ae31cd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 11:15:38 crc kubenswrapper[4702]: I1203 11:15:38.036088 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3aa75a4-714d-43ca-9a0f-82bd64ae31cd-kube-api-access-m99x5" (OuterVolumeSpecName: "kube-api-access-m99x5") pod "d3aa75a4-714d-43ca-9a0f-82bd64ae31cd" (UID: "d3aa75a4-714d-43ca-9a0f-82bd64ae31cd"). InnerVolumeSpecName "kube-api-access-m99x5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:15:38 crc kubenswrapper[4702]: I1203 11:15:38.043331 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3aa75a4-714d-43ca-9a0f-82bd64ae31cd-util" (OuterVolumeSpecName: "util") pod "d3aa75a4-714d-43ca-9a0f-82bd64ae31cd" (UID: "d3aa75a4-714d-43ca-9a0f-82bd64ae31cd"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 11:15:38 crc kubenswrapper[4702]: I1203 11:15:38.130915 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m99x5\" (UniqueName: \"kubernetes.io/projected/d3aa75a4-714d-43ca-9a0f-82bd64ae31cd-kube-api-access-m99x5\") on node \"crc\" DevicePath \"\""
Dec 03 11:15:38 crc kubenswrapper[4702]: I1203 11:15:38.130961 4702 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d3aa75a4-714d-43ca-9a0f-82bd64ae31cd-util\") on node \"crc\" DevicePath \"\""
Dec 03 11:15:38 crc kubenswrapper[4702]: I1203 11:15:38.130972 4702 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d3aa75a4-714d-43ca-9a0f-82bd64ae31cd-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 11:15:38 crc kubenswrapper[4702]: I1203 11:15:38.461908 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9" event={"ID":"d3aa75a4-714d-43ca-9a0f-82bd64ae31cd","Type":"ContainerDied","Data":"dc8433487d1e4b25761edb7991ad5010ebb5eacc2a3a822eeafdd6587d146227"}
Dec 03 11:15:38 crc kubenswrapper[4702]: I1203 11:15:38.462410 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc8433487d1e4b25761edb7991ad5010ebb5eacc2a3a822eeafdd6587d146227"
Dec 03 11:15:38 crc kubenswrapper[4702]: I1203 11:15:38.461993 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9"
Dec 03 11:15:39 crc kubenswrapper[4702]: I1203 11:15:39.479885 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zggrh" event={"ID":"b502f4ab-a954-46f4-a130-e5205b820f09","Type":"ContainerStarted","Data":"d978830cb98bf71459d6e7f9b56311644e201983e9b0201d1e97dfb982c7b84c"}
Dec 03 11:15:39 crc kubenswrapper[4702]: I1203 11:15:39.506610 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zggrh" podStartSLOduration=3.576495298 podStartE2EDuration="8.506580952s" podCreationTimestamp="2025-12-03 11:15:31 +0000 UTC" firstStartedPulling="2025-12-03 11:15:33.392932313 +0000 UTC m=+717.228860777" lastFinishedPulling="2025-12-03 11:15:38.323017967 +0000 UTC m=+722.158946431" observedRunningTime="2025-12-03 11:15:39.501831917 +0000 UTC m=+723.337760381" watchObservedRunningTime="2025-12-03 11:15:39.506580952 +0000 UTC m=+723.342509416"
Dec 03 11:15:40 crc kubenswrapper[4702]: I1203 11:15:40.595366 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mt92m"]
Dec 03 11:15:40 crc kubenswrapper[4702]: I1203 11:15:40.596000 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="ovn-controller" containerID="cri-o://d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647" gracePeriod=30
Dec 03 11:15:40 crc kubenswrapper[4702]: I1203 11:15:40.596122 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="nbdb" containerID="cri-o://eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56" gracePeriod=30
Dec 03 11:15:40 crc kubenswrapper[4702]: I1203 11:15:40.596176 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="ovn-acl-logging" containerID="cri-o://9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76" gracePeriod=30
Dec 03 11:15:40 crc kubenswrapper[4702]: I1203 11:15:40.596224 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="northd" containerID="cri-o://6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed" gracePeriod=30
Dec 03 11:15:40 crc kubenswrapper[4702]: I1203 11:15:40.596122 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7" gracePeriod=30
Dec 03 11:15:40 crc kubenswrapper[4702]: I1203 11:15:40.596417 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="sbdb" containerID="cri-o://f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e" gracePeriod=30
Dec 03 11:15:40 crc kubenswrapper[4702]: I1203 11:15:40.596489 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="kube-rbac-proxy-node" containerID="cri-o://8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06" gracePeriod=30
Dec 03 11:15:40 crc kubenswrapper[4702]: I1203 11:15:40.641629 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="ovnkube-controller" containerID="cri-o://34373fab2cbb28447d05dc7a38a6700cb01fa6cd07c07f1640dcff6d6e2e97f8" gracePeriod=30
Dec 03 11:15:42 crc kubenswrapper[4702]: I1203 11:15:42.309597 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zggrh"
Dec 03 11:15:42 crc kubenswrapper[4702]: I1203 11:15:42.309669 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zggrh"
Dec 03 11:15:43 crc kubenswrapper[4702]: I1203 11:15:43.378546 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zggrh" podUID="b502f4ab-a954-46f4-a130-e5205b820f09" containerName="registry-server" probeResult="failure" output=<
Dec 03 11:15:43 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s
Dec 03 11:15:43 crc kubenswrapper[4702]: >
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.674022 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mt92m_ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3/ovnkube-controller/2.log"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.677892 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mt92m_ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3/ovn-acl-logging/0.log"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.678574 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mt92m_ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3/ovn-controller/0.log"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.679218 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.841375 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-54ljx"]
Dec 03 11:15:47 crc kubenswrapper[4702]: E1203 11:15:47.841780 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="kube-rbac-proxy-ovn-metrics"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.841797 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="kube-rbac-proxy-ovn-metrics"
Dec 03 11:15:47 crc kubenswrapper[4702]: E1203 11:15:47.841811 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="kubecfg-setup"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.841818 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="kubecfg-setup"
Dec 03 11:15:47 crc kubenswrapper[4702]: E1203 11:15:47.841830 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="ovnkube-controller"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.841838 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="ovnkube-controller"
Dec 03 11:15:47 crc kubenswrapper[4702]: E1203 11:15:47.841848 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="nbdb"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.841854 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="nbdb"
Dec 03 11:15:47 crc kubenswrapper[4702]: E1203 11:15:47.841865 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="ovn-controller"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.841871 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="ovn-controller"
Dec 03 11:15:47 crc kubenswrapper[4702]: E1203 11:15:47.841883 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3aa75a4-714d-43ca-9a0f-82bd64ae31cd" containerName="util"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.841889 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3aa75a4-714d-43ca-9a0f-82bd64ae31cd" containerName="util"
Dec 03 11:15:47 crc kubenswrapper[4702]: E1203 11:15:47.841898 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="northd"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.841905 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="northd"
Dec 03 11:15:47 crc kubenswrapper[4702]: E1203 11:15:47.841915 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="kube-rbac-proxy-node"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.841923 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="kube-rbac-proxy-node"
Dec 03 11:15:47 crc kubenswrapper[4702]: E1203 11:15:47.841932 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3aa75a4-714d-43ca-9a0f-82bd64ae31cd" containerName="pull"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.841938 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3aa75a4-714d-43ca-9a0f-82bd64ae31cd" containerName="pull"
Dec 03 11:15:47 crc kubenswrapper[4702]: E1203 11:15:47.841946 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="ovn-acl-logging"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.841954 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="ovn-acl-logging"
Dec 03 11:15:47 crc kubenswrapper[4702]: E1203 11:15:47.841962 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="ovnkube-controller"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.841968 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="ovnkube-controller"
Dec 03 11:15:47 crc kubenswrapper[4702]: E1203 11:15:47.841982 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="sbdb"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.841988 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="sbdb"
Dec 03 11:15:47 crc kubenswrapper[4702]: E1203 11:15:47.841998 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="ovnkube-controller"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.842005 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="ovnkube-controller"
Dec 03 11:15:47 crc kubenswrapper[4702]: E1203 11:15:47.842013 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3aa75a4-714d-43ca-9a0f-82bd64ae31cd" containerName="extract"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.842019 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3aa75a4-714d-43ca-9a0f-82bd64ae31cd" containerName="extract"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.842166 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="nbdb"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.842175 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3aa75a4-714d-43ca-9a0f-82bd64ae31cd" containerName="extract"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.842183 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="ovn-controller"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.842190 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="ovnkube-controller"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.842198 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="ovnkube-controller"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.842208 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="ovnkube-controller"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.842217 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="kube-rbac-proxy-node"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.842227 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="ovn-acl-logging"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.842235 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="northd"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.842243 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="sbdb"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.842250 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="kube-rbac-proxy-ovn-metrics"
Dec 03 11:15:47 crc kubenswrapper[4702]: E1203 11:15:47.842390 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="ovnkube-controller"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.842401 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="ovnkube-controller"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.842528 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerName="ovnkube-controller"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.852452 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-run-openvswitch\") pod \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") "
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.852523 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-kubelet\") pod \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") "
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.852603 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-node-log\") pod \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") "
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.852636 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-etc-openvswitch\") pod \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") "
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.852676 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-run-ovn\") pod \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") "
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.852706 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-log-socket\") pod \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") "
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.852749 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-env-overrides\") pod \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") "
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.852789 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-var-lib-openvswitch\") pod \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") "
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.852811 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88lx4\" (UniqueName: \"kubernetes.io/projected/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-kube-api-access-88lx4\") pod \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") "
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.852844 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-cni-bin\") pod \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") "
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.852887 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-ovn-node-metrics-cert\") pod \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") "
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.852906 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") "
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.852927 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-slash\") pod \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") "
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.852953 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-run-ovn-kubernetes\") pod \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") "
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.852966 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-cni-netd\") pod \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") "
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.852986 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-ovnkube-config\") pod \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") "
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.853011 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-run-netns\") pod \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") "
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.853026 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-run-systemd\") pod \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") "
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.853047 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-systemd-units\") pod \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") "
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.853072 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-ovnkube-script-lib\") pod \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\" (UID: \"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3\") "
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.853909 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" (UID: "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.853964 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" (UID: "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.853989 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" (UID: "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.854008 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" (UID: "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.854033 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-slash" (OuterVolumeSpecName: "host-slash") pod "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" (UID: "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.854055 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" (UID: "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.854075 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" (UID: "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.854942 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-log-socket" (OuterVolumeSpecName: "log-socket") pod "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" (UID: "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.855064 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-node-log" (OuterVolumeSpecName: "node-log") pod "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" (UID: "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.855094 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" (UID: "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.855281 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" (UID: "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.855844 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" (UID: "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.856147 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-54ljx"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.856309 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" (UID: "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.856353 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" (UID: "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.856389 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" (UID: "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.856417 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" (UID: "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.861142 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" (UID: "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.897825 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-kube-api-access-88lx4" (OuterVolumeSpecName: "kube-api-access-88lx4") pod "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" (UID: "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3"). InnerVolumeSpecName "kube-api-access-88lx4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.907083 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" (UID: "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.909556 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" (UID: "ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.955665 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzns6\" (UniqueName: \"kubernetes.io/projected/6ab9886b-f724-43b9-b365-4896d35349b9-kube-api-access-gzns6\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.955735 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-run-ovn\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.955790 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-log-socket\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.955813 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ab9886b-f724-43b9-b365-4896d35349b9-ovnkube-config\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.955858 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-systemd-units\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.955885 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-host-run-netns\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.955903 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-run-openvswitch\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.955921 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-host-cni-netd\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.955941 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-var-lib-openvswitch\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.955962 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-node-log\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.955990 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-host-run-ovn-kubernetes\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956008 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-run-systemd\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956054 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ab9886b-f724-43b9-b365-4896d35349b9-ovnkube-script-lib\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956101 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-host-cni-bin\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956139 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-etc-openvswitch\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956164 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-host-kubelet\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx"
Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956182 4702 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ab9886b-f724-43b9-b365-4896d35349b9-ovn-node-metrics-cert\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956208 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956231 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-host-slash\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956252 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ab9886b-f724-43b9-b365-4896d35349b9-env-overrides\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956301 4702 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-node-log\") on node \"crc\" DevicePath \"\"" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956314 4702 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956327 4702 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956337 4702 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-log-socket\") on node \"crc\" DevicePath \"\"" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956349 4702 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956361 4702 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956373 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88lx4\" (UniqueName: \"kubernetes.io/projected/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-kube-api-access-88lx4\") on node \"crc\" DevicePath \"\"" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956384 4702 reconciler_common.go:293] "Volume detached for 
volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956395 4702 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956406 4702 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956416 4702 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-slash\") on node \"crc\" DevicePath \"\"" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956426 4702 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956438 4702 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956449 4702 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956461 4702 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956471 4702 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956480 4702 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956490 4702 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956498 4702 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 11:15:47 crc kubenswrapper[4702]: I1203 11:15:47.956509 4702 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.058143 4702 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-etc-openvswitch\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.058600 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-host-kubelet\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.058679 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ab9886b-f724-43b9-b365-4896d35349b9-ovn-node-metrics-cert\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.058785 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.058942 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-host-slash\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.059022 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.059051 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-host-slash\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.058343 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-etc-openvswitch\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.058689 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-host-kubelet\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.059065 4702 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ab9886b-f724-43b9-b365-4896d35349b9-env-overrides\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.059345 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzns6\" (UniqueName: \"kubernetes.io/projected/6ab9886b-f724-43b9-b365-4896d35349b9-kube-api-access-gzns6\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.059437 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-run-ovn\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.059507 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-run-ovn\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.059532 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-log-socket\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.059641 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-log-socket\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.059727 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ab9886b-f724-43b9-b365-4896d35349b9-ovnkube-config\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.059841 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-systemd-units\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.059924 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-host-run-netns\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.060026 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-run-openvswitch\") pod 
\"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.060101 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-host-run-netns\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.060073 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ab9886b-f724-43b9-b365-4896d35349b9-env-overrides\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.060113 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-run-openvswitch\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.060042 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-systemd-units\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.060120 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-host-cni-netd\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.060317 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-var-lib-openvswitch\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.060382 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-node-log\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.060400 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-host-run-ovn-kubernetes\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.060419 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ab9886b-f724-43b9-b365-4896d35349b9-ovnkube-config\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 
03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.060448 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-run-systemd\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.060472 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ab9886b-f724-43b9-b365-4896d35349b9-ovnkube-script-lib\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.060486 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-node-log\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.060546 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-var-lib-openvswitch\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.060583 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-run-systemd\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.060730 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-host-cni-bin\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.060892 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-host-cni-netd\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.060954 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-host-cni-bin\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.060906 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ab9886b-f724-43b9-b365-4896d35349b9-host-run-ovn-kubernetes\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.061391 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ab9886b-f724-43b9-b365-4896d35349b9-ovnkube-script-lib\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.073743 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ab9886b-f724-43b9-b365-4896d35349b9-ovn-node-metrics-cert\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.098714 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzns6\" (UniqueName: \"kubernetes.io/projected/6ab9886b-f724-43b9-b365-4896d35349b9-kube-api-access-gzns6\") pod \"ovnkube-node-54ljx\" (UID: \"6ab9886b-f724-43b9-b365-4896d35349b9\") " pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.111953 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk"] Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.113550 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.121583 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-jk7vg" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.125026 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.125660 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.191563 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd"] Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.192964 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.201285 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.201551 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-s6xr4" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.218733 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k"] Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.219832 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.252555 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.264686 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cff88\" (UniqueName: \"kubernetes.io/projected/db361c90-107c-4510-9683-659b755ebc42-kube-api-access-cff88\") pod \"obo-prometheus-operator-668cf9dfbb-h9rvk\" (UID: \"db361c90-107c-4510-9683-659b755ebc42\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.369145 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5e5ed31e-a6b7-494f-ae2e-8c824e717092-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k\" (UID: \"5e5ed31e-a6b7-494f-ae2e-8c824e717092\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.369822 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a45f4fed-ae1f-4d04-8394-8208bbe31b44-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd\" (UID: \"a45f4fed-ae1f-4d04-8394-8208bbe31b44\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.369905 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5e5ed31e-a6b7-494f-ae2e-8c824e717092-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k\" (UID: \"5e5ed31e-a6b7-494f-ae2e-8c824e717092\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.370001 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cff88\" (UniqueName: \"kubernetes.io/projected/db361c90-107c-4510-9683-659b755ebc42-kube-api-access-cff88\") pod \"obo-prometheus-operator-668cf9dfbb-h9rvk\" (UID: \"db361c90-107c-4510-9683-659b755ebc42\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.370048 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a45f4fed-ae1f-4d04-8394-8208bbe31b44-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd\" (UID: \"a45f4fed-ae1f-4d04-8394-8208bbe31b44\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.399128 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cff88\" (UniqueName: \"kubernetes.io/projected/db361c90-107c-4510-9683-659b755ebc42-kube-api-access-cff88\") pod \"obo-prometheus-operator-668cf9dfbb-h9rvk\" (UID: \"db361c90-107c-4510-9683-659b755ebc42\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.438450 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.471530 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a45f4fed-ae1f-4d04-8394-8208bbe31b44-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd\" (UID: \"a45f4fed-ae1f-4d04-8394-8208bbe31b44\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.471595 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5e5ed31e-a6b7-494f-ae2e-8c824e717092-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k\" (UID: \"5e5ed31e-a6b7-494f-ae2e-8c824e717092\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.471663 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a45f4fed-ae1f-4d04-8394-8208bbe31b44-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd\" (UID: \"a45f4fed-ae1f-4d04-8394-8208bbe31b44\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.471708 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5e5ed31e-a6b7-494f-ae2e-8c824e717092-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k\" (UID: \"5e5ed31e-a6b7-494f-ae2e-8c824e717092\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.479061 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a45f4fed-ae1f-4d04-8394-8208bbe31b44-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd\" (UID: \"a45f4fed-ae1f-4d04-8394-8208bbe31b44\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.481280 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5e5ed31e-a6b7-494f-ae2e-8c824e717092-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k\" (UID: \"5e5ed31e-a6b7-494f-ae2e-8c824e717092\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.483413 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a45f4fed-ae1f-4d04-8394-8208bbe31b44-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd\" (UID: \"a45f4fed-ae1f-4d04-8394-8208bbe31b44\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.486161 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5e5ed31e-a6b7-494f-ae2e-8c824e717092-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k\" (UID: \"5e5ed31e-a6b7-494f-ae2e-8c824e717092\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.518390 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mt92m_ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3/ovnkube-controller/2.log" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.519350 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.524039 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mt92m_ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3/ovn-acl-logging/0.log" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.524657 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mt92m_ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3/ovn-controller/0.log" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.525029 4702 generic.go:334] "Generic (PLEG): container finished" podID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerID="34373fab2cbb28447d05dc7a38a6700cb01fa6cd07c07f1640dcff6d6e2e97f8" exitCode=0 Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.525095 4702 generic.go:334] "Generic (PLEG): container finished" podID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerID="9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76" exitCode=143 Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.525105 4702 generic.go:334] "Generic (PLEG): container finished" podID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" containerID="d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647" exitCode=143 Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.525141 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerDied","Data":"34373fab2cbb28447d05dc7a38a6700cb01fa6cd07c07f1640dcff6d6e2e97f8"} Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.525192 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerDied","Data":"9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76"} Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.525202 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerDied","Data":"d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647"} Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.525225 4702 scope.go:117] "RemoveContainer" containerID="34373fab2cbb28447d05dc7a38a6700cb01fa6cd07c07f1640dcff6d6e2e97f8" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.541296 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k" Dec 03 11:15:48 crc kubenswrapper[4702]: E1203 11:15:48.627383 4702 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-h9rvk_openshift-operators_db361c90-107c-4510-9683-659b755ebc42_0(211f90624ba306757621bf862e473177626936bc0f19e233e9b89a25971716a3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 11:15:48 crc kubenswrapper[4702]: E1203 11:15:48.627553 4702 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-h9rvk_openshift-operators_db361c90-107c-4510-9683-659b755ebc42_0(211f90624ba306757621bf862e473177626936bc0f19e233e9b89a25971716a3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk" Dec 03 11:15:48 crc kubenswrapper[4702]: E1203 11:15:48.627591 4702 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-h9rvk_openshift-operators_db361c90-107c-4510-9683-659b755ebc42_0(211f90624ba306757621bf862e473177626936bc0f19e233e9b89a25971716a3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk" Dec 03 11:15:48 crc kubenswrapper[4702]: E1203 11:15:48.627669 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-h9rvk_openshift-operators(db361c90-107c-4510-9683-659b755ebc42)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-h9rvk_openshift-operators(db361c90-107c-4510-9683-659b755ebc42)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-h9rvk_openshift-operators_db361c90-107c-4510-9683-659b755ebc42_0(211f90624ba306757621bf862e473177626936bc0f19e233e9b89a25971716a3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk" podUID="db361c90-107c-4510-9683-659b755ebc42" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.653449 4702 scope.go:117] "RemoveContainer" containerID="3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049" Dec 03 11:15:48 crc kubenswrapper[4702]: E1203 11:15:48.689783 4702 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd_openshift-operators_a45f4fed-ae1f-4d04-8394-8208bbe31b44_0(9976edcdabf4f1e4878fb076572414a5db8e706ea3a92a7ded819da718ac0dd8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 03 11:15:48 crc kubenswrapper[4702]: E1203 11:15:48.689903 4702 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd_openshift-operators_a45f4fed-ae1f-4d04-8394-8208bbe31b44_0(9976edcdabf4f1e4878fb076572414a5db8e706ea3a92a7ded819da718ac0dd8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd" Dec 03 11:15:48 crc kubenswrapper[4702]: E1203 11:15:48.689945 4702 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd_openshift-operators_a45f4fed-ae1f-4d04-8394-8208bbe31b44_0(9976edcdabf4f1e4878fb076572414a5db8e706ea3a92a7ded819da718ac0dd8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd" Dec 03 11:15:48 crc kubenswrapper[4702]: E1203 11:15:48.690022 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd_openshift-operators(a45f4fed-ae1f-4d04-8394-8208bbe31b44)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd_openshift-operators(a45f4fed-ae1f-4d04-8394-8208bbe31b44)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd_openshift-operators_a45f4fed-ae1f-4d04-8394-8208bbe31b44_0(9976edcdabf4f1e4878fb076572414a5db8e706ea3a92a7ded819da718ac0dd8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd" podUID="a45f4fed-ae1f-4d04-8394-8208bbe31b44" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.704110 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-55ql9"] Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.713514 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-55ql9" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.717697 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-96st4" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.717783 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 03 11:15:48 crc kubenswrapper[4702]: E1203 11:15:48.746102 4702 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k_openshift-operators_5e5ed31e-a6b7-494f-ae2e-8c824e717092_0(efa0f4f542ba2962fe540df5eab579b0561f2a588b85539b8725756517468c48): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Dec 03 11:15:48 crc kubenswrapper[4702]: E1203 11:15:48.746278 4702 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k_openshift-operators_5e5ed31e-a6b7-494f-ae2e-8c824e717092_0(efa0f4f542ba2962fe540df5eab579b0561f2a588b85539b8725756517468c48): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k" Dec 03 11:15:48 crc kubenswrapper[4702]: E1203 11:15:48.746342 4702 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k_openshift-operators_5e5ed31e-a6b7-494f-ae2e-8c824e717092_0(efa0f4f542ba2962fe540df5eab579b0561f2a588b85539b8725756517468c48): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k" Dec 03 11:15:48 crc kubenswrapper[4702]: E1203 11:15:48.746470 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k_openshift-operators(5e5ed31e-a6b7-494f-ae2e-8c824e717092)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k_openshift-operators(5e5ed31e-a6b7-494f-ae2e-8c824e717092)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k_openshift-operators_5e5ed31e-a6b7-494f-ae2e-8c824e717092_0(efa0f4f542ba2962fe540df5eab579b0561f2a588b85539b8725756517468c48): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k" podUID="5e5ed31e-a6b7-494f-ae2e-8c824e717092" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.782563 4702 scope.go:117] "RemoveContainer" containerID="f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.806366 4702 scope.go:117] "RemoveContainer" containerID="eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.825927 4702 scope.go:117] "RemoveContainer" containerID="6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.850257 4702 scope.go:117] "RemoveContainer" containerID="c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7" Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.851436 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-vpr8z"] Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.852523 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-vpr8z"
Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.857307 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-gzcbm"
Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.877930 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncvtv\" (UniqueName: \"kubernetes.io/projected/89d80ae9-23a4-4c91-a04d-7343d8a4df05-kube-api-access-ncvtv\") pod \"observability-operator-d8bb48f5d-55ql9\" (UID: \"89d80ae9-23a4-4c91-a04d-7343d8a4df05\") " pod="openshift-operators/observability-operator-d8bb48f5d-55ql9"
Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.878021 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/89d80ae9-23a4-4c91-a04d-7343d8a4df05-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-55ql9\" (UID: \"89d80ae9-23a4-4c91-a04d-7343d8a4df05\") " pod="openshift-operators/observability-operator-d8bb48f5d-55ql9"
Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.885725 4702 scope.go:117] "RemoveContainer" containerID="8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06"
Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.909335 4702 scope.go:117] "RemoveContainer" containerID="9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76"
Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.929572 4702 scope.go:117] "RemoveContainer" containerID="d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647"
Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.959049 4702 scope.go:117] "RemoveContainer" containerID="3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b"
Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.981042 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hf6h\" (UniqueName: \"kubernetes.io/projected/ec0726c3-58ef-4a22-8e00-bae32d7d66ca-kube-api-access-5hf6h\") pod \"perses-operator-5446b9c989-vpr8z\" (UID: \"ec0726c3-58ef-4a22-8e00-bae32d7d66ca\") " pod="openshift-operators/perses-operator-5446b9c989-vpr8z"
Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.981135 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ec0726c3-58ef-4a22-8e00-bae32d7d66ca-openshift-service-ca\") pod \"perses-operator-5446b9c989-vpr8z\" (UID: \"ec0726c3-58ef-4a22-8e00-bae32d7d66ca\") " pod="openshift-operators/perses-operator-5446b9c989-vpr8z"
Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.981198 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncvtv\" (UniqueName: \"kubernetes.io/projected/89d80ae9-23a4-4c91-a04d-7343d8a4df05-kube-api-access-ncvtv\") pod \"observability-operator-d8bb48f5d-55ql9\" (UID: \"89d80ae9-23a4-4c91-a04d-7343d8a4df05\") " pod="openshift-operators/observability-operator-d8bb48f5d-55ql9"
Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.981287 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/89d80ae9-23a4-4c91-a04d-7343d8a4df05-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-55ql9\" (UID: \"89d80ae9-23a4-4c91-a04d-7343d8a4df05\") " pod="openshift-operators/observability-operator-d8bb48f5d-55ql9"
Dec 03 11:15:48 crc kubenswrapper[4702]: I1203 11:15:48.994548 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/89d80ae9-23a4-4c91-a04d-7343d8a4df05-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-55ql9\" (UID: \"89d80ae9-23a4-4c91-a04d-7343d8a4df05\") " pod="openshift-operators/observability-operator-d8bb48f5d-55ql9"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.011459 4702 scope.go:117] "RemoveContainer" containerID="34373fab2cbb28447d05dc7a38a6700cb01fa6cd07c07f1640dcff6d6e2e97f8"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.013680 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncvtv\" (UniqueName: \"kubernetes.io/projected/89d80ae9-23a4-4c91-a04d-7343d8a4df05-kube-api-access-ncvtv\") pod \"observability-operator-d8bb48f5d-55ql9\" (UID: \"89d80ae9-23a4-4c91-a04d-7343d8a4df05\") " pod="openshift-operators/observability-operator-d8bb48f5d-55ql9"
Dec 03 11:15:49 crc kubenswrapper[4702]: E1203 11:15:49.018023 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34373fab2cbb28447d05dc7a38a6700cb01fa6cd07c07f1640dcff6d6e2e97f8\": container with ID starting with 34373fab2cbb28447d05dc7a38a6700cb01fa6cd07c07f1640dcff6d6e2e97f8 not found: ID does not exist" containerID="34373fab2cbb28447d05dc7a38a6700cb01fa6cd07c07f1640dcff6d6e2e97f8"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.018092 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34373fab2cbb28447d05dc7a38a6700cb01fa6cd07c07f1640dcff6d6e2e97f8"} err="failed to get container status \"34373fab2cbb28447d05dc7a38a6700cb01fa6cd07c07f1640dcff6d6e2e97f8\": rpc error: code = NotFound desc = could not find container \"34373fab2cbb28447d05dc7a38a6700cb01fa6cd07c07f1640dcff6d6e2e97f8\": container with ID starting with 34373fab2cbb28447d05dc7a38a6700cb01fa6cd07c07f1640dcff6d6e2e97f8 not found: ID does not exist"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.018145 4702 scope.go:117] "RemoveContainer" containerID="3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049"
Dec 03 11:15:49 crc kubenswrapper[4702]: E1203 11:15:49.024034 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049\": container with ID starting with 3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049 not found: ID does not exist" containerID="3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.024124 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049"} err="failed to get container status \"3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049\": rpc error: code = NotFound desc = could not find container \"3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049\": container with ID starting with 3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049 not found: ID does not exist"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.024171 4702 scope.go:117] "RemoveContainer" containerID="f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e"
Dec 03 11:15:49 crc kubenswrapper[4702]: E1203 11:15:49.025669 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\": container with ID starting with f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e not found: ID does not exist" containerID="f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.025708 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e"} err="failed to get container status \"f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\": rpc error: code = NotFound desc = could not find container \"f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\": container with ID starting with f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e not found: ID does not exist"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.025741 4702 scope.go:117] "RemoveContainer" containerID="eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56"
Dec 03 11:15:49 crc kubenswrapper[4702]: E1203 11:15:49.032230 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\": container with ID starting with eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56 not found: ID does not exist" containerID="eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.032306 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56"} err="failed to get container status \"eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\": rpc error: code = NotFound desc = could not find container \"eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\": container with ID starting with eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56 not found: ID does not exist"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.032361 4702 scope.go:117] "RemoveContainer" containerID="6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed"
Dec 03 11:15:49 crc kubenswrapper[4702]: E1203 11:15:49.037026 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\": container with ID starting with 6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed not found: ID does not exist" containerID="6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.037104 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed"} err="failed to get container status \"6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\": rpc error: code = NotFound desc = could not find container \"6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\": container with ID starting with 6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed not found: ID does not exist"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.037150 4702 scope.go:117] "RemoveContainer" containerID="c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7"
Dec 03 11:15:49 crc kubenswrapper[4702]: E1203 11:15:49.044023 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\": container with ID starting with c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7 not found: ID does not exist" containerID="c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.044098 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7"} err="failed to get container status \"c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\": rpc error: code = NotFound desc = could not find container \"c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\": container with ID starting with c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7 not found: ID does not exist"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.044149 4702 scope.go:117] "RemoveContainer" containerID="8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06"
Dec 03 11:15:49 crc kubenswrapper[4702]: E1203 11:15:49.050051 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\": container with ID starting with 8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06 not found: ID does not exist" containerID="8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.050152 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06"} err="failed to get container status \"8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\": rpc error: code = NotFound desc = could not find container \"8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\": container with ID starting with 8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06 not found: ID does not exist"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.050198 4702 scope.go:117] "RemoveContainer" containerID="9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76"
Dec 03 11:15:49 crc kubenswrapper[4702]: E1203 11:15:49.058137 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\": container with ID starting with 9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76 not found: ID does not exist" containerID="9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.058220 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76"} err="failed to get container status \"9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\": rpc error: code = NotFound desc = could not find container \"9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\": container with ID starting with 9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76 not found: ID does not exist"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.058285 4702 scope.go:117] "RemoveContainer" containerID="d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647"
Dec 03 11:15:49 crc kubenswrapper[4702]: E1203 11:15:49.070183 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\": container with ID starting with d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647 not found: ID does not exist" containerID="d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.070253 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647"} err="failed to get container status \"d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\": rpc error: code = NotFound desc = could not find container \"d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\": container with ID starting with d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647 not found: ID does not exist"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.070294 4702 scope.go:117] "RemoveContainer" containerID="3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b"
Dec 03 11:15:49 crc kubenswrapper[4702]: E1203 11:15:49.078995 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\": container with ID starting with 3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b not found: ID does not exist" containerID="3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.079063 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b"} err="failed to get container status \"3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\": rpc error: code = NotFound desc = could not find container \"3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\": container with ID starting with 3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b not found: ID does not exist"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.079115 4702 scope.go:117] "RemoveContainer" containerID="34373fab2cbb28447d05dc7a38a6700cb01fa6cd07c07f1640dcff6d6e2e97f8"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.085151 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hf6h\" (UniqueName: \"kubernetes.io/projected/ec0726c3-58ef-4a22-8e00-bae32d7d66ca-kube-api-access-5hf6h\") pod \"perses-operator-5446b9c989-vpr8z\" (UID: \"ec0726c3-58ef-4a22-8e00-bae32d7d66ca\") " pod="openshift-operators/perses-operator-5446b9c989-vpr8z"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.085251 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ec0726c3-58ef-4a22-8e00-bae32d7d66ca-openshift-service-ca\") pod \"perses-operator-5446b9c989-vpr8z\" (UID: \"ec0726c3-58ef-4a22-8e00-bae32d7d66ca\") " pod="openshift-operators/perses-operator-5446b9c989-vpr8z"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.086329 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/ec0726c3-58ef-4a22-8e00-bae32d7d66ca-openshift-service-ca\") pod \"perses-operator-5446b9c989-vpr8z\" (UID: \"ec0726c3-58ef-4a22-8e00-bae32d7d66ca\") " pod="openshift-operators/perses-operator-5446b9c989-vpr8z"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.086981 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34373fab2cbb28447d05dc7a38a6700cb01fa6cd07c07f1640dcff6d6e2e97f8"} err="failed to get container status \"34373fab2cbb28447d05dc7a38a6700cb01fa6cd07c07f1640dcff6d6e2e97f8\": rpc error: code = NotFound desc = could not find container \"34373fab2cbb28447d05dc7a38a6700cb01fa6cd07c07f1640dcff6d6e2e97f8\": container with ID starting with 34373fab2cbb28447d05dc7a38a6700cb01fa6cd07c07f1640dcff6d6e2e97f8 not found: ID does not exist"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.087051 4702 scope.go:117] "RemoveContainer" containerID="3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.088342 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049"} err="failed to get container status \"3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049\": rpc error: code = NotFound desc = could not find container \"3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049\": container with ID starting with 3ff7791b972c3d04a7ace965c292a533b98f478047e0fc61de8edf0553911049 not found: ID does not exist"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.088386 4702 scope.go:117] "RemoveContainer" containerID="f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.090407 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e"} err="failed to get container status \"f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\": rpc error: code = NotFound desc = could not find container \"f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e\": container with ID starting with f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e not found: ID does not exist"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.090465 4702 scope.go:117] "RemoveContainer" containerID="eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.093120 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56"} err="failed to get container status \"eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\": rpc error: code = NotFound desc = could not find container \"eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56\": container with ID starting with eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56 not found: ID does not exist"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.093146 4702 scope.go:117] "RemoveContainer" containerID="6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.093357 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed"} err="failed to get container status \"6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\": rpc error: code = NotFound desc = could not find container \"6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed\": container with ID starting with 6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed not found: ID does not exist"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.093375 4702 scope.go:117] "RemoveContainer" containerID="c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.097090 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7"} err="failed to get container status \"c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\": rpc error: code = NotFound desc = could not find container \"c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7\": container with ID starting with c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7 not found: ID does not exist"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.097116 4702 scope.go:117] "RemoveContainer" containerID="8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.097308 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06"} err="failed to get container status \"8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\": rpc error: code = NotFound desc = could not find container \"8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06\": container with ID starting with 8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06 not found: ID does not exist"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.097324 4702 scope.go:117] "RemoveContainer" containerID="9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.097495 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76"} err="failed to get container status \"9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\": rpc error: code = NotFound desc = could not find container \"9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76\": container with ID starting with 9fa7d67c01ccc85192c95ea3a1b0b7aa6955c2eacb91f0b9136cc76dc5162e76 not found: ID does not exist"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.097508 4702 scope.go:117] "RemoveContainer" containerID="d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.097668 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647"} err="failed to get container status \"d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\": rpc error: code = NotFound desc = could not find container \"d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647\": container with ID starting with d9b8a465547bbfd4558a84900ed3b5d2a8505133a34ce22cd8b0279b33b81647 not found: ID does not exist"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.097685 4702 scope.go:117] "RemoveContainer" containerID="3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.097902 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b"} err="failed to get container status \"3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\": rpc error: code = NotFound desc = could not find container \"3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b\": container with ID starting with 3aa472a8bc76aef399306a0301dcf691176958ec13273245bc9e05c2011ee44b not found: ID does not exist"
pod="openshift-operators/observability-operator-d8bb48f5d-55ql9" Dec 03 11:15:49 crc kubenswrapper[4702]: E1203 11:15:49.177983 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-55ql9_openshift-operators(89d80ae9-23a4-4c91-a04d-7343d8a4df05)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-55ql9_openshift-operators(89d80ae9-23a4-4c91-a04d-7343d8a4df05)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-55ql9_openshift-operators_89d80ae9-23a4-4c91-a04d-7343d8a4df05_0(f3d590838b069cb5712718af85eb43f9d90bbe7c225bd1c057a887dd76689253): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-55ql9" podUID="89d80ae9-23a4-4c91-a04d-7343d8a4df05" Dec 03 11:15:49 crc kubenswrapper[4702]: E1203 11:15:49.209008 4702 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-vpr8z_openshift-operators_ec0726c3-58ef-4a22-8e00-bae32d7d66ca_0(addd7da46a97ef7fc89fef3a97c730943f61202b236c5184b08b913565126460): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 11:15:49 crc kubenswrapper[4702]: E1203 11:15:49.209113 4702 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-vpr8z_openshift-operators_ec0726c3-58ef-4a22-8e00-bae32d7d66ca_0(addd7da46a97ef7fc89fef3a97c730943f61202b236c5184b08b913565126460): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" Dec 03 11:15:49 crc kubenswrapper[4702]: E1203 11:15:49.209143 4702 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-vpr8z_openshift-operators_ec0726c3-58ef-4a22-8e00-bae32d7d66ca_0(addd7da46a97ef7fc89fef3a97c730943f61202b236c5184b08b913565126460): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" Dec 03 11:15:49 crc kubenswrapper[4702]: E1203 11:15:49.209203 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-vpr8z_openshift-operators(ec0726c3-58ef-4a22-8e00-bae32d7d66ca)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-vpr8z_openshift-operators(ec0726c3-58ef-4a22-8e00-bae32d7d66ca)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-vpr8z_openshift-operators_ec0726c3-58ef-4a22-8e00-bae32d7d66ca_0(addd7da46a97ef7fc89fef3a97c730943f61202b236c5184b08b913565126460): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.534887 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pqn7q_72be0494-b56e-4d46-8300-decd11c66d66/kube-multus/1.log"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.535410 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pqn7q_72be0494-b56e-4d46-8300-decd11c66d66/kube-multus/0.log"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.535491 4702 generic.go:334] "Generic (PLEG): container finished" podID="72be0494-b56e-4d46-8300-decd11c66d66" containerID="e471d9a027dca8cb83cc35128b6bfbae033ceaedc00e539a0fd45d154da78c4c" exitCode=2
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.535605 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pqn7q" event={"ID":"72be0494-b56e-4d46-8300-decd11c66d66","Type":"ContainerDied","Data":"e471d9a027dca8cb83cc35128b6bfbae033ceaedc00e539a0fd45d154da78c4c"}
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.535726 4702 scope.go:117] "RemoveContainer" containerID="4daa6a931d589cf0a58e2601b19762972dfa4d22ca3d850679bdb26bd6d5ab0e"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.536740 4702 scope.go:117] "RemoveContainer" containerID="e471d9a027dca8cb83cc35128b6bfbae033ceaedc00e539a0fd45d154da78c4c"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.537838 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerDied","Data":"f9759f5309bc3bdb8cb80a3adc9d19c61f984dd09d834a67393df84828763e9e"}
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.537877 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m"
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.537883 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerDied","Data":"eb7bf1d4e498b4a41cc89c3b656a2db18260ac8b529019a576f3fa4b16895e56"}
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.537977 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerDied","Data":"6fc3a85f58b445449645af460869bb24a2a53fde1f2d0202194e298fc41628ed"}
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.537991 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerDied","Data":"c2b5f22cde3f909a47b20f60e6df1ff3fff374d1bd55420db05b9b85a90491f7"}
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.538007 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerDied","Data":"8b5203beff5bca8b1f54322bbc2c7265174c9301e9bb8061d842e4ecfb22ac06"}
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.538027 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mt92m" event={"ID":"ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3","Type":"ContainerDied","Data":"00be1d4251a12dcc580afeaeedac98ebbb9c6320647fe866bcb421f86823bccd"}
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.540465 4702 generic.go:334] "Generic (PLEG): container finished" podID="6ab9886b-f724-43b9-b365-4896d35349b9" containerID="c6c602caf021c63cf20f2b8f3ffcc92e6e8d1b7c5b54f719df968206d86fe419" exitCode=0
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.540499 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" event={"ID":"6ab9886b-f724-43b9-b365-4896d35349b9","Type":"ContainerDied","Data":"c6c602caf021c63cf20f2b8f3ffcc92e6e8d1b7c5b54f719df968206d86fe419"}
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.540518 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" event={"ID":"6ab9886b-f724-43b9-b365-4896d35349b9","Type":"ContainerStarted","Data":"cf2e4a50f15b6e11cee88ebc6bbd70a27179838d3ada00c10fa381ebc08b6add"}
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.699395 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mt92m"]
Dec 03 11:15:49 crc kubenswrapper[4702]: I1203 11:15:49.712739 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mt92m"]
Dec 03 11:15:50 crc kubenswrapper[4702]: I1203 11:15:50.557298 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" event={"ID":"6ab9886b-f724-43b9-b365-4896d35349b9","Type":"ContainerStarted","Data":"dd2c27bd601dc3fe16b943bcdf432191bdec91d6acf679e6a3a7d64d38275e9a"}
Dec 03 11:15:50 crc kubenswrapper[4702]: I1203 11:15:50.557367 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" event={"ID":"6ab9886b-f724-43b9-b365-4896d35349b9","Type":"ContainerStarted","Data":"be93a1b848cf44f8c0fde4b9fa46ca26096d20814258acd2943e6ff8c29e768f"}
Dec 03 11:15:50 crc kubenswrapper[4702]: I1203 11:15:50.560215 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pqn7q_72be0494-b56e-4d46-8300-decd11c66d66/kube-multus/1.log"
Dec 03 11:15:50 crc kubenswrapper[4702]: I1203 11:15:50.560478 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pqn7q" event={"ID":"72be0494-b56e-4d46-8300-decd11c66d66","Type":"ContainerStarted","Data":"fe2257c1ae3f4653c25c6c52b04b2c2514a9b93aced3dbeffa39cbe6503c0973"}
Dec 03 11:15:51 crc kubenswrapper[4702]: I1203 11:15:51.024421 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3" path="/var/lib/kubelet/pods/ffa7620d-1ec2-4a53-ad2e-df64bb9aeac3/volumes"
Dec 03 11:15:51 crc kubenswrapper[4702]: I1203 11:15:51.636373 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" event={"ID":"6ab9886b-f724-43b9-b365-4896d35349b9","Type":"ContainerStarted","Data":"4cd378fde0eba089c3b89f9e872ac9306a66205dc388291f201283d53c020d0a"}
Dec 03 11:15:51 crc kubenswrapper[4702]: I1203 11:15:51.636551 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" event={"ID":"6ab9886b-f724-43b9-b365-4896d35349b9","Type":"ContainerStarted","Data":"d417ffbf91d8a082bc90bd84d6dbfafafedd03a89e40b428fcc2d31043628b13"}
Dec 03 11:15:52 crc kubenswrapper[4702]: I1203 11:15:52.366164 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zggrh"
Dec 03 11:15:52 crc kubenswrapper[4702]: I1203 11:15:52.430950 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zggrh"
Dec 03 11:15:52 crc kubenswrapper[4702]: I1203 11:15:52.649460 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zggrh"]
Dec 03 11:15:52 crc kubenswrapper[4702]: I1203 11:15:52.653836 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" event={"ID":"6ab9886b-f724-43b9-b365-4896d35349b9","Type":"ContainerStarted","Data":"45e062693d2ca092f2493032bf0e5eafd85d9023ef2dac1d5dbe31f4dda952cb"}
Dec 03 11:15:52 crc kubenswrapper[4702]: I1203 11:15:52.653897 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" event={"ID":"6ab9886b-f724-43b9-b365-4896d35349b9","Type":"ContainerStarted","Data":"dcc0541f5c3934c4d5617a7e9c7e8733dcf559ccfbbcf900b84f1ad4c02e721c"}
Dec 03 11:15:53 crc kubenswrapper[4702]: I1203 11:15:53.661124 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zggrh" podUID="b502f4ab-a954-46f4-a130-e5205b820f09" containerName="registry-server" containerID="cri-o://d978830cb98bf71459d6e7f9b56311644e201983e9b0201d1e97dfb982c7b84c" gracePeriod=2
Dec 03 11:15:53 crc kubenswrapper[4702]: I1203 11:15:53.917314 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zggrh"
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.058258 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b502f4ab-a954-46f4-a130-e5205b820f09-utilities\") pod \"b502f4ab-a954-46f4-a130-e5205b820f09\" (UID: \"b502f4ab-a954-46f4-a130-e5205b820f09\") "
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.058365 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b502f4ab-a954-46f4-a130-e5205b820f09-catalog-content\") pod \"b502f4ab-a954-46f4-a130-e5205b820f09\" (UID: \"b502f4ab-a954-46f4-a130-e5205b820f09\") "
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.058451 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxw5s\" (UniqueName: \"kubernetes.io/projected/b502f4ab-a954-46f4-a130-e5205b820f09-kube-api-access-qxw5s\") pod \"b502f4ab-a954-46f4-a130-e5205b820f09\" (UID: \"b502f4ab-a954-46f4-a130-e5205b820f09\") "
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.060065 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b502f4ab-a954-46f4-a130-e5205b820f09-utilities" (OuterVolumeSpecName: "utilities") pod "b502f4ab-a954-46f4-a130-e5205b820f09" (UID: "b502f4ab-a954-46f4-a130-e5205b820f09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.068936 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b502f4ab-a954-46f4-a130-e5205b820f09-kube-api-access-qxw5s" (OuterVolumeSpecName: "kube-api-access-qxw5s") pod "b502f4ab-a954-46f4-a130-e5205b820f09" (UID: "b502f4ab-a954-46f4-a130-e5205b820f09"). InnerVolumeSpecName "kube-api-access-qxw5s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.161193 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b502f4ab-a954-46f4-a130-e5205b820f09-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.161277 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxw5s\" (UniqueName: \"kubernetes.io/projected/b502f4ab-a954-46f4-a130-e5205b820f09-kube-api-access-qxw5s\") on node \"crc\" DevicePath \"\""
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.175073 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b502f4ab-a954-46f4-a130-e5205b820f09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b502f4ab-a954-46f4-a130-e5205b820f09" (UID: "b502f4ab-a954-46f4-a130-e5205b820f09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.263095 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b502f4ab-a954-46f4-a130-e5205b820f09-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.674169 4702 generic.go:334] "Generic (PLEG): container finished" podID="b502f4ab-a954-46f4-a130-e5205b820f09" containerID="d978830cb98bf71459d6e7f9b56311644e201983e9b0201d1e97dfb982c7b84c" exitCode=0
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.674244 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zggrh" event={"ID":"b502f4ab-a954-46f4-a130-e5205b820f09","Type":"ContainerDied","Data":"d978830cb98bf71459d6e7f9b56311644e201983e9b0201d1e97dfb982c7b84c"}
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.674271 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zggrh"
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.674331 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zggrh" event={"ID":"b502f4ab-a954-46f4-a130-e5205b820f09","Type":"ContainerDied","Data":"72324fef7a60410683165b8fbfdac61ba37cf9ee3a8502db7a4c76a9673963c5"}
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.674361 4702 scope.go:117] "RemoveContainer" containerID="d978830cb98bf71459d6e7f9b56311644e201983e9b0201d1e97dfb982c7b84c"
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.681420 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" event={"ID":"6ab9886b-f724-43b9-b365-4896d35349b9","Type":"ContainerStarted","Data":"fb57c8c32b32115c567e407c29eaba71ae06dc30f354a94ce5770d007f3c4e19"}
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.694453 4702 scope.go:117] "RemoveContainer" containerID="bd610c868eab0c82d1e5bc3fe4e06cebfca509285f0c1d6361a75f7b4e2203d4"
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.713924 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zggrh"]
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.720897 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zggrh"]
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.748861 4702 scope.go:117] "RemoveContainer" containerID="b5297805d76d2de1e5f85146cc215292b85a115768ce42f77af3994d5ba32476"
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.766530 4702 scope.go:117] "RemoveContainer" containerID="d978830cb98bf71459d6e7f9b56311644e201983e9b0201d1e97dfb982c7b84c"
Dec 03 11:15:54 crc kubenswrapper[4702]: E1203 11:15:54.767247 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d978830cb98bf71459d6e7f9b56311644e201983e9b0201d1e97dfb982c7b84c\": container with ID starting with d978830cb98bf71459d6e7f9b56311644e201983e9b0201d1e97dfb982c7b84c not found: ID does not exist" containerID="d978830cb98bf71459d6e7f9b56311644e201983e9b0201d1e97dfb982c7b84c"
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.767309 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d978830cb98bf71459d6e7f9b56311644e201983e9b0201d1e97dfb982c7b84c"} err="failed to get container status \"d978830cb98bf71459d6e7f9b56311644e201983e9b0201d1e97dfb982c7b84c\": rpc error: code = NotFound desc = could not find container \"d978830cb98bf71459d6e7f9b56311644e201983e9b0201d1e97dfb982c7b84c\": container with ID starting with d978830cb98bf71459d6e7f9b56311644e201983e9b0201d1e97dfb982c7b84c not found: ID does not exist"
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.767351 4702 scope.go:117] "RemoveContainer" containerID="bd610c868eab0c82d1e5bc3fe4e06cebfca509285f0c1d6361a75f7b4e2203d4"
Dec 03 11:15:54 crc kubenswrapper[4702]: E1203 11:15:54.767642 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd610c868eab0c82d1e5bc3fe4e06cebfca509285f0c1d6361a75f7b4e2203d4\": container with ID starting with bd610c868eab0c82d1e5bc3fe4e06cebfca509285f0c1d6361a75f7b4e2203d4 not found: ID does not exist" containerID="bd610c868eab0c82d1e5bc3fe4e06cebfca509285f0c1d6361a75f7b4e2203d4"
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.767670 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd610c868eab0c82d1e5bc3fe4e06cebfca509285f0c1d6361a75f7b4e2203d4"} err="failed to get container status \"bd610c868eab0c82d1e5bc3fe4e06cebfca509285f0c1d6361a75f7b4e2203d4\": rpc error: code = NotFound desc = could not find container \"bd610c868eab0c82d1e5bc3fe4e06cebfca509285f0c1d6361a75f7b4e2203d4\": container with ID starting with bd610c868eab0c82d1e5bc3fe4e06cebfca509285f0c1d6361a75f7b4e2203d4 not found: ID does not exist"
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.767689 4702 scope.go:117] "RemoveContainer" containerID="b5297805d76d2de1e5f85146cc215292b85a115768ce42f77af3994d5ba32476"
Dec 03 11:15:54 crc kubenswrapper[4702]: E1203 11:15:54.768402 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5297805d76d2de1e5f85146cc215292b85a115768ce42f77af3994d5ba32476\": container with ID starting with b5297805d76d2de1e5f85146cc215292b85a115768ce42f77af3994d5ba32476 not found: ID does not exist" containerID="b5297805d76d2de1e5f85146cc215292b85a115768ce42f77af3994d5ba32476"
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.768461 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5297805d76d2de1e5f85146cc215292b85a115768ce42f77af3994d5ba32476"} err="failed to get container status \"b5297805d76d2de1e5f85146cc215292b85a115768ce42f77af3994d5ba32476\": rpc error: code = NotFound desc = could not find container \"b5297805d76d2de1e5f85146cc215292b85a115768ce42f77af3994d5ba32476\": container with ID starting with b5297805d76d2de1e5f85146cc215292b85a115768ce42f77af3994d5ba32476 not found: ID does not exist"
Dec 03 11:15:54 crc kubenswrapper[4702]: I1203 11:15:54.937843 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b502f4ab-a954-46f4-a130-e5205b820f09" path="/var/lib/kubelet/pods/b502f4ab-a954-46f4-a130-e5205b820f09/volumes"
Dec 03 11:15:58 crc kubenswrapper[4702]: I1203 11:15:58.722589 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" event={"ID":"6ab9886b-f724-43b9-b365-4896d35349b9","Type":"ContainerStarted","Data":"bbb7f9d3a316676e4af9f1dd92a7e8b3bb9e6f8f93a747cc2ff9d72328e76e81"}
Dec 03 11:15:58 crc kubenswrapper[4702]: I1203 11:15:58.723390 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-54ljx"
Dec 03 11:15:58 crc kubenswrapper[4702]: I1203 11:15:58.723401 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-54ljx"
Dec 03 11:15:58 crc kubenswrapper[4702]: I1203 11:15:58.754221 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-55ql9"]
Dec 03 11:15:58 crc kubenswrapper[4702]: I1203 11:15:58.754439 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-55ql9"
Dec 03 11:15:58 crc kubenswrapper[4702]: I1203 11:15:58.757553 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-55ql9"
Dec 03 11:15:58 crc kubenswrapper[4702]: I1203 11:15:58.775094 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" podStartSLOduration=11.775059947 podStartE2EDuration="11.775059947s" podCreationTimestamp="2025-12-03 11:15:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:15:58.772817444 +0000 UTC m=+742.608745938" watchObservedRunningTime="2025-12-03 11:15:58.775059947 +0000 UTC m=+742.610988411"
Dec 03 11:15:58 crc kubenswrapper[4702]: I1203 11:15:58.793376 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-54ljx"
Dec 03 11:15:58 crc kubenswrapper[4702]: I1203 11:15:58.800553 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k"]
Dec 03 11:15:58 crc kubenswrapper[4702]: I1203 11:15:58.800771 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k"
Dec 03 11:15:58 crc kubenswrapper[4702]: I1203 11:15:58.801435 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k"
Dec 03 11:15:58 crc kubenswrapper[4702]: I1203 11:15:58.825586 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-vpr8z"]
Dec 03 11:15:58 crc kubenswrapper[4702]: I1203 11:15:58.835975 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk"]
Dec 03 11:15:58 crc kubenswrapper[4702]: I1203 11:15:58.836170 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk"
Dec 03 11:15:58 crc kubenswrapper[4702]: I1203 11:15:58.841660 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-vpr8z"
Dec 03 11:15:58 crc kubenswrapper[4702]: I1203 11:15:58.846233 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk"
Dec 03 11:15:58 crc kubenswrapper[4702]: I1203 11:15:58.862695 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd"]
Dec 03 11:15:58 crc kubenswrapper[4702]: I1203 11:15:58.864637 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd"
Dec 03 11:15:58 crc kubenswrapper[4702]: I1203 11:15:58.865331 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd"
Dec 03 11:15:58 crc kubenswrapper[4702]: I1203 11:15:58.868986 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-vpr8z"
Dec 03 11:15:58 crc kubenswrapper[4702]: E1203 11:15:58.891497 4702 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-55ql9_openshift-operators_89d80ae9-23a4-4c91-a04d-7343d8a4df05_0(cbab56ce22601888aa6bf480ee8011052d6e6563cbe71918945fbb43f2e9a704): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Dec 03 11:15:58 crc kubenswrapper[4702]: E1203 11:15:58.891624 4702 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-55ql9_openshift-operators_89d80ae9-23a4-4c91-a04d-7343d8a4df05_0(cbab56ce22601888aa6bf480ee8011052d6e6563cbe71918945fbb43f2e9a704): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-55ql9"
Dec 03 11:15:58 crc kubenswrapper[4702]: E1203 11:15:58.891665 4702 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-55ql9_openshift-operators_89d80ae9-23a4-4c91-a04d-7343d8a4df05_0(cbab56ce22601888aa6bf480ee8011052d6e6563cbe71918945fbb43f2e9a704): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-d8bb48f5d-55ql9"
Dec 03 11:15:58 crc kubenswrapper[4702]: E1203 11:15:58.891872 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-d8bb48f5d-55ql9_openshift-operators(89d80ae9-23a4-4c91-a04d-7343d8a4df05)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-d8bb48f5d-55ql9_openshift-operators(89d80ae9-23a4-4c91-a04d-7343d8a4df05)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-d8bb48f5d-55ql9_openshift-operators_89d80ae9-23a4-4c91-a04d-7343d8a4df05_0(cbab56ce22601888aa6bf480ee8011052d6e6563cbe71918945fbb43f2e9a704): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-d8bb48f5d-55ql9" podUID="89d80ae9-23a4-4c91-a04d-7343d8a4df05"
Dec 03 11:15:58 crc kubenswrapper[4702]: E1203 11:15:58.962683 4702 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k_openshift-operators_5e5ed31e-a6b7-494f-ae2e-8c824e717092_0(949c28184ac21dd975b22f90552a6b6f4c59ed9963b14198caa131e652b2b57b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
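As a sanity check on the pod_startup_latency_tracker entry a few lines up: podStartSLOduration is watchObservedRunningTime minus podCreationTimestamp (no image pulls happened here, as the zero-value pull timestamps show, so the pull window contributes nothing). A minimal reproduction of the arithmetic:

```go
package main

import (
	"fmt"
	"time"
)

// Recomputes podStartSLOduration for ovnkube-node-54ljx from the two
// timestamps in the log entry above.
func main() {
	created, _ := time.Parse(time.RFC3339, "2025-12-03T11:15:47Z")
	observed, _ := time.Parse(time.RFC3339Nano, "2025-12-03T11:15:58.775059947Z")
	fmt.Println(observed.Sub(created).Seconds()) // 11.775059947
}
```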
Dec 03 11:15:58 crc kubenswrapper[4702]: E1203 11:15:58.962850 4702 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k_openshift-operators_5e5ed31e-a6b7-494f-ae2e-8c824e717092_0(949c28184ac21dd975b22f90552a6b6f4c59ed9963b14198caa131e652b2b57b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k" Dec 03 11:15:58 crc kubenswrapper[4702]: E1203 11:15:58.962882 4702 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k_openshift-operators_5e5ed31e-a6b7-494f-ae2e-8c824e717092_0(949c28184ac21dd975b22f90552a6b6f4c59ed9963b14198caa131e652b2b57b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k" Dec 03 11:15:58 crc kubenswrapper[4702]: E1203 11:15:58.962966 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k_openshift-operators(5e5ed31e-a6b7-494f-ae2e-8c824e717092)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k_openshift-operators(5e5ed31e-a6b7-494f-ae2e-8c824e717092)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k_openshift-operators_5e5ed31e-a6b7-494f-ae2e-8c824e717092_0(949c28184ac21dd975b22f90552a6b6f4c59ed9963b14198caa131e652b2b57b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k" podUID="5e5ed31e-a6b7-494f-ae2e-8c824e717092" Dec 03 11:15:58 crc kubenswrapper[4702]: E1203 11:15:58.979639 4702 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-vpr8z_openshift-operators_ec0726c3-58ef-4a22-8e00-bae32d7d66ca_0(4ff89e05799e761696d2481c953e4ec7779b02947b740231d314028bd9759755): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 11:15:58 crc kubenswrapper[4702]: E1203 11:15:58.979767 4702 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-vpr8z_openshift-operators_ec0726c3-58ef-4a22-8e00-bae32d7d66ca_0(4ff89e05799e761696d2481c953e4ec7779b02947b740231d314028bd9759755): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" Dec 03 11:15:58 crc kubenswrapper[4702]: E1203 11:15:58.979821 4702 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-vpr8z_openshift-operators_ec0726c3-58ef-4a22-8e00-bae32d7d66ca_0(4ff89e05799e761696d2481c953e4ec7779b02947b740231d314028bd9759755): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5446b9c989-vpr8z" Dec 03 11:15:58 crc kubenswrapper[4702]: E1203 11:15:58.979900 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5446b9c989-vpr8z_openshift-operators(ec0726c3-58ef-4a22-8e00-bae32d7d66ca)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5446b9c989-vpr8z_openshift-operators(ec0726c3-58ef-4a22-8e00-bae32d7d66ca)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5446b9c989-vpr8z_openshift-operators_ec0726c3-58ef-4a22-8e00-bae32d7d66ca_0(4ff89e05799e761696d2481c953e4ec7779b02947b740231d314028bd9759755): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" podUID="ec0726c3-58ef-4a22-8e00-bae32d7d66ca" Dec 03 11:15:58 crc kubenswrapper[4702]: E1203 11:15:58.995961 4702 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd_openshift-operators_a45f4fed-ae1f-4d04-8394-8208bbe31b44_0(24c1cc3be7e98a57e004dc9d0466dc82a69b78b69e3778ba4ca97adbb26118f9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 11:15:58 crc kubenswrapper[4702]: E1203 11:15:58.996070 4702 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd_openshift-operators_a45f4fed-ae1f-4d04-8394-8208bbe31b44_0(24c1cc3be7e98a57e004dc9d0466dc82a69b78b69e3778ba4ca97adbb26118f9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd" Dec 03 11:15:58 crc kubenswrapper[4702]: E1203 11:15:58.996107 4702 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd_openshift-operators_a45f4fed-ae1f-4d04-8394-8208bbe31b44_0(24c1cc3be7e98a57e004dc9d0466dc82a69b78b69e3778ba4ca97adbb26118f9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd" Dec 03 11:15:58 crc kubenswrapper[4702]: E1203 11:15:58.996168 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd_openshift-operators(a45f4fed-ae1f-4d04-8394-8208bbe31b44)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd_openshift-operators(a45f4fed-ae1f-4d04-8394-8208bbe31b44)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd_openshift-operators_a45f4fed-ae1f-4d04-8394-8208bbe31b44_0(24c1cc3be7e98a57e004dc9d0466dc82a69b78b69e3778ba4ca97adbb26118f9): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd" podUID="a45f4fed-ae1f-4d04-8394-8208bbe31b44" Dec 03 11:15:59 crc kubenswrapper[4702]: E1203 11:15:59.012394 4702 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-h9rvk_openshift-operators_db361c90-107c-4510-9683-659b755ebc42_0(ec2cdb787d0c875fe1aaee01e5766dc63005079046022dab666012f2b3077f93): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 11:15:59 crc kubenswrapper[4702]: E1203 11:15:59.012528 4702 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-h9rvk_openshift-operators_db361c90-107c-4510-9683-659b755ebc42_0(ec2cdb787d0c875fe1aaee01e5766dc63005079046022dab666012f2b3077f93): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk" Dec 03 11:15:59 crc kubenswrapper[4702]: E1203 11:15:59.012584 4702 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-h9rvk_openshift-operators_db361c90-107c-4510-9683-659b755ebc42_0(ec2cdb787d0c875fe1aaee01e5766dc63005079046022dab666012f2b3077f93): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk" Dec 03 11:15:59 crc kubenswrapper[4702]: E1203 11:15:59.012664 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-668cf9dfbb-h9rvk_openshift-operators(db361c90-107c-4510-9683-659b755ebc42)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-668cf9dfbb-h9rvk_openshift-operators(db361c90-107c-4510-9683-659b755ebc42)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-668cf9dfbb-h9rvk_openshift-operators_db361c90-107c-4510-9683-659b755ebc42_0(ec2cdb787d0c875fe1aaee01e5766dc63005079046022dab666012f2b3077f93): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk" podUID="db361c90-107c-4510-9683-659b755ebc42" Dec 03 11:15:59 crc kubenswrapper[4702]: I1203 11:15:59.729663 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:15:59 crc kubenswrapper[4702]: I1203 11:15:59.820573 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:16:10 crc kubenswrapper[4702]: I1203 11:16:10.927417 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-55ql9" Dec 03 11:16:10 crc kubenswrapper[4702]: I1203 11:16:10.929570 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-55ql9" Dec 03 11:16:11 crc kubenswrapper[4702]: I1203 11:16:11.241155 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-55ql9"] Dec 03 11:16:11 crc kubenswrapper[4702]: I1203 11:16:11.828784 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-55ql9" event={"ID":"89d80ae9-23a4-4c91-a04d-7343d8a4df05","Type":"ContainerStarted","Data":"958f53201239bd551f120cb00ec9c53fa7b6c165b9be8d1c1ba8677d22fb53fb"} Dec 03 11:16:12 crc kubenswrapper[4702]: I1203 11:16:12.932330 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k" Dec 03 11:16:12 crc kubenswrapper[4702]: I1203 11:16:12.932703 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk" Dec 03 11:16:12 crc kubenswrapper[4702]: I1203 11:16:12.933647 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k" Dec 03 11:16:12 crc kubenswrapper[4702]: I1203 11:16:12.934069 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk" Dec 03 11:16:12 crc kubenswrapper[4702]: I1203 11:16:12.936541 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" Dec 03 11:16:12 crc kubenswrapper[4702]: I1203 11:16:12.937336 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" Dec 03 11:16:13 crc kubenswrapper[4702]: I1203 11:16:13.292576 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk"] Dec 03 11:16:13 crc kubenswrapper[4702]: W1203 11:16:13.316246 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb361c90_107c_4510_9683_659b755ebc42.slice/crio-0917bb22273ccb7389a61cf83102d188648387e3004ae684a8c8f0ff88aa75d5 WatchSource:0}: Error finding container 0917bb22273ccb7389a61cf83102d188648387e3004ae684a8c8f0ff88aa75d5: Status 404 returned error can't find the container with id 0917bb22273ccb7389a61cf83102d188648387e3004ae684a8c8f0ff88aa75d5 Dec 03 11:16:13 crc kubenswrapper[4702]: I1203 11:16:13.556002 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-vpr8z"] Dec 03 11:16:13 crc kubenswrapper[4702]: W1203 11:16:13.574645 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-b9e1d6d075fbcede7bb619d6d24da8d0742f0b2b6a485093e36c547cae7af120 WatchSource:0}: Error finding container b9e1d6d075fbcede7bb619d6d24da8d0742f0b2b6a485093e36c547cae7af120: Status 404 returned error can't find the container with id b9e1d6d075fbcede7bb619d6d24da8d0742f0b2b6a485093e36c547cae7af120 Dec 03 11:16:13 crc kubenswrapper[4702]: I1203 11:16:13.580170 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k"] Dec 03 11:16:13 crc kubenswrapper[4702]: W1203 11:16:13.592949 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e5ed31e_a6b7_494f_ae2e_8c824e717092.slice/crio-9580975f8e62b71bbc171dd7ddb23b1ee7aa51882e5b0ab5d2236760a1f21368 WatchSource:0}: Error finding container 9580975f8e62b71bbc171dd7ddb23b1ee7aa51882e5b0ab5d2236760a1f21368: Status 404 returned error can't find the container with id 9580975f8e62b71bbc171dd7ddb23b1ee7aa51882e5b0ab5d2236760a1f21368 Dec 03 11:16:13 crc kubenswrapper[4702]: I1203 11:16:13.847787 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk" event={"ID":"db361c90-107c-4510-9683-659b755ebc42","Type":"ContainerStarted","Data":"0917bb22273ccb7389a61cf83102d188648387e3004ae684a8c8f0ff88aa75d5"} Dec 03 11:16:13 crc kubenswrapper[4702]: I1203 11:16:13.850145 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k" event={"ID":"5e5ed31e-a6b7-494f-ae2e-8c824e717092","Type":"ContainerStarted","Data":"9580975f8e62b71bbc171dd7ddb23b1ee7aa51882e5b0ab5d2236760a1f21368"} Dec 03 11:16:13 crc kubenswrapper[4702]: I1203 11:16:13.852722 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" event={"ID":"ec0726c3-58ef-4a22-8e00-bae32d7d66ca","Type":"ContainerStarted","Data":"b9e1d6d075fbcede7bb619d6d24da8d0742f0b2b6a485093e36c547cae7af120"} Dec 03 11:16:13 crc kubenswrapper[4702]: I1203 11:16:13.928112 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd" Dec 03 11:16:13 crc kubenswrapper[4702]: I1203 11:16:13.928837 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd" Dec 03 11:16:14 crc kubenswrapper[4702]: I1203 11:16:14.409718 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd"] Dec 03 11:16:14 crc kubenswrapper[4702]: I1203 11:16:14.867210 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd" event={"ID":"a45f4fed-ae1f-4d04-8394-8208bbe31b44","Type":"ContainerStarted","Data":"176fa14a6216612f953ed361638a6647dddc463b951b2b9bca225b218cbafac5"} Dec 03 11:16:18 crc kubenswrapper[4702]: I1203 11:16:18.312574 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" Dec 03 11:16:30 crc kubenswrapper[4702]: E1203 11:16:30.339656 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb" Dec 03 11:16:30 crc kubenswrapper[4702]: E1203 11:16:30.341096 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb,Command:[],Args:[--namespace=$(NAMESPACE) --images=perses=$(RELATED_IMAGE_PERSES) --images=alertmanager=$(RELATED_IMAGE_ALERTMANAGER) --images=prometheus=$(RELATED_IMAGE_PROMETHEUS) --images=thanos=$(RELATED_IMAGE_THANOS) --images=ui-dashboards=$(RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN) --images=ui-distributed-tracing=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN) --images=ui-distributed-tracing-pf5=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5) --images=ui-distributed-tracing-pf4=$(RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4) --images=ui-logging=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN) --images=ui-logging-pf4=$(RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4) --images=ui-troubleshooting-panel=$(RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN) --images=ui-monitoring=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN) --images=ui-monitoring-pf5=$(RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5) --images=korrel8r=$(RELATED_IMAGE_KORREL8R) --images=health-analyzer=$(RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER) 
--openshift.enabled=true],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:RELATED_IMAGE_ALERTMANAGER,Value:registry.redhat.io/cluster-observability-operator/alertmanager-rhel9@sha256:e718854a7d6ca8accf0fa72db0eb902e46c44d747ad51dc3f06bba0cefaa3c01,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS,Value:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_THANOS,Value:registry.redhat.io/cluster-observability-operator/thanos-rhel9@sha256:d972f4faa5e9c121402d23ed85002f26af48ec36b1b71a7489d677b3913d08b4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PERSES,Value:registry.redhat.io/cluster-observability-operator/perses-rhel9@sha256:91531137fc1dcd740e277e0f65e120a0176a16f788c14c27925b61aa0b792ade,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DASHBOARDS_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/dashboards-console-plugin-rhel9@sha256:a69da8bbca8a28dd2925f864d51cc31cf761b10532c553095ba40b242ef701cb,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-rhel9@sha256:897e1bfad1187062725b54d87107bd0155972257a50d8335dd29e1999b828a4f,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf5-rhel9@sha256:95fe5b5746ca8c07ac9217ce2d8ac8e6afad17af210f9d8e0074df1310b209a8,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_DISTRIBUTED_TRACING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/distributed-tracing-console-plugin-pf4-rhel9@sha256:e9d9a89e4d8126a62b1852055482258ee528cac6398dd5d43ebad75ace0f33c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-rhel9@sha256:ec684a0645ceb917b019af7ddba68c3533416e356ab0d0320a30e75ca7ebb31b,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_LOGGING_PLUGIN_PF4,Value:registry.redhat.io/cluster-observability-operator/logging-console-plugin-pf4-rhel9@sha256:3b9693fcde9b3a9494fb04735b1f7cfd0426f10be820fdc3f024175c0d3df1c9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_TROUBLESHOOTING_PANEL_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/troubleshooting-panel-console-plugin-rhel9@sha256:580606f194180accc8abba099e17a26dca7522ec6d233fa2fdd40312771703e3,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-rhel9@sha256:e03777be39e71701935059cd877603874a13ac94daa73219d4e5e545599d78a9,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CONSOLE_MONITORING_PLUGIN_PF5,Value:registry.redhat.io/cluster-observability-operator/monitoring-console-plugin-pf5-rhel9@sha256:aa47256193cfd2877853878e1ae97d2ab8b8e5deae62b387cbfad02b284d379c,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KORREL8R,Value:registry.redhat.io/cluster-observability-operator/korrel8r-rhel9@sha256:c595ff56b2cb85514bf4784db6ddb82e4e657e3e708a7fb695fc4997379a94d4,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLUSTER_HEALTH_ANALYZER,Value:registry.redhat.io/cluster-observability-operator/cluster-health-analyzer-rhel9@sha256:45a4ec2a519bcec99e886aa91
596d5356a2414a2bd103baaef9fa7838c672eb2,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{400 -3} {} 400m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:observability-operator-tls,ReadOnly:true,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ncvtv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod observability-operator-d8bb48f5d-55ql9_openshift-operators(89d80ae9-23a4-4c91-a04d-7343d8a4df05): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 11:16:30 crc kubenswrapper[4702]: E1203 11:16:30.342457 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/observability-operator-d8bb48f5d-55ql9" podUID="89d80ae9-23a4-4c91-a04d-7343d8a4df05" Dec 03 11:16:31 crc kubenswrapper[4702]: E1203 11:16:31.052487 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/cluster-observability-rhel9-operator@sha256:ce7d2904f7b238aa37dfe74a0b76bf73629e7a14fa52bf54b0ecf030ca36f1bb\\\"\"" pod="openshift-operators/observability-operator-d8bb48f5d-55ql9" podUID="89d80ae9-23a4-4c91-a04d-7343d8a4df05" Dec 03 11:16:31 crc kubenswrapper[4702]: E1203 11:16:31.125904 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3" Dec 03 11:16:31 crc 
kubenswrapper[4702]: E1203 11:16:31.126370 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3,Command:[],Args:[--prometheus-config-reloader=$(RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER) --prometheus-instance-selector=app.kubernetes.io/managed-by=observability-operator --alertmanager-instance-selector=app.kubernetes.io/managed-by=observability-operator --thanos-ruler-instance-selector=app.kubernetes.io/managed-by=observability-operator],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOGC,Value:30,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER,Value:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:1133c973c7472c665f910a722e19c8e2e27accb34b90fab67f14548627ce9c62,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{157286400 0} {} 150Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cff88,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-668cf9dfbb-h9rvk_openshift-operators(db361c90-107c-4510-9683-659b755ebc42): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 11:16:31 crc kubenswrapper[4702]: E1203 11:16:31.127643 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk" podUID="db361c90-107c-4510-9683-659b755ebc42" Dec 03 11:16:31 crc kubenswrapper[4702]: E1203 11:16:31.499301 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec" Dec 03 11:16:31 crc kubenswrapper[4702]: E1203 11:16:31.500464 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:43d33f0125e6b990f4a972ac4e952a065d7e72dc1690c6c836963b7341734aec,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k_openshift-operators(5e5ed31e-a6b7-494f-ae2e-8c824e717092): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 11:16:31 crc kubenswrapper[4702]: E1203 11:16:31.501742 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k" podUID="5e5ed31e-a6b7-494f-ae2e-8c824e717092" Dec 03 11:16:31 crc kubenswrapper[4702]: E1203 11:16:31.936931 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385" Dec 03 11:16:31 crc kubenswrapper[4702]: E1203 11:16:31.937219 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:perses-operator,Image:registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: 
{{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{134217728 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openshift-service-ca,ReadOnly:true,MountPath:/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5hf6h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod perses-operator-5446b9c989-vpr8z_openshift-operators(ec0726c3-58ef-4a22-8e00-bae32d7d66ca): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 11:16:31 crc kubenswrapper[4702]: E1203 11:16:31.938339 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" podUID="ec0726c3-58ef-4a22-8e00-bae32d7d66ca" Dec 03 11:16:32 crc kubenswrapper[4702]: E1203 11:16:32.068576 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385\\\"\"" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" podUID="ec0726c3-58ef-4a22-8e00-bae32d7d66ca" Dec 03 11:16:32 crc kubenswrapper[4702]: E1203 11:16:32.069093 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3\\\"\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk" podUID="db361c90-107c-4510-9683-659b755ebc42" Dec 03 11:16:33 crc kubenswrapper[4702]: I1203 11:16:33.076416 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
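
After a pull fails with ErrImagePull (here the image copy was canceled mid-transfer), the kubelet does not retry immediately: the pod enters ImagePullBackOff and the next attempt is delayed exponentially. A sketch of that schedule, assuming the upstream kubelet defaults of a 10 s initial delay doubling up to a 300 s cap (the parameters are not visible in this log):

    def backoff_schedule(initial: int = 10, cap: int = 300, attempts: int = 7):
        """Delays (seconds) between successive image pull retries."""
        delay, delays = initial, []
        for _ in range(attempts):
            delays.append(min(delay, cap))
            delay *= 2
        return delays

    print(backoff_schedule())  # [10, 20, 40, 80, 160, 300, 300]

The retries evidently succeed: the same pods log ContainerStarted shortly afterwards.
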
pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k" event={"ID":"5e5ed31e-a6b7-494f-ae2e-8c824e717092","Type":"ContainerStarted","Data":"e1d99757f300954f300c226e0925fcac9af0e8c87fb07b79e484fe8c63eae99e"} Dec 03 11:16:33 crc kubenswrapper[4702]: I1203 11:16:33.079572 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd" event={"ID":"a45f4fed-ae1f-4d04-8394-8208bbe31b44","Type":"ContainerStarted","Data":"f7655ed734b68212c8f9ff1475c3fb1e5410a1b4392c32cab51d23613fb4c0c2"} Dec 03 11:16:33 crc kubenswrapper[4702]: I1203 11:16:33.111997 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k" podStartSLOduration=-9223371991.742813 podStartE2EDuration="45.111962883s" podCreationTimestamp="2025-12-03 11:15:48 +0000 UTC" firstStartedPulling="2025-12-03 11:16:13.597422452 +0000 UTC m=+757.433350916" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:16:33.095082953 +0000 UTC m=+776.931011437" watchObservedRunningTime="2025-12-03 11:16:33.111962883 +0000 UTC m=+776.947891347" Dec 03 11:16:33 crc kubenswrapper[4702]: I1203 11:16:33.134091 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd" podStartSLOduration=27.605105183 podStartE2EDuration="45.134053281s" podCreationTimestamp="2025-12-03 11:15:48 +0000 UTC" firstStartedPulling="2025-12-03 11:16:14.446570309 +0000 UTC m=+758.282498773" lastFinishedPulling="2025-12-03 11:16:31.975518407 +0000 UTC m=+775.811446871" observedRunningTime="2025-12-03 11:16:33.130233672 +0000 UTC m=+776.966162136" watchObservedRunningTime="2025-12-03 11:16:33.134053281 +0000 UTC m=+776.969981755" Dec 03 11:16:45 crc kubenswrapper[4702]: I1203 11:16:45.186682 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-55ql9" event={"ID":"89d80ae9-23a4-4c91-a04d-7343d8a4df05","Type":"ContainerStarted","Data":"0f28fcc955b94e29136b5c7bf6a55a157126fb16bb2bab32566886f8a71f4e84"} Dec 03 11:16:45 crc kubenswrapper[4702]: I1203 11:16:45.188432 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-55ql9" Dec 03 11:16:45 crc kubenswrapper[4702]: I1203 11:16:45.188725 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" event={"ID":"ec0726c3-58ef-4a22-8e00-bae32d7d66ca","Type":"ContainerStarted","Data":"57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca"} Dec 03 11:16:45 crc kubenswrapper[4702]: I1203 11:16:45.189071 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" Dec 03 11:16:45 crc kubenswrapper[4702]: I1203 11:16:45.216381 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-55ql9" podStartSLOduration=24.368352692 podStartE2EDuration="57.216356997s" podCreationTimestamp="2025-12-03 11:15:48 +0000 UTC" firstStartedPulling="2025-12-03 11:16:11.25666699 +0000 UTC m=+755.092595454" lastFinishedPulling="2025-12-03 11:16:44.104671295 +0000 UTC m=+787.940599759" observedRunningTime="2025-12-03 11:16:45.213510156 +0000 UTC m=+789.049438640" watchObservedRunningTime="2025-12-03 
11:16:45.216356997 +0000 UTC m=+789.052285461" Dec 03 11:16:45 crc kubenswrapper[4702]: I1203 11:16:45.244636 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" podStartSLOduration=26.738146741 podStartE2EDuration="57.244608721s" podCreationTimestamp="2025-12-03 11:15:48 +0000 UTC" firstStartedPulling="2025-12-03 11:16:13.577790544 +0000 UTC m=+757.413719018" lastFinishedPulling="2025-12-03 11:16:44.084252534 +0000 UTC m=+787.920180998" observedRunningTime="2025-12-03 11:16:45.240864534 +0000 UTC m=+789.076793028" watchObservedRunningTime="2025-12-03 11:16:45.244608721 +0000 UTC m=+789.080537185" Dec 03 11:16:45 crc kubenswrapper[4702]: I1203 11:16:45.330673 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-55ql9" Dec 03 11:16:46 crc kubenswrapper[4702]: I1203 11:16:46.196563 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk" event={"ID":"db361c90-107c-4510-9683-659b755ebc42","Type":"ContainerStarted","Data":"472c8605837d8034c42765031c4173624c754b7759c7b555d9c99e240c3c695e"} Dec 03 11:16:46 crc kubenswrapper[4702]: I1203 11:16:46.218213 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-h9rvk" podStartSLOduration=26.544108393 podStartE2EDuration="58.218181206s" podCreationTimestamp="2025-12-03 11:15:48 +0000 UTC" firstStartedPulling="2025-12-03 11:16:13.320061235 +0000 UTC m=+757.155989699" lastFinishedPulling="2025-12-03 11:16:44.994134048 +0000 UTC m=+788.830062512" observedRunningTime="2025-12-03 11:16:46.216749785 +0000 UTC m=+790.052678269" watchObservedRunningTime="2025-12-03 11:16:46.218181206 +0000 UTC m=+790.054109680" Dec 03 11:16:49 crc kubenswrapper[4702]: I1203 11:16:49.179972 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.417749 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-f9zmn"] Dec 03 11:16:53 crc kubenswrapper[4702]: E1203 11:16:53.419073 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b502f4ab-a954-46f4-a130-e5205b820f09" containerName="extract-utilities" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.419098 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="b502f4ab-a954-46f4-a130-e5205b820f09" containerName="extract-utilities" Dec 03 11:16:53 crc kubenswrapper[4702]: E1203 11:16:53.419133 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b502f4ab-a954-46f4-a130-e5205b820f09" containerName="extract-content" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.419144 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="b502f4ab-a954-46f4-a130-e5205b820f09" containerName="extract-content" Dec 03 11:16:53 crc kubenswrapper[4702]: E1203 11:16:53.419159 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b502f4ab-a954-46f4-a130-e5205b820f09" containerName="registry-server" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.419168 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="b502f4ab-a954-46f4-a130-e5205b820f09" containerName="registry-server" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.419384 4702 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="b502f4ab-a954-46f4-a130-e5205b820f09" containerName="registry-server" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.420268 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-f9zmn" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.423280 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.425970 4702 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-sp7nn" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.428077 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.443808 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-kt9ll"] Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.444960 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-kt9ll" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.449447 4702 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-mgkfn" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.457634 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-f9zmn"] Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.464057 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-kt9ll"] Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.479976 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-l8p84"] Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.481378 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-l8p84" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.488172 4702 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-sttgq" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.505930 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-l8p84"] Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.520696 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76xp9\" (UniqueName: \"kubernetes.io/projected/c3476e51-0ccc-41ee-8d43-4bf8b59a6bbe-kube-api-access-76xp9\") pod \"cert-manager-5b446d88c5-kt9ll\" (UID: \"c3476e51-0ccc-41ee-8d43-4bf8b59a6bbe\") " pod="cert-manager/cert-manager-5b446d88c5-kt9ll" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.520802 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bf2q\" (UniqueName: \"kubernetes.io/projected/d966c899-cb05-41ea-b10b-820da56925f6-kube-api-access-5bf2q\") pod \"cert-manager-cainjector-7f985d654d-f9zmn\" (UID: \"d966c899-cb05-41ea-b10b-820da56925f6\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-f9zmn" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.622258 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76xp9\" (UniqueName: \"kubernetes.io/projected/c3476e51-0ccc-41ee-8d43-4bf8b59a6bbe-kube-api-access-76xp9\") pod \"cert-manager-5b446d88c5-kt9ll\" (UID: \"c3476e51-0ccc-41ee-8d43-4bf8b59a6bbe\") " pod="cert-manager/cert-manager-5b446d88c5-kt9ll" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.622355 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bf2q\" (UniqueName: \"kubernetes.io/projected/d966c899-cb05-41ea-b10b-820da56925f6-kube-api-access-5bf2q\") pod \"cert-manager-cainjector-7f985d654d-f9zmn\" (UID: \"d966c899-cb05-41ea-b10b-820da56925f6\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-f9zmn" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.622402 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpl8w\" (UniqueName: \"kubernetes.io/projected/40ccf765-6eb2-49e3-8f2c-635b1981639e-kube-api-access-tpl8w\") pod \"cert-manager-webhook-5655c58dd6-l8p84\" (UID: \"40ccf765-6eb2-49e3-8f2c-635b1981639e\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-l8p84" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.652299 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76xp9\" (UniqueName: \"kubernetes.io/projected/c3476e51-0ccc-41ee-8d43-4bf8b59a6bbe-kube-api-access-76xp9\") pod \"cert-manager-5b446d88c5-kt9ll\" (UID: \"c3476e51-0ccc-41ee-8d43-4bf8b59a6bbe\") " pod="cert-manager/cert-manager-5b446d88c5-kt9ll" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.661509 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bf2q\" (UniqueName: \"kubernetes.io/projected/d966c899-cb05-41ea-b10b-820da56925f6-kube-api-access-5bf2q\") pod \"cert-manager-cainjector-7f985d654d-f9zmn\" (UID: \"d966c899-cb05-41ea-b10b-820da56925f6\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-f9zmn" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.724079 4702 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-tpl8w\" (UniqueName: \"kubernetes.io/projected/40ccf765-6eb2-49e3-8f2c-635b1981639e-kube-api-access-tpl8w\") pod \"cert-manager-webhook-5655c58dd6-l8p84\" (UID: \"40ccf765-6eb2-49e3-8f2c-635b1981639e\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-l8p84" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.743910 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-f9zmn" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.756012 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpl8w\" (UniqueName: \"kubernetes.io/projected/40ccf765-6eb2-49e3-8f2c-635b1981639e-kube-api-access-tpl8w\") pod \"cert-manager-webhook-5655c58dd6-l8p84\" (UID: \"40ccf765-6eb2-49e3-8f2c-635b1981639e\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-l8p84" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.771721 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-kt9ll" Dec 03 11:16:53 crc kubenswrapper[4702]: I1203 11:16:53.810164 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-l8p84" Dec 03 11:16:54 crc kubenswrapper[4702]: I1203 11:16:54.494549 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-f9zmn"] Dec 03 11:16:54 crc kubenswrapper[4702]: I1203 11:16:54.573377 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-l8p84"] Dec 03 11:16:54 crc kubenswrapper[4702]: I1203 11:16:54.617179 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-kt9ll"] Dec 03 11:16:54 crc kubenswrapper[4702]: W1203 11:16:54.621750 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3476e51_0ccc_41ee_8d43_4bf8b59a6bbe.slice/crio-19fe9b066fed1f283fe9d78d5add7a5ed09fd70984e557e26b5d1d90fb675b62 WatchSource:0}: Error finding container 19fe9b066fed1f283fe9d78d5add7a5ed09fd70984e557e26b5d1d90fb675b62: Status 404 returned error can't find the container with id 19fe9b066fed1f283fe9d78d5add7a5ed09fd70984e557e26b5d1d90fb675b62 Dec 03 11:16:55 crc kubenswrapper[4702]: I1203 11:16:55.290293 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-kt9ll" event={"ID":"c3476e51-0ccc-41ee-8d43-4bf8b59a6bbe","Type":"ContainerStarted","Data":"19fe9b066fed1f283fe9d78d5add7a5ed09fd70984e557e26b5d1d90fb675b62"} Dec 03 11:16:55 crc kubenswrapper[4702]: I1203 11:16:55.292321 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-f9zmn" event={"ID":"d966c899-cb05-41ea-b10b-820da56925f6","Type":"ContainerStarted","Data":"1254b912cca5d40483e3df263e6b25e354b671a9872bc6a2a16fce8a1da0a6e0"} Dec 03 11:16:55 crc kubenswrapper[4702]: I1203 11:16:55.294970 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-l8p84" event={"ID":"40ccf765-6eb2-49e3-8f2c-635b1981639e","Type":"ContainerStarted","Data":"23710d31c371f3a24cce4faeda5d3a0fc7d740cda0542fe3444a36b54d9806d1"} Dec 03 11:16:55 crc kubenswrapper[4702]: I1203 11:16:55.908139 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:16:55 crc kubenswrapper[4702]: I1203 11:16:55.908235 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:17:02 crc kubenswrapper[4702]: I1203 11:17:02.478544 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-kt9ll" event={"ID":"c3476e51-0ccc-41ee-8d43-4bf8b59a6bbe","Type":"ContainerStarted","Data":"f848f0bd416ffa29880b58321d6c8cf8e927ab35b740bcfe430b723ea4747c70"} Dec 03 11:17:02 crc kubenswrapper[4702]: I1203 11:17:02.482562 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-f9zmn" event={"ID":"d966c899-cb05-41ea-b10b-820da56925f6","Type":"ContainerStarted","Data":"9648ab135fc470604d1a98aaffe4a79a528f9f3e0f802e80b60d0ad66e8d00d0"} Dec 03 11:17:02 crc kubenswrapper[4702]: I1203 11:17:02.490428 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-l8p84" event={"ID":"40ccf765-6eb2-49e3-8f2c-635b1981639e","Type":"ContainerStarted","Data":"2797a8f87be81dcc320552ada4e2001bb538bd1d8c7a45ab81d8faecd48a935d"} Dec 03 11:17:02 crc kubenswrapper[4702]: I1203 11:17:02.491704 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-l8p84" Dec 03 11:17:02 crc kubenswrapper[4702]: I1203 11:17:02.574324 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-kt9ll" podStartSLOduration=2.304863827 podStartE2EDuration="9.574289693s" podCreationTimestamp="2025-12-03 11:16:53 +0000 UTC" firstStartedPulling="2025-12-03 11:16:54.624730656 +0000 UTC m=+798.460659120" lastFinishedPulling="2025-12-03 11:17:01.894156522 +0000 UTC m=+805.730084986" observedRunningTime="2025-12-03 11:17:02.564567716 +0000 UTC m=+806.400496190" watchObservedRunningTime="2025-12-03 11:17:02.574289693 +0000 UTC m=+806.410218157" Dec 03 11:17:02 crc kubenswrapper[4702]: I1203 11:17:02.602177 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-f9zmn" podStartSLOduration=2.136756326 podStartE2EDuration="9.602135984s" podCreationTimestamp="2025-12-03 11:16:53 +0000 UTC" firstStartedPulling="2025-12-03 11:16:54.510985902 +0000 UTC m=+798.346914366" lastFinishedPulling="2025-12-03 11:17:01.97636556 +0000 UTC m=+805.812294024" observedRunningTime="2025-12-03 11:17:02.593135378 +0000 UTC m=+806.429063842" watchObservedRunningTime="2025-12-03 11:17:02.602135984 +0000 UTC m=+806.438064448" Dec 03 11:17:02 crc kubenswrapper[4702]: I1203 11:17:02.639986 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-l8p84" podStartSLOduration=2.330100585 podStartE2EDuration="9.63994745s" podCreationTimestamp="2025-12-03 11:16:53 +0000 UTC" firstStartedPulling="2025-12-03 11:16:54.575526897 +0000 UTC m=+798.411455361" lastFinishedPulling="2025-12-03 11:17:01.885373762 +0000 UTC m=+805.721302226" observedRunningTime="2025-12-03 11:17:02.635349089 +0000 UTC m=+806.471277553" watchObservedRunningTime="2025-12-03 11:17:02.63994745 
+0000 UTC m=+806.475875914" Dec 03 11:17:08 crc kubenswrapper[4702]: I1203 11:17:08.813252 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-l8p84" Dec 03 11:17:25 crc kubenswrapper[4702]: I1203 11:17:25.911869 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:17:25 crc kubenswrapper[4702]: I1203 11:17:25.912756 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.390259 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk"] Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.392635 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.396302 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.408726 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk"] Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.469306 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aabceec8-9509-4d66-af3e-1b9d9a270b38-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk\" (UID: \"aabceec8-9509-4d66-af3e-1b9d9a270b38\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.469692 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aabceec8-9509-4d66-af3e-1b9d9a270b38-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk\" (UID: \"aabceec8-9509-4d66-af3e-1b9d9a270b38\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.469885 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85t4x\" (UniqueName: \"kubernetes.io/projected/aabceec8-9509-4d66-af3e-1b9d9a270b38-kube-api-access-85t4x\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk\" (UID: \"aabceec8-9509-4d66-af3e-1b9d9a270b38\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.548102 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx"] Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.549427 4702 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.561619 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx"] Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.570929 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85t4x\" (UniqueName: \"kubernetes.io/projected/aabceec8-9509-4d66-af3e-1b9d9a270b38-kube-api-access-85t4x\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk\" (UID: \"aabceec8-9509-4d66-af3e-1b9d9a270b38\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.570994 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aabceec8-9509-4d66-af3e-1b9d9a270b38-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk\" (UID: \"aabceec8-9509-4d66-af3e-1b9d9a270b38\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.571042 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be1348c4-10f9-4f68-9ade-51ff820cd05a-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx\" (UID: \"be1348c4-10f9-4f68-9ade-51ff820cd05a\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.571076 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be1348c4-10f9-4f68-9ade-51ff820cd05a-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx\" (UID: \"be1348c4-10f9-4f68-9ade-51ff820cd05a\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.571101 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aabceec8-9509-4d66-af3e-1b9d9a270b38-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk\" (UID: \"aabceec8-9509-4d66-af3e-1b9d9a270b38\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.571121 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndk6m\" (UniqueName: \"kubernetes.io/projected/be1348c4-10f9-4f68-9ade-51ff820cd05a-kube-api-access-ndk6m\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx\" (UID: \"be1348c4-10f9-4f68-9ade-51ff820cd05a\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.571567 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aabceec8-9509-4d66-af3e-1b9d9a270b38-util\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk\" (UID: \"aabceec8-9509-4d66-af3e-1b9d9a270b38\") " 
pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.572095 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aabceec8-9509-4d66-af3e-1b9d9a270b38-bundle\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk\" (UID: \"aabceec8-9509-4d66-af3e-1b9d9a270b38\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.608219 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85t4x\" (UniqueName: \"kubernetes.io/projected/aabceec8-9509-4d66-af3e-1b9d9a270b38-kube-api-access-85t4x\") pod \"4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk\" (UID: \"aabceec8-9509-4d66-af3e-1b9d9a270b38\") " pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.671899 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be1348c4-10f9-4f68-9ade-51ff820cd05a-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx\" (UID: \"be1348c4-10f9-4f68-9ade-51ff820cd05a\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.671982 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be1348c4-10f9-4f68-9ade-51ff820cd05a-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx\" (UID: \"be1348c4-10f9-4f68-9ade-51ff820cd05a\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.672023 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndk6m\" (UniqueName: \"kubernetes.io/projected/be1348c4-10f9-4f68-9ade-51ff820cd05a-kube-api-access-ndk6m\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx\" (UID: \"be1348c4-10f9-4f68-9ade-51ff820cd05a\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.672685 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be1348c4-10f9-4f68-9ade-51ff820cd05a-util\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx\" (UID: \"be1348c4-10f9-4f68-9ade-51ff820cd05a\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.672683 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be1348c4-10f9-4f68-9ade-51ff820cd05a-bundle\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx\" (UID: \"be1348c4-10f9-4f68-9ade-51ff820cd05a\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.695992 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndk6m\" (UniqueName: 
\"kubernetes.io/projected/be1348c4-10f9-4f68-9ade-51ff820cd05a-kube-api-access-ndk6m\") pod \"a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx\" (UID: \"be1348c4-10f9-4f68-9ade-51ff820cd05a\") " pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.709661 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk" Dec 03 11:17:37 crc kubenswrapper[4702]: I1203 11:17:37.863785 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx" Dec 03 11:17:38 crc kubenswrapper[4702]: I1203 11:17:38.017278 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk"] Dec 03 11:17:38 crc kubenswrapper[4702]: I1203 11:17:38.106098 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx"] Dec 03 11:17:38 crc kubenswrapper[4702]: I1203 11:17:38.770279 4702 generic.go:334] "Generic (PLEG): container finished" podID="be1348c4-10f9-4f68-9ade-51ff820cd05a" containerID="3e19a098bc6bebdfae0e3dc4a8875e513c940a8054c0564404a2d0c654faad10" exitCode=0 Dec 03 11:17:38 crc kubenswrapper[4702]: I1203 11:17:38.770340 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx" event={"ID":"be1348c4-10f9-4f68-9ade-51ff820cd05a","Type":"ContainerDied","Data":"3e19a098bc6bebdfae0e3dc4a8875e513c940a8054c0564404a2d0c654faad10"} Dec 03 11:17:38 crc kubenswrapper[4702]: I1203 11:17:38.770956 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx" event={"ID":"be1348c4-10f9-4f68-9ade-51ff820cd05a","Type":"ContainerStarted","Data":"3deabeb6a93c6b620f73017488df17f29e0ec695e634c73cddd6d3a4992a8f9f"} Dec 03 11:17:38 crc kubenswrapper[4702]: I1203 11:17:38.772983 4702 generic.go:334] "Generic (PLEG): container finished" podID="aabceec8-9509-4d66-af3e-1b9d9a270b38" containerID="e562e7535e587b3e1fd9d50244215777923323606552806dbaebd1a64740f094" exitCode=0 Dec 03 11:17:38 crc kubenswrapper[4702]: I1203 11:17:38.773047 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk" event={"ID":"aabceec8-9509-4d66-af3e-1b9d9a270b38","Type":"ContainerDied","Data":"e562e7535e587b3e1fd9d50244215777923323606552806dbaebd1a64740f094"} Dec 03 11:17:38 crc kubenswrapper[4702]: I1203 11:17:38.773093 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk" event={"ID":"aabceec8-9509-4d66-af3e-1b9d9a270b38","Type":"ContainerStarted","Data":"fc98b3247d367873c8b39b5266d34aa4f81df41b76a68c0b43809be0222bc3c7"} Dec 03 11:17:40 crc kubenswrapper[4702]: I1203 11:17:40.791953 4702 generic.go:334] "Generic (PLEG): container finished" podID="be1348c4-10f9-4f68-9ade-51ff820cd05a" containerID="8702edcb4f0809017858788ff64a4e079ef0970f00b3ab26854926d94d1d9647" exitCode=0 Dec 03 11:17:40 crc kubenswrapper[4702]: I1203 11:17:40.792058 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx" event={"ID":"be1348c4-10f9-4f68-9ade-51ff820cd05a","Type":"ContainerDied","Data":"8702edcb4f0809017858788ff64a4e079ef0970f00b3ab26854926d94d1d9647"} Dec 03 11:17:40 crc kubenswrapper[4702]: I1203 11:17:40.796372 4702 generic.go:334] "Generic (PLEG): container finished" podID="aabceec8-9509-4d66-af3e-1b9d9a270b38" containerID="86899c6984283006bfcd9616d3e9206a3464df4368b865019ec19126713755fd" exitCode=0 Dec 03 11:17:40 crc kubenswrapper[4702]: I1203 11:17:40.796410 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk" event={"ID":"aabceec8-9509-4d66-af3e-1b9d9a270b38","Type":"ContainerDied","Data":"86899c6984283006bfcd9616d3e9206a3464df4368b865019ec19126713755fd"} Dec 03 11:17:41 crc kubenswrapper[4702]: I1203 11:17:41.807285 4702 generic.go:334] "Generic (PLEG): container finished" podID="be1348c4-10f9-4f68-9ade-51ff820cd05a" containerID="bca734f14e19a7061af12b9233114475fbc3ea59b4b0dfd0b1d937b25e366915" exitCode=0 Dec 03 11:17:41 crc kubenswrapper[4702]: I1203 11:17:41.807357 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx" event={"ID":"be1348c4-10f9-4f68-9ade-51ff820cd05a","Type":"ContainerDied","Data":"bca734f14e19a7061af12b9233114475fbc3ea59b4b0dfd0b1d937b25e366915"} Dec 03 11:17:41 crc kubenswrapper[4702]: I1203 11:17:41.810333 4702 generic.go:334] "Generic (PLEG): container finished" podID="aabceec8-9509-4d66-af3e-1b9d9a270b38" containerID="b74ee677e862366cd5c81e61e9119cf781be314491c90426882f30235bf1f786" exitCode=0 Dec 03 11:17:41 crc kubenswrapper[4702]: I1203 11:17:41.810379 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk" event={"ID":"aabceec8-9509-4d66-af3e-1b9d9a270b38","Type":"ContainerDied","Data":"b74ee677e862366cd5c81e61e9119cf781be314491c90426882f30235bf1f786"} Dec 03 11:17:43 crc kubenswrapper[4702]: I1203 11:17:43.485667 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk" Dec 03 11:17:43 crc kubenswrapper[4702]: I1203 11:17:43.489522 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx" Dec 03 11:17:43 crc kubenswrapper[4702]: I1203 11:17:43.584598 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aabceec8-9509-4d66-af3e-1b9d9a270b38-util\") pod \"aabceec8-9509-4d66-af3e-1b9d9a270b38\" (UID: \"aabceec8-9509-4d66-af3e-1b9d9a270b38\") " Dec 03 11:17:43 crc kubenswrapper[4702]: I1203 11:17:43.584721 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aabceec8-9509-4d66-af3e-1b9d9a270b38-bundle\") pod \"aabceec8-9509-4d66-af3e-1b9d9a270b38\" (UID: \"aabceec8-9509-4d66-af3e-1b9d9a270b38\") " Dec 03 11:17:43 crc kubenswrapper[4702]: I1203 11:17:43.584793 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndk6m\" (UniqueName: \"kubernetes.io/projected/be1348c4-10f9-4f68-9ade-51ff820cd05a-kube-api-access-ndk6m\") pod \"be1348c4-10f9-4f68-9ade-51ff820cd05a\" (UID: \"be1348c4-10f9-4f68-9ade-51ff820cd05a\") " Dec 03 11:17:43 crc kubenswrapper[4702]: I1203 11:17:43.584839 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be1348c4-10f9-4f68-9ade-51ff820cd05a-bundle\") pod \"be1348c4-10f9-4f68-9ade-51ff820cd05a\" (UID: \"be1348c4-10f9-4f68-9ade-51ff820cd05a\") " Dec 03 11:17:43 crc kubenswrapper[4702]: I1203 11:17:43.584899 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85t4x\" (UniqueName: \"kubernetes.io/projected/aabceec8-9509-4d66-af3e-1b9d9a270b38-kube-api-access-85t4x\") pod \"aabceec8-9509-4d66-af3e-1b9d9a270b38\" (UID: \"aabceec8-9509-4d66-af3e-1b9d9a270b38\") " Dec 03 11:17:43 crc kubenswrapper[4702]: I1203 11:17:43.584926 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be1348c4-10f9-4f68-9ade-51ff820cd05a-util\") pod \"be1348c4-10f9-4f68-9ade-51ff820cd05a\" (UID: \"be1348c4-10f9-4f68-9ade-51ff820cd05a\") " Dec 03 11:17:43 crc kubenswrapper[4702]: I1203 11:17:43.586310 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aabceec8-9509-4d66-af3e-1b9d9a270b38-bundle" (OuterVolumeSpecName: "bundle") pod "aabceec8-9509-4d66-af3e-1b9d9a270b38" (UID: "aabceec8-9509-4d66-af3e-1b9d9a270b38"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:17:43 crc kubenswrapper[4702]: I1203 11:17:43.586364 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be1348c4-10f9-4f68-9ade-51ff820cd05a-bundle" (OuterVolumeSpecName: "bundle") pod "be1348c4-10f9-4f68-9ade-51ff820cd05a" (UID: "be1348c4-10f9-4f68-9ade-51ff820cd05a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:17:43 crc kubenswrapper[4702]: I1203 11:17:43.592635 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be1348c4-10f9-4f68-9ade-51ff820cd05a-kube-api-access-ndk6m" (OuterVolumeSpecName: "kube-api-access-ndk6m") pod "be1348c4-10f9-4f68-9ade-51ff820cd05a" (UID: "be1348c4-10f9-4f68-9ade-51ff820cd05a"). InnerVolumeSpecName "kube-api-access-ndk6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:17:43 crc kubenswrapper[4702]: I1203 11:17:43.593120 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aabceec8-9509-4d66-af3e-1b9d9a270b38-kube-api-access-85t4x" (OuterVolumeSpecName: "kube-api-access-85t4x") pod "aabceec8-9509-4d66-af3e-1b9d9a270b38" (UID: "aabceec8-9509-4d66-af3e-1b9d9a270b38"). InnerVolumeSpecName "kube-api-access-85t4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:17:43 crc kubenswrapper[4702]: I1203 11:17:43.688106 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85t4x\" (UniqueName: \"kubernetes.io/projected/aabceec8-9509-4d66-af3e-1b9d9a270b38-kube-api-access-85t4x\") on node \"crc\" DevicePath \"\"" Dec 03 11:17:43 crc kubenswrapper[4702]: I1203 11:17:43.688143 4702 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aabceec8-9509-4d66-af3e-1b9d9a270b38-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:17:43 crc kubenswrapper[4702]: I1203 11:17:43.688164 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndk6m\" (UniqueName: \"kubernetes.io/projected/be1348c4-10f9-4f68-9ade-51ff820cd05a-kube-api-access-ndk6m\") on node \"crc\" DevicePath \"\"" Dec 03 11:17:43 crc kubenswrapper[4702]: I1203 11:17:43.688173 4702 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/be1348c4-10f9-4f68-9ade-51ff820cd05a-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:17:43 crc kubenswrapper[4702]: I1203 11:17:43.826624 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk" event={"ID":"aabceec8-9509-4d66-af3e-1b9d9a270b38","Type":"ContainerDied","Data":"fc98b3247d367873c8b39b5266d34aa4f81df41b76a68c0b43809be0222bc3c7"} Dec 03 11:17:43 crc kubenswrapper[4702]: I1203 11:17:43.827005 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc98b3247d367873c8b39b5266d34aa4f81df41b76a68c0b43809be0222bc3c7" Dec 03 11:17:43 crc kubenswrapper[4702]: I1203 11:17:43.826652 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk" Dec 03 11:17:43 crc kubenswrapper[4702]: I1203 11:17:43.829128 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx" event={"ID":"be1348c4-10f9-4f68-9ade-51ff820cd05a","Type":"ContainerDied","Data":"3deabeb6a93c6b620f73017488df17f29e0ec695e634c73cddd6d3a4992a8f9f"} Dec 03 11:17:43 crc kubenswrapper[4702]: I1203 11:17:43.829153 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3deabeb6a93c6b620f73017488df17f29e0ec695e634c73cddd6d3a4992a8f9f" Dec 03 11:17:43 crc kubenswrapper[4702]: I1203 11:17:43.829254 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx" Dec 03 11:17:44 crc kubenswrapper[4702]: I1203 11:17:44.172670 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be1348c4-10f9-4f68-9ade-51ff820cd05a-util" (OuterVolumeSpecName: "util") pod "be1348c4-10f9-4f68-9ade-51ff820cd05a" (UID: "be1348c4-10f9-4f68-9ade-51ff820cd05a"). 
InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:17:44 crc kubenswrapper[4702]: I1203 11:17:44.187921 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aabceec8-9509-4d66-af3e-1b9d9a270b38-util" (OuterVolumeSpecName: "util") pod "aabceec8-9509-4d66-af3e-1b9d9a270b38" (UID: "aabceec8-9509-4d66-af3e-1b9d9a270b38"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:17:44 crc kubenswrapper[4702]: I1203 11:17:44.197178 4702 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aabceec8-9509-4d66-af3e-1b9d9a270b38-util\") on node \"crc\" DevicePath \"\"" Dec 03 11:17:44 crc kubenswrapper[4702]: I1203 11:17:44.197226 4702 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/be1348c4-10f9-4f68-9ade-51ff820cd05a-util\") on node \"crc\" DevicePath \"\"" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.311033 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5vwrv"] Dec 03 11:17:55 crc kubenswrapper[4702]: E1203 11:17:55.312203 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be1348c4-10f9-4f68-9ade-51ff820cd05a" containerName="pull" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.312220 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="be1348c4-10f9-4f68-9ade-51ff820cd05a" containerName="pull" Dec 03 11:17:55 crc kubenswrapper[4702]: E1203 11:17:55.312236 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be1348c4-10f9-4f68-9ade-51ff820cd05a" containerName="extract" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.312244 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="be1348c4-10f9-4f68-9ade-51ff820cd05a" containerName="extract" Dec 03 11:17:55 crc kubenswrapper[4702]: E1203 11:17:55.312256 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aabceec8-9509-4d66-af3e-1b9d9a270b38" containerName="pull" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.312264 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="aabceec8-9509-4d66-af3e-1b9d9a270b38" containerName="pull" Dec 03 11:17:55 crc kubenswrapper[4702]: E1203 11:17:55.312288 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aabceec8-9509-4d66-af3e-1b9d9a270b38" containerName="util" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.312296 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="aabceec8-9509-4d66-af3e-1b9d9a270b38" containerName="util" Dec 03 11:17:55 crc kubenswrapper[4702]: E1203 11:17:55.312308 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be1348c4-10f9-4f68-9ade-51ff820cd05a" containerName="util" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.312316 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="be1348c4-10f9-4f68-9ade-51ff820cd05a" containerName="util" Dec 03 11:17:55 crc kubenswrapper[4702]: E1203 11:17:55.312329 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aabceec8-9509-4d66-af3e-1b9d9a270b38" containerName="extract" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.312336 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="aabceec8-9509-4d66-af3e-1b9d9a270b38" containerName="extract" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.312510 4702 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="be1348c4-10f9-4f68-9ade-51ff820cd05a" containerName="extract" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.312524 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="aabceec8-9509-4d66-af3e-1b9d9a270b38" containerName="extract" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.313691 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5vwrv" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.335960 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5vwrv"] Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.399125 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d7e166-053a-4fac-8151-b337aaa2aac6-utilities\") pod \"community-operators-5vwrv\" (UID: \"26d7e166-053a-4fac-8151-b337aaa2aac6\") " pod="openshift-marketplace/community-operators-5vwrv" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.399979 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d7e166-053a-4fac-8151-b337aaa2aac6-catalog-content\") pod \"community-operators-5vwrv\" (UID: \"26d7e166-053a-4fac-8151-b337aaa2aac6\") " pod="openshift-marketplace/community-operators-5vwrv" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.400112 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swjl6\" (UniqueName: \"kubernetes.io/projected/26d7e166-053a-4fac-8151-b337aaa2aac6-kube-api-access-swjl6\") pod \"community-operators-5vwrv\" (UID: \"26d7e166-053a-4fac-8151-b337aaa2aac6\") " pod="openshift-marketplace/community-operators-5vwrv" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.502042 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d7e166-053a-4fac-8151-b337aaa2aac6-catalog-content\") pod \"community-operators-5vwrv\" (UID: \"26d7e166-053a-4fac-8151-b337aaa2aac6\") " pod="openshift-marketplace/community-operators-5vwrv" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.502087 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swjl6\" (UniqueName: \"kubernetes.io/projected/26d7e166-053a-4fac-8151-b337aaa2aac6-kube-api-access-swjl6\") pod \"community-operators-5vwrv\" (UID: \"26d7e166-053a-4fac-8151-b337aaa2aac6\") " pod="openshift-marketplace/community-operators-5vwrv" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.502158 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d7e166-053a-4fac-8151-b337aaa2aac6-utilities\") pod \"community-operators-5vwrv\" (UID: \"26d7e166-053a-4fac-8151-b337aaa2aac6\") " pod="openshift-marketplace/community-operators-5vwrv" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.502604 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d7e166-053a-4fac-8151-b337aaa2aac6-catalog-content\") pod \"community-operators-5vwrv\" (UID: \"26d7e166-053a-4fac-8151-b337aaa2aac6\") " pod="openshift-marketplace/community-operators-5vwrv" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.502680 4702 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d7e166-053a-4fac-8151-b337aaa2aac6-utilities\") pod \"community-operators-5vwrv\" (UID: \"26d7e166-053a-4fac-8151-b337aaa2aac6\") " pod="openshift-marketplace/community-operators-5vwrv" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.522198 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swjl6\" (UniqueName: \"kubernetes.io/projected/26d7e166-053a-4fac-8151-b337aaa2aac6-kube-api-access-swjl6\") pod \"community-operators-5vwrv\" (UID: \"26d7e166-053a-4fac-8151-b337aaa2aac6\") " pod="openshift-marketplace/community-operators-5vwrv" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.587579 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79"] Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.588787 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.590534 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-zjbhh" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.590537 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.591116 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.591745 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.592213 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.598896 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.613863 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79"] Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.683098 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5vwrv" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.705334 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgk9l\" (UniqueName: \"kubernetes.io/projected/672e4a37-26c7-4378-a524-57fba88aec53-kube-api-access-rgk9l\") pod \"loki-operator-controller-manager-5688675f7c-q6w79\" (UID: \"672e4a37-26c7-4378-a524-57fba88aec53\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.705403 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/672e4a37-26c7-4378-a524-57fba88aec53-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5688675f7c-q6w79\" (UID: \"672e4a37-26c7-4378-a524-57fba88aec53\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.705553 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/672e4a37-26c7-4378-a524-57fba88aec53-webhook-cert\") pod \"loki-operator-controller-manager-5688675f7c-q6w79\" (UID: \"672e4a37-26c7-4378-a524-57fba88aec53\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.705666 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/672e4a37-26c7-4378-a524-57fba88aec53-apiservice-cert\") pod \"loki-operator-controller-manager-5688675f7c-q6w79\" (UID: \"672e4a37-26c7-4378-a524-57fba88aec53\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.705747 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/672e4a37-26c7-4378-a524-57fba88aec53-manager-config\") pod \"loki-operator-controller-manager-5688675f7c-q6w79\" (UID: \"672e4a37-26c7-4378-a524-57fba88aec53\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.818082 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/672e4a37-26c7-4378-a524-57fba88aec53-apiservice-cert\") pod \"loki-operator-controller-manager-5688675f7c-q6w79\" (UID: \"672e4a37-26c7-4378-a524-57fba88aec53\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.818169 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/672e4a37-26c7-4378-a524-57fba88aec53-manager-config\") pod \"loki-operator-controller-manager-5688675f7c-q6w79\" (UID: \"672e4a37-26c7-4378-a524-57fba88aec53\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.818257 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgk9l\" (UniqueName: 
\"kubernetes.io/projected/672e4a37-26c7-4378-a524-57fba88aec53-kube-api-access-rgk9l\") pod \"loki-operator-controller-manager-5688675f7c-q6w79\" (UID: \"672e4a37-26c7-4378-a524-57fba88aec53\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.818282 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/672e4a37-26c7-4378-a524-57fba88aec53-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5688675f7c-q6w79\" (UID: \"672e4a37-26c7-4378-a524-57fba88aec53\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.818320 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/672e4a37-26c7-4378-a524-57fba88aec53-webhook-cert\") pod \"loki-operator-controller-manager-5688675f7c-q6w79\" (UID: \"672e4a37-26c7-4378-a524-57fba88aec53\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.820314 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/672e4a37-26c7-4378-a524-57fba88aec53-manager-config\") pod \"loki-operator-controller-manager-5688675f7c-q6w79\" (UID: \"672e4a37-26c7-4378-a524-57fba88aec53\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.829610 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/672e4a37-26c7-4378-a524-57fba88aec53-apiservice-cert\") pod \"loki-operator-controller-manager-5688675f7c-q6w79\" (UID: \"672e4a37-26c7-4378-a524-57fba88aec53\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.830286 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/672e4a37-26c7-4378-a524-57fba88aec53-webhook-cert\") pod \"loki-operator-controller-manager-5688675f7c-q6w79\" (UID: \"672e4a37-26c7-4378-a524-57fba88aec53\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.830420 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/672e4a37-26c7-4378-a524-57fba88aec53-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-5688675f7c-q6w79\" (UID: \"672e4a37-26c7-4378-a524-57fba88aec53\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.839505 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgk9l\" (UniqueName: \"kubernetes.io/projected/672e4a37-26c7-4378-a524-57fba88aec53-kube-api-access-rgk9l\") pod \"loki-operator-controller-manager-5688675f7c-q6w79\" (UID: \"672e4a37-26c7-4378-a524-57fba88aec53\") " pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.907194 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.917617 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.917966 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.918029 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.918869 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"77511b1fe86d0b6a8fc57584220c3eed3f6f18f178f53eb56a078f9c63c86b1e"} pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 11:17:55 crc kubenswrapper[4702]: I1203 11:17:55.918927 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" containerID="cri-o://77511b1fe86d0b6a8fc57584220c3eed3f6f18f178f53eb56a078f9c63c86b1e" gracePeriod=600 Dec 03 11:17:56 crc kubenswrapper[4702]: I1203 11:17:56.012625 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-g4d4w"] Dec 03 11:17:56 crc kubenswrapper[4702]: I1203 11:17:56.014113 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-ff9846bd-g4d4w" Dec 03 11:17:56 crc kubenswrapper[4702]: I1203 11:17:56.016849 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-4k9zw" Dec 03 11:17:56 crc kubenswrapper[4702]: I1203 11:17:56.017216 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Dec 03 11:17:56 crc kubenswrapper[4702]: I1203 11:17:56.017726 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Dec 03 11:17:56 crc kubenswrapper[4702]: I1203 11:17:56.021147 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-g4d4w"] Dec 03 11:17:56 crc kubenswrapper[4702]: I1203 11:17:56.074837 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5vwrv"] Dec 03 11:17:56 crc kubenswrapper[4702]: I1203 11:17:56.122179 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bbp9\" (UniqueName: \"kubernetes.io/projected/7f8d827c-68d7-4bd7-9934-6cdcd1b7059e-kube-api-access-7bbp9\") pod \"cluster-logging-operator-ff9846bd-g4d4w\" (UID: \"7f8d827c-68d7-4bd7-9934-6cdcd1b7059e\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-g4d4w" Dec 03 11:17:56 crc kubenswrapper[4702]: I1203 11:17:56.223771 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bbp9\" (UniqueName: \"kubernetes.io/projected/7f8d827c-68d7-4bd7-9934-6cdcd1b7059e-kube-api-access-7bbp9\") pod \"cluster-logging-operator-ff9846bd-g4d4w\" (UID: \"7f8d827c-68d7-4bd7-9934-6cdcd1b7059e\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-g4d4w" Dec 03 11:17:56 crc kubenswrapper[4702]: I1203 11:17:56.423705 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79"] Dec 03 11:17:56 crc kubenswrapper[4702]: W1203 11:17:56.428820 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod672e4a37_26c7_4378_a524_57fba88aec53.slice/crio-71fe8395b895c4dcb70927ea4d2f23ff32fc6b8f3a705bfba16ef07b54c07e72 WatchSource:0}: Error finding container 71fe8395b895c4dcb70927ea4d2f23ff32fc6b8f3a705bfba16ef07b54c07e72: Status 404 returned error can't find the container with id 71fe8395b895c4dcb70927ea4d2f23ff32fc6b8f3a705bfba16ef07b54c07e72 Dec 03 11:17:56 crc kubenswrapper[4702]: I1203 11:17:56.665719 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bbp9\" (UniqueName: \"kubernetes.io/projected/7f8d827c-68d7-4bd7-9934-6cdcd1b7059e-kube-api-access-7bbp9\") pod \"cluster-logging-operator-ff9846bd-g4d4w\" (UID: \"7f8d827c-68d7-4bd7-9934-6cdcd1b7059e\") " pod="openshift-logging/cluster-logging-operator-ff9846bd-g4d4w" Dec 03 11:17:56 crc kubenswrapper[4702]: I1203 11:17:56.914512 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vwrv" event={"ID":"26d7e166-053a-4fac-8151-b337aaa2aac6","Type":"ContainerStarted","Data":"b07a8b087bd25d0d95359aedb358fc61606e80e3a7ae916819c21ca1c004b46d"} Dec 03 11:17:56 crc kubenswrapper[4702]: I1203 11:17:56.917217 4702 generic.go:334] "Generic (PLEG): container finished" podID="d2e03cb6-21dc-460c-a68e-17aafd79e258" 
containerID="77511b1fe86d0b6a8fc57584220c3eed3f6f18f178f53eb56a078f9c63c86b1e" exitCode=0 Dec 03 11:17:56 crc kubenswrapper[4702]: I1203 11:17:56.917305 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerDied","Data":"77511b1fe86d0b6a8fc57584220c3eed3f6f18f178f53eb56a078f9c63c86b1e"} Dec 03 11:17:56 crc kubenswrapper[4702]: I1203 11:17:56.917356 4702 scope.go:117] "RemoveContainer" containerID="58c8f87f05ee64e1d869e097387b492d719b507df8e2586f56cddcf068af149d" Dec 03 11:17:56 crc kubenswrapper[4702]: I1203 11:17:56.920659 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" event={"ID":"672e4a37-26c7-4378-a524-57fba88aec53","Type":"ContainerStarted","Data":"71fe8395b895c4dcb70927ea4d2f23ff32fc6b8f3a705bfba16ef07b54c07e72"} Dec 03 11:17:56 crc kubenswrapper[4702]: I1203 11:17:56.942438 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-ff9846bd-g4d4w" Dec 03 11:17:57 crc kubenswrapper[4702]: W1203 11:17:57.902168 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f8d827c_68d7_4bd7_9934_6cdcd1b7059e.slice/crio-37127e2a033a26028a61303433b401542443125fc7730e48a64a8d4c1eb26d3b WatchSource:0}: Error finding container 37127e2a033a26028a61303433b401542443125fc7730e48a64a8d4c1eb26d3b: Status 404 returned error can't find the container with id 37127e2a033a26028a61303433b401542443125fc7730e48a64a8d4c1eb26d3b Dec 03 11:17:57 crc kubenswrapper[4702]: I1203 11:17:57.903864 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-ff9846bd-g4d4w"] Dec 03 11:17:57 crc kubenswrapper[4702]: I1203 11:17:57.933818 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-ff9846bd-g4d4w" event={"ID":"7f8d827c-68d7-4bd7-9934-6cdcd1b7059e","Type":"ContainerStarted","Data":"37127e2a033a26028a61303433b401542443125fc7730e48a64a8d4c1eb26d3b"} Dec 03 11:17:57 crc kubenswrapper[4702]: I1203 11:17:57.939579 4702 generic.go:334] "Generic (PLEG): container finished" podID="26d7e166-053a-4fac-8151-b337aaa2aac6" containerID="70ea9b6a68008f11e94f935b8f1fed34c2532ed33147c308cb1d856d2238579c" exitCode=0 Dec 03 11:17:57 crc kubenswrapper[4702]: I1203 11:17:57.939727 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vwrv" event={"ID":"26d7e166-053a-4fac-8151-b337aaa2aac6","Type":"ContainerDied","Data":"70ea9b6a68008f11e94f935b8f1fed34c2532ed33147c308cb1d856d2238579c"} Dec 03 11:17:57 crc kubenswrapper[4702]: I1203 11:17:57.945744 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"1331b116dc6549b4ff553b30c2ad8688204f150ccd437e40dc657307e56d7aa3"} Dec 03 11:17:59 crc kubenswrapper[4702]: I1203 11:17:59.311031 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n78qm"] Dec 03 11:17:59 crc kubenswrapper[4702]: I1203 11:17:59.313113 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n78qm" Dec 03 11:17:59 crc kubenswrapper[4702]: I1203 11:17:59.324219 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n78qm"] Dec 03 11:17:59 crc kubenswrapper[4702]: I1203 11:17:59.390867 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8caaee6f-9680-4e4d-97f0-81626914304a-catalog-content\") pod \"redhat-marketplace-n78qm\" (UID: \"8caaee6f-9680-4e4d-97f0-81626914304a\") " pod="openshift-marketplace/redhat-marketplace-n78qm" Dec 03 11:17:59 crc kubenswrapper[4702]: I1203 11:17:59.390947 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8caaee6f-9680-4e4d-97f0-81626914304a-utilities\") pod \"redhat-marketplace-n78qm\" (UID: \"8caaee6f-9680-4e4d-97f0-81626914304a\") " pod="openshift-marketplace/redhat-marketplace-n78qm" Dec 03 11:17:59 crc kubenswrapper[4702]: I1203 11:17:59.390969 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xv46\" (UniqueName: \"kubernetes.io/projected/8caaee6f-9680-4e4d-97f0-81626914304a-kube-api-access-7xv46\") pod \"redhat-marketplace-n78qm\" (UID: \"8caaee6f-9680-4e4d-97f0-81626914304a\") " pod="openshift-marketplace/redhat-marketplace-n78qm" Dec 03 11:17:59 crc kubenswrapper[4702]: I1203 11:17:59.493087 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8caaee6f-9680-4e4d-97f0-81626914304a-catalog-content\") pod \"redhat-marketplace-n78qm\" (UID: \"8caaee6f-9680-4e4d-97f0-81626914304a\") " pod="openshift-marketplace/redhat-marketplace-n78qm" Dec 03 11:17:59 crc kubenswrapper[4702]: I1203 11:17:59.493199 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8caaee6f-9680-4e4d-97f0-81626914304a-utilities\") pod \"redhat-marketplace-n78qm\" (UID: \"8caaee6f-9680-4e4d-97f0-81626914304a\") " pod="openshift-marketplace/redhat-marketplace-n78qm" Dec 03 11:17:59 crc kubenswrapper[4702]: I1203 11:17:59.493233 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xv46\" (UniqueName: \"kubernetes.io/projected/8caaee6f-9680-4e4d-97f0-81626914304a-kube-api-access-7xv46\") pod \"redhat-marketplace-n78qm\" (UID: \"8caaee6f-9680-4e4d-97f0-81626914304a\") " pod="openshift-marketplace/redhat-marketplace-n78qm" Dec 03 11:17:59 crc kubenswrapper[4702]: I1203 11:17:59.493832 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8caaee6f-9680-4e4d-97f0-81626914304a-catalog-content\") pod \"redhat-marketplace-n78qm\" (UID: \"8caaee6f-9680-4e4d-97f0-81626914304a\") " pod="openshift-marketplace/redhat-marketplace-n78qm" Dec 03 11:17:59 crc kubenswrapper[4702]: I1203 11:17:59.494096 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8caaee6f-9680-4e4d-97f0-81626914304a-utilities\") pod \"redhat-marketplace-n78qm\" (UID: \"8caaee6f-9680-4e4d-97f0-81626914304a\") " pod="openshift-marketplace/redhat-marketplace-n78qm" Dec 03 11:17:59 crc kubenswrapper[4702]: I1203 11:17:59.530055 4702 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-7xv46\" (UniqueName: \"kubernetes.io/projected/8caaee6f-9680-4e4d-97f0-81626914304a-kube-api-access-7xv46\") pod \"redhat-marketplace-n78qm\" (UID: \"8caaee6f-9680-4e4d-97f0-81626914304a\") " pod="openshift-marketplace/redhat-marketplace-n78qm" Dec 03 11:17:59 crc kubenswrapper[4702]: I1203 11:17:59.630156 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n78qm" Dec 03 11:17:59 crc kubenswrapper[4702]: I1203 11:17:59.980885 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vwrv" event={"ID":"26d7e166-053a-4fac-8151-b337aaa2aac6","Type":"ContainerStarted","Data":"bb17e5a8ea6756f8b8f4bacdce4db9bf01a493623e42ee6ccf59a5d758940a57"} Dec 03 11:18:00 crc kubenswrapper[4702]: I1203 11:18:00.998863 4702 generic.go:334] "Generic (PLEG): container finished" podID="26d7e166-053a-4fac-8151-b337aaa2aac6" containerID="bb17e5a8ea6756f8b8f4bacdce4db9bf01a493623e42ee6ccf59a5d758940a57" exitCode=0 Dec 03 11:18:00 crc kubenswrapper[4702]: I1203 11:18:00.998974 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vwrv" event={"ID":"26d7e166-053a-4fac-8151-b337aaa2aac6","Type":"ContainerDied","Data":"bb17e5a8ea6756f8b8f4bacdce4db9bf01a493623e42ee6ccf59a5d758940a57"} Dec 03 11:18:08 crc kubenswrapper[4702]: I1203 11:18:08.119824 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-866zh"] Dec 03 11:18:08 crc kubenswrapper[4702]: I1203 11:18:08.122306 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-866zh" Dec 03 11:18:08 crc kubenswrapper[4702]: I1203 11:18:08.132704 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-866zh"] Dec 03 11:18:08 crc kubenswrapper[4702]: I1203 11:18:08.384561 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65qjz\" (UniqueName: \"kubernetes.io/projected/c3208263-39dd-4164-a165-85ef3dc127e8-kube-api-access-65qjz\") pod \"certified-operators-866zh\" (UID: \"c3208263-39dd-4164-a165-85ef3dc127e8\") " pod="openshift-marketplace/certified-operators-866zh" Dec 03 11:18:08 crc kubenswrapper[4702]: I1203 11:18:08.384662 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3208263-39dd-4164-a165-85ef3dc127e8-catalog-content\") pod \"certified-operators-866zh\" (UID: \"c3208263-39dd-4164-a165-85ef3dc127e8\") " pod="openshift-marketplace/certified-operators-866zh" Dec 03 11:18:08 crc kubenswrapper[4702]: I1203 11:18:08.384738 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3208263-39dd-4164-a165-85ef3dc127e8-utilities\") pod \"certified-operators-866zh\" (UID: \"c3208263-39dd-4164-a165-85ef3dc127e8\") " pod="openshift-marketplace/certified-operators-866zh" Dec 03 11:18:08 crc kubenswrapper[4702]: I1203 11:18:08.486532 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65qjz\" (UniqueName: \"kubernetes.io/projected/c3208263-39dd-4164-a165-85ef3dc127e8-kube-api-access-65qjz\") pod \"certified-operators-866zh\" (UID: \"c3208263-39dd-4164-a165-85ef3dc127e8\") " 
pod="openshift-marketplace/certified-operators-866zh" Dec 03 11:18:08 crc kubenswrapper[4702]: I1203 11:18:08.486973 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3208263-39dd-4164-a165-85ef3dc127e8-catalog-content\") pod \"certified-operators-866zh\" (UID: \"c3208263-39dd-4164-a165-85ef3dc127e8\") " pod="openshift-marketplace/certified-operators-866zh" Dec 03 11:18:08 crc kubenswrapper[4702]: I1203 11:18:08.487033 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3208263-39dd-4164-a165-85ef3dc127e8-utilities\") pod \"certified-operators-866zh\" (UID: \"c3208263-39dd-4164-a165-85ef3dc127e8\") " pod="openshift-marketplace/certified-operators-866zh" Dec 03 11:18:08 crc kubenswrapper[4702]: I1203 11:18:08.487671 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3208263-39dd-4164-a165-85ef3dc127e8-utilities\") pod \"certified-operators-866zh\" (UID: \"c3208263-39dd-4164-a165-85ef3dc127e8\") " pod="openshift-marketplace/certified-operators-866zh" Dec 03 11:18:08 crc kubenswrapper[4702]: I1203 11:18:08.487722 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3208263-39dd-4164-a165-85ef3dc127e8-catalog-content\") pod \"certified-operators-866zh\" (UID: \"c3208263-39dd-4164-a165-85ef3dc127e8\") " pod="openshift-marketplace/certified-operators-866zh" Dec 03 11:18:08 crc kubenswrapper[4702]: I1203 11:18:08.529059 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65qjz\" (UniqueName: \"kubernetes.io/projected/c3208263-39dd-4164-a165-85ef3dc127e8-kube-api-access-65qjz\") pod \"certified-operators-866zh\" (UID: \"c3208263-39dd-4164-a165-85ef3dc127e8\") " pod="openshift-marketplace/certified-operators-866zh" Dec 03 11:18:08 crc kubenswrapper[4702]: I1203 11:18:08.805240 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-866zh" Dec 03 11:18:13 crc kubenswrapper[4702]: I1203 11:18:13.065530 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-866zh"] Dec 03 11:18:13 crc kubenswrapper[4702]: W1203 11:18:13.078052 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3208263_39dd_4164_a165_85ef3dc127e8.slice/crio-c8caacbf9276a150664dc059aaed75072e6bfd158b2c7cd3b51022c7c170fbe0 WatchSource:0}: Error finding container c8caacbf9276a150664dc059aaed75072e6bfd158b2c7cd3b51022c7c170fbe0: Status 404 returned error can't find the container with id c8caacbf9276a150664dc059aaed75072e6bfd158b2c7cd3b51022c7c170fbe0 Dec 03 11:18:13 crc kubenswrapper[4702]: E1203 11:18:13.087615 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift-logging/loki-rhel9-operator@sha256:ce1c37005ef2613c22343b3841f495597103f0831839d017f8a9dad8a00275c6" Dec 03 11:18:13 crc kubenswrapper[4702]: E1203 11:18:13.088148 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:registry.redhat.io/openshift-logging/loki-rhel9-operator@sha256:ce1c37005ef2613c22343b3841f495597103f0831839d017f8a9dad8a00275c6,Command:[/manager],Args:[--config=controller_manager_config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},ContainerPort{Name:metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELATED_IMAGE_LOKI,Value:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:1bd60df77d8be8eae3551f68a3a55a464610be839b0c0556600c7f1a36887919,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GATEWAY,Value:registry.redhat.io/openshift-logging/lokistack-gateway-rhel9@sha256:ad8b5cee31875481991739c882fd765f01d0391184de7e2e19e8394efda5a4d6,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPA,Value:registry.redhat.io/openshift-logging/opa-openshift-rhel9@sha256:62c2e9714d7de73861ca4a607a9509ddc9771399569015a5ad8f8b42639ebaae,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:loki-operator.v6.2.6,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:manager-config,ReadOnly:false,MountPath:/controller_manager_config.yaml,SubPath:controller_manager_config.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rgk9l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000700000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod loki-operator-controller-manager-5688675f7c-q6w79_openshift-operators-redhat(672e4a37-26c7-4378-a524-57fba88aec53): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 11:18:13 crc kubenswrapper[4702]: I1203 11:18:13.100939 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-866zh" event={"ID":"c3208263-39dd-4164-a165-85ef3dc127e8","Type":"ContainerStarted","Data":"c8caacbf9276a150664dc059aaed75072e6bfd158b2c7cd3b51022c7c170fbe0"} Dec 03 11:18:13 crc kubenswrapper[4702]: I1203 11:18:13.223258 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n78qm"] Dec 03 11:18:14 crc kubenswrapper[4702]: I1203 11:18:14.119386 4702 generic.go:334] "Generic (PLEG): container finished" podID="c3208263-39dd-4164-a165-85ef3dc127e8" containerID="75246fb18cf38804902d15e6a6f4e3d601b80ffd06f40d2aae39d5213feb0be0" exitCode=0 Dec 03 11:18:14 crc kubenswrapper[4702]: I1203 11:18:14.119514 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-866zh" event={"ID":"c3208263-39dd-4164-a165-85ef3dc127e8","Type":"ContainerDied","Data":"75246fb18cf38804902d15e6a6f4e3d601b80ffd06f40d2aae39d5213feb0be0"} Dec 03 11:18:16 crc kubenswrapper[4702]: I1203 11:18:16.141012 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-ff9846bd-g4d4w" event={"ID":"7f8d827c-68d7-4bd7-9934-6cdcd1b7059e","Type":"ContainerStarted","Data":"3c61181099bd53a73ceb88b27e862f5d16a527a11000af1dc01d9292631bd587"} Dec 03 11:18:16 crc kubenswrapper[4702]: I1203 11:18:16.144804 4702 generic.go:334] "Generic (PLEG): container finished" podID="8caaee6f-9680-4e4d-97f0-81626914304a" containerID="3c1e54478165fa0a86f9f44f46e5bdb514614f7237341fcbe2315bc8b77ec878" exitCode=0 Dec 03 11:18:16 crc kubenswrapper[4702]: I1203 11:18:16.144891 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n78qm" event={"ID":"8caaee6f-9680-4e4d-97f0-81626914304a","Type":"ContainerDied","Data":"3c1e54478165fa0a86f9f44f46e5bdb514614f7237341fcbe2315bc8b77ec878"} Dec 03 11:18:16 crc kubenswrapper[4702]: I1203 11:18:16.144937 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n78qm" 
event={"ID":"8caaee6f-9680-4e4d-97f0-81626914304a","Type":"ContainerStarted","Data":"f6a402e1a74f56d270caff9e0dd092f80c36fbc87b2b1117ca6ca8d648fdd638"} Dec 03 11:18:16 crc kubenswrapper[4702]: I1203 11:18:16.147835 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vwrv" event={"ID":"26d7e166-053a-4fac-8151-b337aaa2aac6","Type":"ContainerStarted","Data":"8c7f7ca512f9a07051fe72499247b9c33dbbb3a6dfb19d4fc9653afbdfdcee86"} Dec 03 11:18:16 crc kubenswrapper[4702]: I1203 11:18:16.208501 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5vwrv" podStartSLOduration=3.359465776 podStartE2EDuration="21.208479187s" podCreationTimestamp="2025-12-03 11:17:55 +0000 UTC" firstStartedPulling="2025-12-03 11:17:57.94295269 +0000 UTC m=+861.778881154" lastFinishedPulling="2025-12-03 11:18:15.791966101 +0000 UTC m=+879.627894565" observedRunningTime="2025-12-03 11:18:16.203244968 +0000 UTC m=+880.039173432" watchObservedRunningTime="2025-12-03 11:18:16.208479187 +0000 UTC m=+880.044407651" Dec 03 11:18:16 crc kubenswrapper[4702]: I1203 11:18:16.208659 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-ff9846bd-g4d4w" podStartSLOduration=3.267536583 podStartE2EDuration="21.208654082s" podCreationTimestamp="2025-12-03 11:17:55 +0000 UTC" firstStartedPulling="2025-12-03 11:17:57.90729419 +0000 UTC m=+861.743222654" lastFinishedPulling="2025-12-03 11:18:15.848411689 +0000 UTC m=+879.684340153" observedRunningTime="2025-12-03 11:18:16.173896037 +0000 UTC m=+880.009824501" watchObservedRunningTime="2025-12-03 11:18:16.208654082 +0000 UTC m=+880.044582546" Dec 03 11:18:17 crc kubenswrapper[4702]: I1203 11:18:17.158154 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-866zh" event={"ID":"c3208263-39dd-4164-a165-85ef3dc127e8","Type":"ContainerStarted","Data":"b614270270d940d6d91b375c88647742ee6dd758ae71edae4f839f27f0f0e5c1"} Dec 03 11:18:18 crc kubenswrapper[4702]: I1203 11:18:18.207420 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n78qm" event={"ID":"8caaee6f-9680-4e4d-97f0-81626914304a","Type":"ContainerStarted","Data":"85d645e827c2a9c4f231312eae6a247f75bcb3b1da1d42ec32923877858d1e71"} Dec 03 11:18:18 crc kubenswrapper[4702]: I1203 11:18:18.395712 4702 generic.go:334] "Generic (PLEG): container finished" podID="c3208263-39dd-4164-a165-85ef3dc127e8" containerID="b614270270d940d6d91b375c88647742ee6dd758ae71edae4f839f27f0f0e5c1" exitCode=0 Dec 03 11:18:18 crc kubenswrapper[4702]: I1203 11:18:18.395801 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-866zh" event={"ID":"c3208263-39dd-4164-a165-85ef3dc127e8","Type":"ContainerDied","Data":"b614270270d940d6d91b375c88647742ee6dd758ae71edae4f839f27f0f0e5c1"} Dec 03 11:18:19 crc kubenswrapper[4702]: I1203 11:18:19.404588 4702 generic.go:334] "Generic (PLEG): container finished" podID="8caaee6f-9680-4e4d-97f0-81626914304a" containerID="85d645e827c2a9c4f231312eae6a247f75bcb3b1da1d42ec32923877858d1e71" exitCode=0 Dec 03 11:18:19 crc kubenswrapper[4702]: I1203 11:18:19.404630 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n78qm" event={"ID":"8caaee6f-9680-4e4d-97f0-81626914304a","Type":"ContainerDied","Data":"85d645e827c2a9c4f231312eae6a247f75bcb3b1da1d42ec32923877858d1e71"} Dec 03 
11:18:25 crc kubenswrapper[4702]: I1203 11:18:25.683869 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5vwrv" Dec 03 11:18:25 crc kubenswrapper[4702]: I1203 11:18:25.684553 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5vwrv" Dec 03 11:18:25 crc kubenswrapper[4702]: I1203 11:18:25.743206 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5vwrv" Dec 03 11:18:26 crc kubenswrapper[4702]: I1203 11:18:26.584701 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5vwrv" Dec 03 11:18:26 crc kubenswrapper[4702]: I1203 11:18:26.629504 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5vwrv"] Dec 03 11:18:27 crc kubenswrapper[4702]: E1203 11:18:27.536020 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" podUID="672e4a37-26c7-4378-a524-57fba88aec53" Dec 03 11:18:27 crc kubenswrapper[4702]: I1203 11:18:27.545001 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n78qm" event={"ID":"8caaee6f-9680-4e4d-97f0-81626914304a","Type":"ContainerStarted","Data":"e28fe5af6cac8f6fb8442adf892fe5418dcecd9417b80e4c5adccda907f37139"} Dec 03 11:18:27 crc kubenswrapper[4702]: I1203 11:18:27.551159 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-866zh" event={"ID":"c3208263-39dd-4164-a165-85ef3dc127e8","Type":"ContainerStarted","Data":"98187aee9376936ee766abc77e26a736390e9ffa182c85a1dbb3d1978a84736f"} Dec 03 11:18:27 crc kubenswrapper[4702]: I1203 11:18:27.553198 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" event={"ID":"672e4a37-26c7-4378-a524-57fba88aec53","Type":"ContainerStarted","Data":"9236041b67677935c56359ef05fcfbde6d63ec5689a6ef83b0f63a3b07e8346a"} Dec 03 11:18:27 crc kubenswrapper[4702]: I1203 11:18:27.581960 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n78qm" podStartSLOduration=17.478603145 podStartE2EDuration="28.570060027s" podCreationTimestamp="2025-12-03 11:17:59 +0000 UTC" firstStartedPulling="2025-12-03 11:18:16.165004305 +0000 UTC m=+880.000932769" lastFinishedPulling="2025-12-03 11:18:27.256461187 +0000 UTC m=+891.092389651" observedRunningTime="2025-12-03 11:18:27.565978342 +0000 UTC m=+891.401906816" watchObservedRunningTime="2025-12-03 11:18:27.570060027 +0000 UTC m=+891.405988491" Dec 03 11:18:27 crc kubenswrapper[4702]: I1203 11:18:27.602179 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-866zh" podStartSLOduration=7.870793302 podStartE2EDuration="19.602149426s" podCreationTimestamp="2025-12-03 11:18:08 +0000 UTC" firstStartedPulling="2025-12-03 11:18:15.526339038 +0000 UTC m=+879.362267502" lastFinishedPulling="2025-12-03 11:18:27.257695162 +0000 UTC m=+891.093623626" observedRunningTime="2025-12-03 11:18:27.586273096 +0000 UTC m=+891.422201580" 
watchObservedRunningTime="2025-12-03 11:18:27.602149426 +0000 UTC m=+891.438077910" Dec 03 11:18:28 crc kubenswrapper[4702]: I1203 11:18:28.564410 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" event={"ID":"672e4a37-26c7-4378-a524-57fba88aec53","Type":"ContainerStarted","Data":"32a69511f4925e8eb4a71c4bc1a97060fed7d4e0e1a027fbb60f458f02518a79"} Dec 03 11:18:28 crc kubenswrapper[4702]: I1203 11:18:28.565029 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5vwrv" podUID="26d7e166-053a-4fac-8151-b337aaa2aac6" containerName="registry-server" containerID="cri-o://8c7f7ca512f9a07051fe72499247b9c33dbbb3a6dfb19d4fc9653afbdfdcee86" gracePeriod=2 Dec 03 11:18:28 crc kubenswrapper[4702]: I1203 11:18:28.565794 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" Dec 03 11:18:28 crc kubenswrapper[4702]: I1203 11:18:28.598433 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" podStartSLOduration=1.808916313 podStartE2EDuration="33.598404231s" podCreationTimestamp="2025-12-03 11:17:55 +0000 UTC" firstStartedPulling="2025-12-03 11:17:56.43348094 +0000 UTC m=+860.269409404" lastFinishedPulling="2025-12-03 11:18:28.222968868 +0000 UTC m=+892.058897322" observedRunningTime="2025-12-03 11:18:28.592289648 +0000 UTC m=+892.428218112" watchObservedRunningTime="2025-12-03 11:18:28.598404231 +0000 UTC m=+892.434332695" Dec 03 11:18:28 crc kubenswrapper[4702]: I1203 11:18:28.807374 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-866zh" Dec 03 11:18:28 crc kubenswrapper[4702]: I1203 11:18:28.807444 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-866zh" Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.109260 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5vwrv" Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.116780 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d7e166-053a-4fac-8151-b337aaa2aac6-utilities\") pod \"26d7e166-053a-4fac-8151-b337aaa2aac6\" (UID: \"26d7e166-053a-4fac-8151-b337aaa2aac6\") " Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.116912 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swjl6\" (UniqueName: \"kubernetes.io/projected/26d7e166-053a-4fac-8151-b337aaa2aac6-kube-api-access-swjl6\") pod \"26d7e166-053a-4fac-8151-b337aaa2aac6\" (UID: \"26d7e166-053a-4fac-8151-b337aaa2aac6\") " Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.116942 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d7e166-053a-4fac-8151-b337aaa2aac6-catalog-content\") pod \"26d7e166-053a-4fac-8151-b337aaa2aac6\" (UID: \"26d7e166-053a-4fac-8151-b337aaa2aac6\") " Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.123998 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d7e166-053a-4fac-8151-b337aaa2aac6-kube-api-access-swjl6" (OuterVolumeSpecName: "kube-api-access-swjl6") pod "26d7e166-053a-4fac-8151-b337aaa2aac6" (UID: "26d7e166-053a-4fac-8151-b337aaa2aac6"). InnerVolumeSpecName "kube-api-access-swjl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.128486 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26d7e166-053a-4fac-8151-b337aaa2aac6-utilities" (OuterVolumeSpecName: "utilities") pod "26d7e166-053a-4fac-8151-b337aaa2aac6" (UID: "26d7e166-053a-4fac-8151-b337aaa2aac6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.176114 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26d7e166-053a-4fac-8151-b337aaa2aac6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26d7e166-053a-4fac-8151-b337aaa2aac6" (UID: "26d7e166-053a-4fac-8151-b337aaa2aac6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.218941 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swjl6\" (UniqueName: \"kubernetes.io/projected/26d7e166-053a-4fac-8151-b337aaa2aac6-kube-api-access-swjl6\") on node \"crc\" DevicePath \"\"" Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.218995 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26d7e166-053a-4fac-8151-b337aaa2aac6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.219005 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26d7e166-053a-4fac-8151-b337aaa2aac6-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.572025 4702 generic.go:334] "Generic (PLEG): container finished" podID="26d7e166-053a-4fac-8151-b337aaa2aac6" containerID="8c7f7ca512f9a07051fe72499247b9c33dbbb3a6dfb19d4fc9653afbdfdcee86" exitCode=0 Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.572099 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5vwrv" Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.572085 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vwrv" event={"ID":"26d7e166-053a-4fac-8151-b337aaa2aac6","Type":"ContainerDied","Data":"8c7f7ca512f9a07051fe72499247b9c33dbbb3a6dfb19d4fc9653afbdfdcee86"} Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.572160 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5vwrv" event={"ID":"26d7e166-053a-4fac-8151-b337aaa2aac6","Type":"ContainerDied","Data":"b07a8b087bd25d0d95359aedb358fc61606e80e3a7ae916819c21ca1c004b46d"} Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.572182 4702 scope.go:117] "RemoveContainer" containerID="8c7f7ca512f9a07051fe72499247b9c33dbbb3a6dfb19d4fc9653afbdfdcee86" Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.599299 4702 scope.go:117] "RemoveContainer" containerID="bb17e5a8ea6756f8b8f4bacdce4db9bf01a493623e42ee6ccf59a5d758940a57" Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.606782 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5vwrv"] Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.611227 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5vwrv"] Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.631237 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n78qm" Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.631626 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n78qm" Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.641729 4702 scope.go:117] "RemoveContainer" containerID="70ea9b6a68008f11e94f935b8f1fed34c2532ed33147c308cb1d856d2238579c" Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.659163 4702 scope.go:117] "RemoveContainer" containerID="8c7f7ca512f9a07051fe72499247b9c33dbbb3a6dfb19d4fc9653afbdfdcee86" Dec 03 11:18:29 crc kubenswrapper[4702]: E1203 11:18:29.660220 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"8c7f7ca512f9a07051fe72499247b9c33dbbb3a6dfb19d4fc9653afbdfdcee86\": container with ID starting with 8c7f7ca512f9a07051fe72499247b9c33dbbb3a6dfb19d4fc9653afbdfdcee86 not found: ID does not exist" containerID="8c7f7ca512f9a07051fe72499247b9c33dbbb3a6dfb19d4fc9653afbdfdcee86" Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.660267 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c7f7ca512f9a07051fe72499247b9c33dbbb3a6dfb19d4fc9653afbdfdcee86"} err="failed to get container status \"8c7f7ca512f9a07051fe72499247b9c33dbbb3a6dfb19d4fc9653afbdfdcee86\": rpc error: code = NotFound desc = could not find container \"8c7f7ca512f9a07051fe72499247b9c33dbbb3a6dfb19d4fc9653afbdfdcee86\": container with ID starting with 8c7f7ca512f9a07051fe72499247b9c33dbbb3a6dfb19d4fc9653afbdfdcee86 not found: ID does not exist" Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.660298 4702 scope.go:117] "RemoveContainer" containerID="bb17e5a8ea6756f8b8f4bacdce4db9bf01a493623e42ee6ccf59a5d758940a57" Dec 03 11:18:29 crc kubenswrapper[4702]: E1203 11:18:29.660970 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb17e5a8ea6756f8b8f4bacdce4db9bf01a493623e42ee6ccf59a5d758940a57\": container with ID starting with bb17e5a8ea6756f8b8f4bacdce4db9bf01a493623e42ee6ccf59a5d758940a57 not found: ID does not exist" containerID="bb17e5a8ea6756f8b8f4bacdce4db9bf01a493623e42ee6ccf59a5d758940a57" Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.661032 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb17e5a8ea6756f8b8f4bacdce4db9bf01a493623e42ee6ccf59a5d758940a57"} err="failed to get container status \"bb17e5a8ea6756f8b8f4bacdce4db9bf01a493623e42ee6ccf59a5d758940a57\": rpc error: code = NotFound desc = could not find container \"bb17e5a8ea6756f8b8f4bacdce4db9bf01a493623e42ee6ccf59a5d758940a57\": container with ID starting with bb17e5a8ea6756f8b8f4bacdce4db9bf01a493623e42ee6ccf59a5d758940a57 not found: ID does not exist" Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.661105 4702 scope.go:117] "RemoveContainer" containerID="70ea9b6a68008f11e94f935b8f1fed34c2532ed33147c308cb1d856d2238579c" Dec 03 11:18:29 crc kubenswrapper[4702]: E1203 11:18:29.661536 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70ea9b6a68008f11e94f935b8f1fed34c2532ed33147c308cb1d856d2238579c\": container with ID starting with 70ea9b6a68008f11e94f935b8f1fed34c2532ed33147c308cb1d856d2238579c not found: ID does not exist" containerID="70ea9b6a68008f11e94f935b8f1fed34c2532ed33147c308cb1d856d2238579c" Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.661569 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70ea9b6a68008f11e94f935b8f1fed34c2532ed33147c308cb1d856d2238579c"} err="failed to get container status \"70ea9b6a68008f11e94f935b8f1fed34c2532ed33147c308cb1d856d2238579c\": rpc error: code = NotFound desc = could not find container \"70ea9b6a68008f11e94f935b8f1fed34c2532ed33147c308cb1d856d2238579c\": container with ID starting with 70ea9b6a68008f11e94f935b8f1fed34c2532ed33147c308cb1d856d2238579c not found: ID does not exist" Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.692151 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-n78qm" Dec 03 11:18:29 crc kubenswrapper[4702]: I1203 11:18:29.857606 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-866zh" podUID="c3208263-39dd-4164-a165-85ef3dc127e8" containerName="registry-server" probeResult="failure" output=< Dec 03 11:18:29 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 11:18:29 crc kubenswrapper[4702]: > Dec 03 11:18:30 crc kubenswrapper[4702]: I1203 11:18:30.941304 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26d7e166-053a-4fac-8151-b337aaa2aac6" path="/var/lib/kubelet/pods/26d7e166-053a-4fac-8151-b337aaa2aac6/volumes" Dec 03 11:18:35 crc kubenswrapper[4702]: I1203 11:18:35.910679 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" Dec 03 11:18:38 crc kubenswrapper[4702]: I1203 11:18:38.866638 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-866zh" Dec 03 11:18:38 crc kubenswrapper[4702]: I1203 11:18:38.916322 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-866zh" Dec 03 11:18:39 crc kubenswrapper[4702]: I1203 11:18:39.318403 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-866zh"] Dec 03 11:18:39 crc kubenswrapper[4702]: I1203 11:18:39.692271 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n78qm" Dec 03 11:18:40 crc kubenswrapper[4702]: I1203 11:18:40.653288 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-866zh" podUID="c3208263-39dd-4164-a165-85ef3dc127e8" containerName="registry-server" containerID="cri-o://98187aee9376936ee766abc77e26a736390e9ffa182c85a1dbb3d1978a84736f" gracePeriod=2 Dec 03 11:18:40 crc kubenswrapper[4702]: I1203 11:18:40.902984 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Dec 03 11:18:40 crc kubenswrapper[4702]: E1203 11:18:40.903456 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d7e166-053a-4fac-8151-b337aaa2aac6" containerName="registry-server" Dec 03 11:18:40 crc kubenswrapper[4702]: I1203 11:18:40.903481 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d7e166-053a-4fac-8151-b337aaa2aac6" containerName="registry-server" Dec 03 11:18:40 crc kubenswrapper[4702]: E1203 11:18:40.903511 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d7e166-053a-4fac-8151-b337aaa2aac6" containerName="extract-content" Dec 03 11:18:40 crc kubenswrapper[4702]: I1203 11:18:40.903520 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d7e166-053a-4fac-8151-b337aaa2aac6" containerName="extract-content" Dec 03 11:18:40 crc kubenswrapper[4702]: E1203 11:18:40.903533 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26d7e166-053a-4fac-8151-b337aaa2aac6" containerName="extract-utilities" Dec 03 11:18:40 crc kubenswrapper[4702]: I1203 11:18:40.903542 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d7e166-053a-4fac-8151-b337aaa2aac6" containerName="extract-utilities" Dec 03 11:18:40 crc kubenswrapper[4702]: I1203 11:18:40.903719 4702 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="26d7e166-053a-4fac-8151-b337aaa2aac6" containerName="registry-server" Dec 03 11:18:40 crc kubenswrapper[4702]: I1203 11:18:40.904399 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Dec 03 11:18:40 crc kubenswrapper[4702]: I1203 11:18:40.907053 4702 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-j6lbn" Dec 03 11:18:40 crc kubenswrapper[4702]: I1203 11:18:40.907743 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Dec 03 11:18:40 crc kubenswrapper[4702]: I1203 11:18:40.908825 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Dec 03 11:18:40 crc kubenswrapper[4702]: I1203 11:18:40.910726 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 03 11:18:40 crc kubenswrapper[4702]: I1203 11:18:40.982213 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-811bff7e-9d2c-4631-a8e7-3adcba9d2f85\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-811bff7e-9d2c-4631-a8e7-3adcba9d2f85\") pod \"minio\" (UID: \"e16af177-16b0-40f2-a044-28684d0a15e8\") " pod="minio-dev/minio" Dec 03 11:18:40 crc kubenswrapper[4702]: I1203 11:18:40.982288 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvn95\" (UniqueName: \"kubernetes.io/projected/e16af177-16b0-40f2-a044-28684d0a15e8-kube-api-access-qvn95\") pod \"minio\" (UID: \"e16af177-16b0-40f2-a044-28684d0a15e8\") " pod="minio-dev/minio" Dec 03 11:18:41 crc kubenswrapper[4702]: I1203 11:18:41.083639 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-811bff7e-9d2c-4631-a8e7-3adcba9d2f85\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-811bff7e-9d2c-4631-a8e7-3adcba9d2f85\") pod \"minio\" (UID: \"e16af177-16b0-40f2-a044-28684d0a15e8\") " pod="minio-dev/minio" Dec 03 11:18:41 crc kubenswrapper[4702]: I1203 11:18:41.083687 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvn95\" (UniqueName: \"kubernetes.io/projected/e16af177-16b0-40f2-a044-28684d0a15e8-kube-api-access-qvn95\") pod \"minio\" (UID: \"e16af177-16b0-40f2-a044-28684d0a15e8\") " pod="minio-dev/minio" Dec 03 11:18:41 crc kubenswrapper[4702]: I1203 11:18:41.087966 4702 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 11:18:41 crc kubenswrapper[4702]: I1203 11:18:41.088013 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-811bff7e-9d2c-4631-a8e7-3adcba9d2f85\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-811bff7e-9d2c-4631-a8e7-3adcba9d2f85\") pod \"minio\" (UID: \"e16af177-16b0-40f2-a044-28684d0a15e8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/d8e7e13f6ea32b62f374695a6d153a10cc55fb8c804d07f9bcefdff79d4567e9/globalmount\"" pod="minio-dev/minio" Dec 03 11:18:41 crc kubenswrapper[4702]: I1203 11:18:41.105850 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvn95\" (UniqueName: \"kubernetes.io/projected/e16af177-16b0-40f2-a044-28684d0a15e8-kube-api-access-qvn95\") pod \"minio\" (UID: \"e16af177-16b0-40f2-a044-28684d0a15e8\") " pod="minio-dev/minio" Dec 03 11:18:41 crc kubenswrapper[4702]: I1203 11:18:41.118356 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-811bff7e-9d2c-4631-a8e7-3adcba9d2f85\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-811bff7e-9d2c-4631-a8e7-3adcba9d2f85\") pod \"minio\" (UID: \"e16af177-16b0-40f2-a044-28684d0a15e8\") " pod="minio-dev/minio" Dec 03 11:18:41 crc kubenswrapper[4702]: I1203 11:18:41.224919 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Dec 03 11:18:41 crc kubenswrapper[4702]: I1203 11:18:41.665070 4702 generic.go:334] "Generic (PLEG): container finished" podID="c3208263-39dd-4164-a165-85ef3dc127e8" containerID="98187aee9376936ee766abc77e26a736390e9ffa182c85a1dbb3d1978a84736f" exitCode=0 Dec 03 11:18:41 crc kubenswrapper[4702]: I1203 11:18:41.665161 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-866zh" event={"ID":"c3208263-39dd-4164-a165-85ef3dc127e8","Type":"ContainerDied","Data":"98187aee9376936ee766abc77e26a736390e9ffa182c85a1dbb3d1978a84736f"} Dec 03 11:18:41 crc kubenswrapper[4702]: I1203 11:18:41.722405 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n78qm"] Dec 03 11:18:41 crc kubenswrapper[4702]: I1203 11:18:41.722775 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n78qm" podUID="8caaee6f-9680-4e4d-97f0-81626914304a" containerName="registry-server" containerID="cri-o://e28fe5af6cac8f6fb8442adf892fe5418dcecd9417b80e4c5adccda907f37139" gracePeriod=2 Dec 03 11:18:41 crc kubenswrapper[4702]: I1203 11:18:41.734422 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Dec 03 11:18:41 crc kubenswrapper[4702]: W1203 11:18:41.735631 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode16af177_16b0_40f2_a044_28684d0a15e8.slice/crio-71e6b5e66d35db91d02e07ddc760f0ab4b79893d2bd7a7a0e11b6263f452baaf WatchSource:0}: Error finding container 71e6b5e66d35db91d02e07ddc760f0ab4b79893d2bd7a7a0e11b6263f452baaf: Status 404 returned error can't find the container with id 71e6b5e66d35db91d02e07ddc760f0ab4b79893d2bd7a7a0e11b6263f452baaf Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.220227 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n78qm" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.307970 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xv46\" (UniqueName: \"kubernetes.io/projected/8caaee6f-9680-4e4d-97f0-81626914304a-kube-api-access-7xv46\") pod \"8caaee6f-9680-4e4d-97f0-81626914304a\" (UID: \"8caaee6f-9680-4e4d-97f0-81626914304a\") " Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.308080 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8caaee6f-9680-4e4d-97f0-81626914304a-catalog-content\") pod \"8caaee6f-9680-4e4d-97f0-81626914304a\" (UID: \"8caaee6f-9680-4e4d-97f0-81626914304a\") " Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.308137 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8caaee6f-9680-4e4d-97f0-81626914304a-utilities\") pod \"8caaee6f-9680-4e4d-97f0-81626914304a\" (UID: \"8caaee6f-9680-4e4d-97f0-81626914304a\") " Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.310009 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8caaee6f-9680-4e4d-97f0-81626914304a-utilities" (OuterVolumeSpecName: "utilities") pod "8caaee6f-9680-4e4d-97f0-81626914304a" (UID: "8caaee6f-9680-4e4d-97f0-81626914304a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.312665 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-866zh" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.315830 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8caaee6f-9680-4e4d-97f0-81626914304a-kube-api-access-7xv46" (OuterVolumeSpecName: "kube-api-access-7xv46") pod "8caaee6f-9680-4e4d-97f0-81626914304a" (UID: "8caaee6f-9680-4e4d-97f0-81626914304a"). InnerVolumeSpecName "kube-api-access-7xv46". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.333933 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8caaee6f-9680-4e4d-97f0-81626914304a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8caaee6f-9680-4e4d-97f0-81626914304a" (UID: "8caaee6f-9680-4e4d-97f0-81626914304a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.409433 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3208263-39dd-4164-a165-85ef3dc127e8-catalog-content\") pod \"c3208263-39dd-4164-a165-85ef3dc127e8\" (UID: \"c3208263-39dd-4164-a165-85ef3dc127e8\") " Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.409584 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3208263-39dd-4164-a165-85ef3dc127e8-utilities\") pod \"c3208263-39dd-4164-a165-85ef3dc127e8\" (UID: \"c3208263-39dd-4164-a165-85ef3dc127e8\") " Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.409681 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65qjz\" (UniqueName: \"kubernetes.io/projected/c3208263-39dd-4164-a165-85ef3dc127e8-kube-api-access-65qjz\") pod \"c3208263-39dd-4164-a165-85ef3dc127e8\" (UID: \"c3208263-39dd-4164-a165-85ef3dc127e8\") " Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.410106 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xv46\" (UniqueName: \"kubernetes.io/projected/8caaee6f-9680-4e4d-97f0-81626914304a-kube-api-access-7xv46\") on node \"crc\" DevicePath \"\"" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.410123 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8caaee6f-9680-4e4d-97f0-81626914304a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.410137 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8caaee6f-9680-4e4d-97f0-81626914304a-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.417647 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3208263-39dd-4164-a165-85ef3dc127e8-utilities" (OuterVolumeSpecName: "utilities") pod "c3208263-39dd-4164-a165-85ef3dc127e8" (UID: "c3208263-39dd-4164-a165-85ef3dc127e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.425547 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3208263-39dd-4164-a165-85ef3dc127e8-kube-api-access-65qjz" (OuterVolumeSpecName: "kube-api-access-65qjz") pod "c3208263-39dd-4164-a165-85ef3dc127e8" (UID: "c3208263-39dd-4164-a165-85ef3dc127e8"). InnerVolumeSpecName "kube-api-access-65qjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.512514 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3208263-39dd-4164-a165-85ef3dc127e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3208263-39dd-4164-a165-85ef3dc127e8" (UID: "c3208263-39dd-4164-a165-85ef3dc127e8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.513554 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3208263-39dd-4164-a165-85ef3dc127e8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.513576 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3208263-39dd-4164-a165-85ef3dc127e8-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.513587 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65qjz\" (UniqueName: \"kubernetes.io/projected/c3208263-39dd-4164-a165-85ef3dc127e8-kube-api-access-65qjz\") on node \"crc\" DevicePath \"\"" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.678267 4702 generic.go:334] "Generic (PLEG): container finished" podID="8caaee6f-9680-4e4d-97f0-81626914304a" containerID="e28fe5af6cac8f6fb8442adf892fe5418dcecd9417b80e4c5adccda907f37139" exitCode=0 Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.678358 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n78qm" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.678350 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n78qm" event={"ID":"8caaee6f-9680-4e4d-97f0-81626914304a","Type":"ContainerDied","Data":"e28fe5af6cac8f6fb8442adf892fe5418dcecd9417b80e4c5adccda907f37139"} Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.679075 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n78qm" event={"ID":"8caaee6f-9680-4e4d-97f0-81626914304a","Type":"ContainerDied","Data":"f6a402e1a74f56d270caff9e0dd092f80c36fbc87b2b1117ca6ca8d648fdd638"} Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.679097 4702 scope.go:117] "RemoveContainer" containerID="e28fe5af6cac8f6fb8442adf892fe5418dcecd9417b80e4c5adccda907f37139" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.689962 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-866zh" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.690003 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-866zh" event={"ID":"c3208263-39dd-4164-a165-85ef3dc127e8","Type":"ContainerDied","Data":"c8caacbf9276a150664dc059aaed75072e6bfd158b2c7cd3b51022c7c170fbe0"} Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.692428 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"e16af177-16b0-40f2-a044-28684d0a15e8","Type":"ContainerStarted","Data":"71e6b5e66d35db91d02e07ddc760f0ab4b79893d2bd7a7a0e11b6263f452baaf"} Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.732362 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n78qm"] Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.738667 4702 scope.go:117] "RemoveContainer" containerID="85d645e827c2a9c4f231312eae6a247f75bcb3b1da1d42ec32923877858d1e71" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.741371 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n78qm"] Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.751079 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-866zh"] Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.756867 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-866zh"] Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.768158 4702 scope.go:117] "RemoveContainer" containerID="3c1e54478165fa0a86f9f44f46e5bdb514614f7237341fcbe2315bc8b77ec878" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.794490 4702 scope.go:117] "RemoveContainer" containerID="e28fe5af6cac8f6fb8442adf892fe5418dcecd9417b80e4c5adccda907f37139" Dec 03 11:18:42 crc kubenswrapper[4702]: E1203 11:18:42.799128 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e28fe5af6cac8f6fb8442adf892fe5418dcecd9417b80e4c5adccda907f37139\": container with ID starting with e28fe5af6cac8f6fb8442adf892fe5418dcecd9417b80e4c5adccda907f37139 not found: ID does not exist" containerID="e28fe5af6cac8f6fb8442adf892fe5418dcecd9417b80e4c5adccda907f37139" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.799177 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e28fe5af6cac8f6fb8442adf892fe5418dcecd9417b80e4c5adccda907f37139"} err="failed to get container status \"e28fe5af6cac8f6fb8442adf892fe5418dcecd9417b80e4c5adccda907f37139\": rpc error: code = NotFound desc = could not find container \"e28fe5af6cac8f6fb8442adf892fe5418dcecd9417b80e4c5adccda907f37139\": container with ID starting with e28fe5af6cac8f6fb8442adf892fe5418dcecd9417b80e4c5adccda907f37139 not found: ID does not exist" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.799211 4702 scope.go:117] "RemoveContainer" containerID="85d645e827c2a9c4f231312eae6a247f75bcb3b1da1d42ec32923877858d1e71" Dec 03 11:18:42 crc kubenswrapper[4702]: E1203 11:18:42.799775 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85d645e827c2a9c4f231312eae6a247f75bcb3b1da1d42ec32923877858d1e71\": container with ID starting with 85d645e827c2a9c4f231312eae6a247f75bcb3b1da1d42ec32923877858d1e71 not found: ID does not exist" 
containerID="85d645e827c2a9c4f231312eae6a247f75bcb3b1da1d42ec32923877858d1e71" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.799835 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85d645e827c2a9c4f231312eae6a247f75bcb3b1da1d42ec32923877858d1e71"} err="failed to get container status \"85d645e827c2a9c4f231312eae6a247f75bcb3b1da1d42ec32923877858d1e71\": rpc error: code = NotFound desc = could not find container \"85d645e827c2a9c4f231312eae6a247f75bcb3b1da1d42ec32923877858d1e71\": container with ID starting with 85d645e827c2a9c4f231312eae6a247f75bcb3b1da1d42ec32923877858d1e71 not found: ID does not exist" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.799873 4702 scope.go:117] "RemoveContainer" containerID="3c1e54478165fa0a86f9f44f46e5bdb514614f7237341fcbe2315bc8b77ec878" Dec 03 11:18:42 crc kubenswrapper[4702]: E1203 11:18:42.800210 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c1e54478165fa0a86f9f44f46e5bdb514614f7237341fcbe2315bc8b77ec878\": container with ID starting with 3c1e54478165fa0a86f9f44f46e5bdb514614f7237341fcbe2315bc8b77ec878 not found: ID does not exist" containerID="3c1e54478165fa0a86f9f44f46e5bdb514614f7237341fcbe2315bc8b77ec878" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.800240 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c1e54478165fa0a86f9f44f46e5bdb514614f7237341fcbe2315bc8b77ec878"} err="failed to get container status \"3c1e54478165fa0a86f9f44f46e5bdb514614f7237341fcbe2315bc8b77ec878\": rpc error: code = NotFound desc = could not find container \"3c1e54478165fa0a86f9f44f46e5bdb514614f7237341fcbe2315bc8b77ec878\": container with ID starting with 3c1e54478165fa0a86f9f44f46e5bdb514614f7237341fcbe2315bc8b77ec878 not found: ID does not exist" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.800259 4702 scope.go:117] "RemoveContainer" containerID="98187aee9376936ee766abc77e26a736390e9ffa182c85a1dbb3d1978a84736f" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.824640 4702 scope.go:117] "RemoveContainer" containerID="b614270270d940d6d91b375c88647742ee6dd758ae71edae4f839f27f0f0e5c1" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.872214 4702 scope.go:117] "RemoveContainer" containerID="75246fb18cf38804902d15e6a6f4e3d601b80ffd06f40d2aae39d5213feb0be0" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.939666 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8caaee6f-9680-4e4d-97f0-81626914304a" path="/var/lib/kubelet/pods/8caaee6f-9680-4e4d-97f0-81626914304a/volumes" Dec 03 11:18:42 crc kubenswrapper[4702]: I1203 11:18:42.940383 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3208263-39dd-4164-a165-85ef3dc127e8" path="/var/lib/kubelet/pods/c3208263-39dd-4164-a165-85ef3dc127e8/volumes" Dec 03 11:18:45 crc kubenswrapper[4702]: I1203 11:18:45.723083 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"e16af177-16b0-40f2-a044-28684d0a15e8","Type":"ContainerStarted","Data":"da4834ede5f3bbddee1c1b414cc4b597dc2c62fa33693014382613ecd43f2923"} Dec 03 11:18:45 crc kubenswrapper[4702]: I1203 11:18:45.745397 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=5.334897181 podStartE2EDuration="8.74537047s" podCreationTimestamp="2025-12-03 11:18:37 +0000 UTC" firstStartedPulling="2025-12-03 
11:18:41.742140784 +0000 UTC m=+905.578069258" lastFinishedPulling="2025-12-03 11:18:45.152614083 +0000 UTC m=+908.988542547" observedRunningTime="2025-12-03 11:18:45.740820921 +0000 UTC m=+909.576749395" watchObservedRunningTime="2025-12-03 11:18:45.74537047 +0000 UTC m=+909.581298944" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.003960 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2"] Dec 03 11:18:50 crc kubenswrapper[4702]: E1203 11:18:50.006545 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3208263-39dd-4164-a165-85ef3dc127e8" containerName="extract-content" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.006639 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3208263-39dd-4164-a165-85ef3dc127e8" containerName="extract-content" Dec 03 11:18:50 crc kubenswrapper[4702]: E1203 11:18:50.006724 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3208263-39dd-4164-a165-85ef3dc127e8" containerName="registry-server" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.006822 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3208263-39dd-4164-a165-85ef3dc127e8" containerName="registry-server" Dec 03 11:18:50 crc kubenswrapper[4702]: E1203 11:18:50.006903 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8caaee6f-9680-4e4d-97f0-81626914304a" containerName="extract-utilities" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.006991 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="8caaee6f-9680-4e4d-97f0-81626914304a" containerName="extract-utilities" Dec 03 11:18:50 crc kubenswrapper[4702]: E1203 11:18:50.007068 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8caaee6f-9680-4e4d-97f0-81626914304a" containerName="registry-server" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.007151 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="8caaee6f-9680-4e4d-97f0-81626914304a" containerName="registry-server" Dec 03 11:18:50 crc kubenswrapper[4702]: E1203 11:18:50.007239 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3208263-39dd-4164-a165-85ef3dc127e8" containerName="extract-utilities" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.007312 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3208263-39dd-4164-a165-85ef3dc127e8" containerName="extract-utilities" Dec 03 11:18:50 crc kubenswrapper[4702]: E1203 11:18:50.007397 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8caaee6f-9680-4e4d-97f0-81626914304a" containerName="extract-content" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.007477 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="8caaee6f-9680-4e4d-97f0-81626914304a" containerName="extract-content" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.007729 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3208263-39dd-4164-a165-85ef3dc127e8" containerName="registry-server" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.007877 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="8caaee6f-9680-4e4d-97f0-81626914304a" containerName="registry-server" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.008580 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.015436 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.015713 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.018185 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.018344 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-9z4h7" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.020779 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.040416 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27vqn\" (UniqueName: \"kubernetes.io/projected/508c1eef-dbbc-4c32-8d2e-dbb797c72461-kube-api-access-27vqn\") pod \"logging-loki-distributor-76cc67bf56-xrnp2\" (UID: \"508c1eef-dbbc-4c32-8d2e-dbb797c72461\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.040869 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/508c1eef-dbbc-4c32-8d2e-dbb797c72461-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-xrnp2\" (UID: \"508c1eef-dbbc-4c32-8d2e-dbb797c72461\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.040987 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/508c1eef-dbbc-4c32-8d2e-dbb797c72461-config\") pod \"logging-loki-distributor-76cc67bf56-xrnp2\" (UID: \"508c1eef-dbbc-4c32-8d2e-dbb797c72461\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.041120 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/508c1eef-dbbc-4c32-8d2e-dbb797c72461-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-xrnp2\" (UID: \"508c1eef-dbbc-4c32-8d2e-dbb797c72461\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.041216 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/508c1eef-dbbc-4c32-8d2e-dbb797c72461-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-xrnp2\" (UID: \"508c1eef-dbbc-4c32-8d2e-dbb797c72461\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.042020 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2"] Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.142939 4702 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-27vqn\" (UniqueName: \"kubernetes.io/projected/508c1eef-dbbc-4c32-8d2e-dbb797c72461-kube-api-access-27vqn\") pod \"logging-loki-distributor-76cc67bf56-xrnp2\" (UID: \"508c1eef-dbbc-4c32-8d2e-dbb797c72461\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.143038 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/508c1eef-dbbc-4c32-8d2e-dbb797c72461-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-xrnp2\" (UID: \"508c1eef-dbbc-4c32-8d2e-dbb797c72461\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.143076 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/508c1eef-dbbc-4c32-8d2e-dbb797c72461-config\") pod \"logging-loki-distributor-76cc67bf56-xrnp2\" (UID: \"508c1eef-dbbc-4c32-8d2e-dbb797c72461\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.143133 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/508c1eef-dbbc-4c32-8d2e-dbb797c72461-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-xrnp2\" (UID: \"508c1eef-dbbc-4c32-8d2e-dbb797c72461\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.143160 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/508c1eef-dbbc-4c32-8d2e-dbb797c72461-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-xrnp2\" (UID: \"508c1eef-dbbc-4c32-8d2e-dbb797c72461\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.144587 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/508c1eef-dbbc-4c32-8d2e-dbb797c72461-logging-loki-ca-bundle\") pod \"logging-loki-distributor-76cc67bf56-xrnp2\" (UID: \"508c1eef-dbbc-4c32-8d2e-dbb797c72461\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.145465 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/508c1eef-dbbc-4c32-8d2e-dbb797c72461-config\") pod \"logging-loki-distributor-76cc67bf56-xrnp2\" (UID: \"508c1eef-dbbc-4c32-8d2e-dbb797c72461\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.150446 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/508c1eef-dbbc-4c32-8d2e-dbb797c72461-logging-loki-distributor-http\") pod \"logging-loki-distributor-76cc67bf56-xrnp2\" (UID: \"508c1eef-dbbc-4c32-8d2e-dbb797c72461\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.165011 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: 
\"kubernetes.io/secret/508c1eef-dbbc-4c32-8d2e-dbb797c72461-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-76cc67bf56-xrnp2\" (UID: \"508c1eef-dbbc-4c32-8d2e-dbb797c72461\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.169589 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27vqn\" (UniqueName: \"kubernetes.io/projected/508c1eef-dbbc-4c32-8d2e-dbb797c72461-kube-api-access-27vqn\") pod \"logging-loki-distributor-76cc67bf56-xrnp2\" (UID: \"508c1eef-dbbc-4c32-8d2e-dbb797c72461\") " pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.233733 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-bhqrp"] Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.234866 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.240615 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.241085 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.241847 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.263780 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-bhqrp"] Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.334487 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.345656 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nggr\" (UniqueName: \"kubernetes.io/projected/042cc406-7960-493a-a19a-cb5590f8ff1f-kube-api-access-9nggr\") pod \"logging-loki-querier-5895d59bb8-bhqrp\" (UID: \"042cc406-7960-493a-a19a-cb5590f8ff1f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.345732 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/042cc406-7960-493a-a19a-cb5590f8ff1f-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-bhqrp\" (UID: \"042cc406-7960-493a-a19a-cb5590f8ff1f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.345773 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/042cc406-7960-493a-a19a-cb5590f8ff1f-config\") pod \"logging-loki-querier-5895d59bb8-bhqrp\" (UID: \"042cc406-7960-493a-a19a-cb5590f8ff1f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.345824 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/042cc406-7960-493a-a19a-cb5590f8ff1f-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-bhqrp\" (UID: \"042cc406-7960-493a-a19a-cb5590f8ff1f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.345851 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/042cc406-7960-493a-a19a-cb5590f8ff1f-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-bhqrp\" (UID: \"042cc406-7960-493a-a19a-cb5590f8ff1f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.346217 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/042cc406-7960-493a-a19a-cb5590f8ff1f-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-bhqrp\" (UID: \"042cc406-7960-493a-a19a-cb5590f8ff1f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.428913 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw"] Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.438749 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.442286 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.442573 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.447470 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/042cc406-7960-493a-a19a-cb5590f8ff1f-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-bhqrp\" (UID: \"042cc406-7960-493a-a19a-cb5590f8ff1f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.447527 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nggr\" (UniqueName: \"kubernetes.io/projected/042cc406-7960-493a-a19a-cb5590f8ff1f-kube-api-access-9nggr\") pod \"logging-loki-querier-5895d59bb8-bhqrp\" (UID: \"042cc406-7960-493a-a19a-cb5590f8ff1f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.447564 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/042cc406-7960-493a-a19a-cb5590f8ff1f-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-bhqrp\" (UID: \"042cc406-7960-493a-a19a-cb5590f8ff1f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.447591 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/042cc406-7960-493a-a19a-cb5590f8ff1f-config\") pod \"logging-loki-querier-5895d59bb8-bhqrp\" (UID: \"042cc406-7960-493a-a19a-cb5590f8ff1f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.447629 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/042cc406-7960-493a-a19a-cb5590f8ff1f-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-bhqrp\" (UID: \"042cc406-7960-493a-a19a-cb5590f8ff1f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.447644 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/042cc406-7960-493a-a19a-cb5590f8ff1f-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-bhqrp\" (UID: \"042cc406-7960-493a-a19a-cb5590f8ff1f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.449512 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/042cc406-7960-493a-a19a-cb5590f8ff1f-config\") pod \"logging-loki-querier-5895d59bb8-bhqrp\" (UID: \"042cc406-7960-493a-a19a-cb5590f8ff1f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.450150 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/042cc406-7960-493a-a19a-cb5590f8ff1f-logging-loki-ca-bundle\") pod \"logging-loki-querier-5895d59bb8-bhqrp\" (UID: \"042cc406-7960-493a-a19a-cb5590f8ff1f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.459580 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/042cc406-7960-493a-a19a-cb5590f8ff1f-logging-loki-querier-grpc\") pod \"logging-loki-querier-5895d59bb8-bhqrp\" (UID: \"042cc406-7960-493a-a19a-cb5590f8ff1f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.464078 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/042cc406-7960-493a-a19a-cb5590f8ff1f-logging-loki-s3\") pod \"logging-loki-querier-5895d59bb8-bhqrp\" (UID: \"042cc406-7960-493a-a19a-cb5590f8ff1f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.464803 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/042cc406-7960-493a-a19a-cb5590f8ff1f-logging-loki-querier-http\") pod \"logging-loki-querier-5895d59bb8-bhqrp\" (UID: \"042cc406-7960-493a-a19a-cb5590f8ff1f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.473149 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw"] Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.478253 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nggr\" (UniqueName: \"kubernetes.io/projected/042cc406-7960-493a-a19a-cb5590f8ff1f-kube-api-access-9nggr\") pod \"logging-loki-querier-5895d59bb8-bhqrp\" (UID: \"042cc406-7960-493a-a19a-cb5590f8ff1f\") " pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.549629 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbea18cb-2e45-4d86-bc00-17a82f0a78ff-config\") pod \"logging-loki-query-frontend-84558f7c9f-dflgw\" (UID: \"dbea18cb-2e45-4d86-bc00-17a82f0a78ff\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.549712 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/dbea18cb-2e45-4d86-bc00-17a82f0a78ff-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-dflgw\" (UID: \"dbea18cb-2e45-4d86-bc00-17a82f0a78ff\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.549739 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/dbea18cb-2e45-4d86-bc00-17a82f0a78ff-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-dflgw\" (UID: \"dbea18cb-2e45-4d86-bc00-17a82f0a78ff\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" Dec 03 
11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.549796 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbea18cb-2e45-4d86-bc00-17a82f0a78ff-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-dflgw\" (UID: \"dbea18cb-2e45-4d86-bc00-17a82f0a78ff\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.549842 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwxs5\" (UniqueName: \"kubernetes.io/projected/dbea18cb-2e45-4d86-bc00-17a82f0a78ff-kube-api-access-nwxs5\") pod \"logging-loki-query-frontend-84558f7c9f-dflgw\" (UID: \"dbea18cb-2e45-4d86-bc00-17a82f0a78ff\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.573483 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.636376 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-76dff8487c-68ktn"] Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.637642 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.644985 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.645219 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.645394 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.645540 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.645674 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.648656 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-76dff8487c-mdlcz"] Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.650215 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.651938 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/dbea18cb-2e45-4d86-bc00-17a82f0a78ff-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-dflgw\" (UID: \"dbea18cb-2e45-4d86-bc00-17a82f0a78ff\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.651988 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/dbea18cb-2e45-4d86-bc00-17a82f0a78ff-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-dflgw\" (UID: \"dbea18cb-2e45-4d86-bc00-17a82f0a78ff\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.652062 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbea18cb-2e45-4d86-bc00-17a82f0a78ff-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-dflgw\" (UID: \"dbea18cb-2e45-4d86-bc00-17a82f0a78ff\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.652122 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwxs5\" (UniqueName: \"kubernetes.io/projected/dbea18cb-2e45-4d86-bc00-17a82f0a78ff-kube-api-access-nwxs5\") pod \"logging-loki-query-frontend-84558f7c9f-dflgw\" (UID: \"dbea18cb-2e45-4d86-bc00-17a82f0a78ff\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.652184 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbea18cb-2e45-4d86-bc00-17a82f0a78ff-config\") pod \"logging-loki-query-frontend-84558f7c9f-dflgw\" (UID: \"dbea18cb-2e45-4d86-bc00-17a82f0a78ff\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.655361 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbea18cb-2e45-4d86-bc00-17a82f0a78ff-config\") pod \"logging-loki-query-frontend-84558f7c9f-dflgw\" (UID: \"dbea18cb-2e45-4d86-bc00-17a82f0a78ff\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.656249 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbea18cb-2e45-4d86-bc00-17a82f0a78ff-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-84558f7c9f-dflgw\" (UID: \"dbea18cb-2e45-4d86-bc00-17a82f0a78ff\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.656781 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-rmlzs" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.658836 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-76dff8487c-68ktn"] Dec 03 11:18:50 crc 
kubenswrapper[4702]: I1203 11:18:50.660837 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/dbea18cb-2e45-4d86-bc00-17a82f0a78ff-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-84558f7c9f-dflgw\" (UID: \"dbea18cb-2e45-4d86-bc00-17a82f0a78ff\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.666294 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-76dff8487c-mdlcz"] Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.681170 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/dbea18cb-2e45-4d86-bc00-17a82f0a78ff-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-84558f7c9f-dflgw\" (UID: \"dbea18cb-2e45-4d86-bc00-17a82f0a78ff\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.703364 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwxs5\" (UniqueName: \"kubernetes.io/projected/dbea18cb-2e45-4d86-bc00-17a82f0a78ff-kube-api-access-nwxs5\") pod \"logging-loki-query-frontend-84558f7c9f-dflgw\" (UID: \"dbea18cb-2e45-4d86-bc00-17a82f0a78ff\") " pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.754489 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/6cbbba51-9166-42cb-917c-7c634351e5c9-tls-secret\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.754545 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-logging-loki-ca-bundle\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.754608 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.754666 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg7bz\" (UniqueName: \"kubernetes.io/projected/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-kube-api-access-wg7bz\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.754711 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-rbac\") pod 
\"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.756964 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/6cbbba51-9166-42cb-917c-7c634351e5c9-tenants\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.757099 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-lokistack-gateway\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.757131 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-tls-secret\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.757152 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvlwj\" (UniqueName: \"kubernetes.io/projected/6cbbba51-9166-42cb-917c-7c634351e5c9-kube-api-access-wvlwj\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.757199 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/6cbbba51-9166-42cb-917c-7c634351e5c9-lokistack-gateway\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.757324 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-tenants\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.757371 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cbbba51-9166-42cb-917c-7c634351e5c9-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.757404 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/6cbbba51-9166-42cb-917c-7c634351e5c9-logging-loki-gateway-client-http\") pod 
\"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.757461 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cbbba51-9166-42cb-917c-7c634351e5c9-logging-loki-ca-bundle\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.757527 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/6cbbba51-9166-42cb-917c-7c634351e5c9-rbac\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.757590 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.787829 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.859559 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cbbba51-9166-42cb-917c-7c634351e5c9-logging-loki-ca-bundle\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.859616 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/6cbbba51-9166-42cb-917c-7c634351e5c9-rbac\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.859649 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.859691 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/6cbbba51-9166-42cb-917c-7c634351e5c9-tls-secret\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.859714 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-logging-loki-ca-bundle\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.859741 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.859791 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg7bz\" (UniqueName: \"kubernetes.io/projected/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-kube-api-access-wg7bz\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.859820 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-rbac\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.859842 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/6cbbba51-9166-42cb-917c-7c634351e5c9-tenants\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.859860 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-lokistack-gateway\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.859881 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-tls-secret\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.859903 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvlwj\" (UniqueName: \"kubernetes.io/projected/6cbbba51-9166-42cb-917c-7c634351e5c9-kube-api-access-wvlwj\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.859924 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/6cbbba51-9166-42cb-917c-7c634351e5c9-lokistack-gateway\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " 
pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.859950 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-tenants\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.860910 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cbbba51-9166-42cb-917c-7c634351e5c9-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.860944 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/6cbbba51-9166-42cb-917c-7c634351e5c9-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.861257 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/6cbbba51-9166-42cb-917c-7c634351e5c9-rbac\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.861423 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cbbba51-9166-42cb-917c-7c634351e5c9-logging-loki-ca-bundle\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.863208 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.863201 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-logging-loki-ca-bundle\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.863314 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/6cbbba51-9166-42cb-917c-7c634351e5c9-lokistack-gateway\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.863473 4702 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-rbac\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.863852 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cbbba51-9166-42cb-917c-7c634351e5c9-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.864006 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-lokistack-gateway\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.867893 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-tls-secret\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.869423 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/6cbbba51-9166-42cb-917c-7c634351e5c9-tls-secret\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.869570 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/6cbbba51-9166-42cb-917c-7c634351e5c9-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.869814 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/6cbbba51-9166-42cb-917c-7c634351e5c9-tenants\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.870117 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-tenants\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.870360 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " 
pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.880518 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg7bz\" (UniqueName: \"kubernetes.io/projected/72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653-kube-api-access-wg7bz\") pod \"logging-loki-gateway-76dff8487c-mdlcz\" (UID: \"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.881094 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvlwj\" (UniqueName: \"kubernetes.io/projected/6cbbba51-9166-42cb-917c-7c634351e5c9-kube-api-access-wvlwj\") pod \"logging-loki-gateway-76dff8487c-68ktn\" (UID: \"6cbbba51-9166-42cb-917c-7c634351e5c9\") " pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:50 crc kubenswrapper[4702]: I1203 11:18:50.996628 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.015690 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.082298 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2"] Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.195990 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.206635 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.213360 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.213419 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.271603 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6be39e5b-1984-4d45-b37c-8c00135e0981\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6be39e5b-1984-4d45-b37c-8c00135e0981\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.373829 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6be39e5b-1984-4d45-b37c-8c00135e0981\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6be39e5b-1984-4d45-b37c-8c00135e0981\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.373938 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ce47478-68cc-46a9-99c3-cb20947e63c5-config\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.373979 4702 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b777e56f-9296-4121-94d4-76d301c34199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b777e56f-9296-4121-94d4-76d301c34199\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.374015 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ce47478-68cc-46a9-99c3-cb20947e63c5-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.374037 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn9tk\" (UniqueName: \"kubernetes.io/projected/6ce47478-68cc-46a9-99c3-cb20947e63c5-kube-api-access-kn9tk\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.374100 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/6ce47478-68cc-46a9-99c3-cb20947e63c5-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.374281 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/6ce47478-68cc-46a9-99c3-cb20947e63c5-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.374390 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/6ce47478-68cc-46a9-99c3-cb20947e63c5-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.377230 4702 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.377267 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6be39e5b-1984-4d45-b37c-8c00135e0981\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6be39e5b-1984-4d45-b37c-8c00135e0981\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4d43f8e24616ac089c628b251aace4e961cf308a3867f5bb7d6ade3a828063bc/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.406629 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.407717 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.411191 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.411481 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.578486 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ce47478-68cc-46a9-99c3-cb20947e63c5-config\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.578610 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b777e56f-9296-4121-94d4-76d301c34199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b777e56f-9296-4121-94d4-76d301c34199\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.580308 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ce47478-68cc-46a9-99c3-cb20947e63c5-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.580379 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn9tk\" (UniqueName: \"kubernetes.io/projected/6ce47478-68cc-46a9-99c3-cb20947e63c5-kube-api-access-kn9tk\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.580510 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/6ce47478-68cc-46a9-99c3-cb20947e63c5-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.580559 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/6ce47478-68cc-46a9-99c3-cb20947e63c5-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.580621 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/6ce47478-68cc-46a9-99c3-cb20947e63c5-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.603338 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ce47478-68cc-46a9-99c3-cb20947e63c5-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: 
\"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.689593 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6be39e5b-1984-4d45-b37c-8c00135e0981\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6be39e5b-1984-4d45-b37c-8c00135e0981\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.694396 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/6ce47478-68cc-46a9-99c3-cb20947e63c5-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.694478 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/6ce47478-68cc-46a9-99c3-cb20947e63c5-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.695890 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/6ce47478-68cc-46a9-99c3-cb20947e63c5-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.701263 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.701735 4702 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.701814 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b777e56f-9296-4121-94d4-76d301c34199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b777e56f-9296-4121-94d4-76d301c34199\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c570fd058f06d16df0082892bda38a238e20d86090d00462b8c8eb052cb24cec/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.702688 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.710036 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.710294 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.715026 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.718935 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ce47478-68cc-46a9-99c3-cb20947e63c5-config\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.719017 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn9tk\" (UniqueName: \"kubernetes.io/projected/6ce47478-68cc-46a9-99c3-cb20947e63c5-kube-api-access-kn9tk\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.739226 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.761800 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.766985 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" event={"ID":"508c1eef-dbbc-4c32-8d2e-dbb797c72461","Type":"ContainerStarted","Data":"3df4777da9b2caaed62cdb9761ee76f1ea0bd4e72f04d7edc7c8df55bd0e4301"} Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.768712 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-5895d59bb8-bhqrp"] Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.779398 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b777e56f-9296-4121-94d4-76d301c34199\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b777e56f-9296-4121-94d4-76d301c34199\") pod \"logging-loki-ingester-0\" (UID: \"6ce47478-68cc-46a9-99c3-cb20947e63c5\") " pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.782813 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw"] Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.793627 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/358cd791-ccf7-4655-b446-b800598e773c-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"358cd791-ccf7-4655-b446-b800598e773c\") " pod="openshift-logging/logging-loki-compactor-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.793684 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/358cd791-ccf7-4655-b446-b800598e773c-config\") pod \"logging-loki-compactor-0\" (UID: 
\"358cd791-ccf7-4655-b446-b800598e773c\") " pod="openshift-logging/logging-loki-compactor-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.793744 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/358cd791-ccf7-4655-b446-b800598e773c-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"358cd791-ccf7-4655-b446-b800598e773c\") " pod="openshift-logging/logging-loki-compactor-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.793806 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/358cd791-ccf7-4655-b446-b800598e773c-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"358cd791-ccf7-4655-b446-b800598e773c\") " pod="openshift-logging/logging-loki-compactor-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.793826 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/358cd791-ccf7-4655-b446-b800598e773c-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"358cd791-ccf7-4655-b446-b800598e773c\") " pod="openshift-logging/logging-loki-compactor-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.793961 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9d075b2c-560b-4726-840e-6d4d33ec8e61\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d075b2c-560b-4726-840e-6d4d33ec8e61\") pod \"logging-loki-compactor-0\" (UID: \"358cd791-ccf7-4655-b446-b800598e773c\") " pod="openshift-logging/logging-loki-compactor-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.794174 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz5b9\" (UniqueName: \"kubernetes.io/projected/358cd791-ccf7-4655-b446-b800598e773c-kube-api-access-pz5b9\") pod \"logging-loki-compactor-0\" (UID: \"358cd791-ccf7-4655-b446-b800598e773c\") " pod="openshift-logging/logging-loki-compactor-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.843082 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.895790 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/07e2709a-6aac-4b21-8fc8-bfc21992aae3-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"07e2709a-6aac-4b21-8fc8-bfc21992aae3\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.895860 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/358cd791-ccf7-4655-b446-b800598e773c-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"358cd791-ccf7-4655-b446-b800598e773c\") " pod="openshift-logging/logging-loki-compactor-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.895891 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/358cd791-ccf7-4655-b446-b800598e773c-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"358cd791-ccf7-4655-b446-b800598e773c\") " pod="openshift-logging/logging-loki-compactor-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.895940 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/07e2709a-6aac-4b21-8fc8-bfc21992aae3-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"07e2709a-6aac-4b21-8fc8-bfc21992aae3\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.895964 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07e2709a-6aac-4b21-8fc8-bfc21992aae3-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"07e2709a-6aac-4b21-8fc8-bfc21992aae3\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.895988 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07e2709a-6aac-4b21-8fc8-bfc21992aae3-config\") pod \"logging-loki-index-gateway-0\" (UID: \"07e2709a-6aac-4b21-8fc8-bfc21992aae3\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.896022 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8c4954da-5463-4989-8b3e-a9d6788b73e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c4954da-5463-4989-8b3e-a9d6788b73e2\") pod \"logging-loki-index-gateway-0\" (UID: \"07e2709a-6aac-4b21-8fc8-bfc21992aae3\") " pod="openshift-logging/logging-loki-index-gateway-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.896045 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz5b9\" (UniqueName: \"kubernetes.io/projected/358cd791-ccf7-4655-b446-b800598e773c-kube-api-access-pz5b9\") pod \"logging-loki-compactor-0\" (UID: \"358cd791-ccf7-4655-b446-b800598e773c\") " pod="openshift-logging/logging-loki-compactor-0" Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.896083 4702 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/07e2709a-6aac-4b21-8fc8-bfc21992aae3-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"07e2709a-6aac-4b21-8fc8-bfc21992aae3\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.896128 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/358cd791-ccf7-4655-b446-b800598e773c-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"358cd791-ccf7-4655-b446-b800598e773c\") " pod="openshift-logging/logging-loki-compactor-0"
Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.896158 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9d075b2c-560b-4726-840e-6d4d33ec8e61\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d075b2c-560b-4726-840e-6d4d33ec8e61\") pod \"logging-loki-compactor-0\" (UID: \"358cd791-ccf7-4655-b446-b800598e773c\") " pod="openshift-logging/logging-loki-compactor-0"
Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.896204 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54ln4\" (UniqueName: \"kubernetes.io/projected/07e2709a-6aac-4b21-8fc8-bfc21992aae3-kube-api-access-54ln4\") pod \"logging-loki-index-gateway-0\" (UID: \"07e2709a-6aac-4b21-8fc8-bfc21992aae3\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.896233 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/358cd791-ccf7-4655-b446-b800598e773c-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"358cd791-ccf7-4655-b446-b800598e773c\") " pod="openshift-logging/logging-loki-compactor-0"
Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.896252 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/358cd791-ccf7-4655-b446-b800598e773c-config\") pod \"logging-loki-compactor-0\" (UID: \"358cd791-ccf7-4655-b446-b800598e773c\") " pod="openshift-logging/logging-loki-compactor-0"
Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.897119 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/358cd791-ccf7-4655-b446-b800598e773c-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"358cd791-ccf7-4655-b446-b800598e773c\") " pod="openshift-logging/logging-loki-compactor-0"
Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.897807 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/358cd791-ccf7-4655-b446-b800598e773c-config\") pod \"logging-loki-compactor-0\" (UID: \"358cd791-ccf7-4655-b446-b800598e773c\") " pod="openshift-logging/logging-loki-compactor-0"
Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.901480 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/358cd791-ccf7-4655-b446-b800598e773c-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"358cd791-ccf7-4655-b446-b800598e773c\") " pod="openshift-logging/logging-loki-compactor-0"
Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.901618 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/358cd791-ccf7-4655-b446-b800598e773c-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"358cd791-ccf7-4655-b446-b800598e773c\") " pod="openshift-logging/logging-loki-compactor-0"
Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.902022 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/358cd791-ccf7-4655-b446-b800598e773c-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"358cd791-ccf7-4655-b446-b800598e773c\") " pod="openshift-logging/logging-loki-compactor-0"
Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.904473 4702 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.904546 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9d075b2c-560b-4726-840e-6d4d33ec8e61\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d075b2c-560b-4726-840e-6d4d33ec8e61\") pod \"logging-loki-compactor-0\" (UID: \"358cd791-ccf7-4655-b446-b800598e773c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/81d1b54374536cee1d3af1ac7c2369eb08d18b2a531077fb8ecf33b3deb5ea9c/globalmount\"" pod="openshift-logging/logging-loki-compactor-0"
Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.919259 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz5b9\" (UniqueName: \"kubernetes.io/projected/358cd791-ccf7-4655-b446-b800598e773c-kube-api-access-pz5b9\") pod \"logging-loki-compactor-0\" (UID: \"358cd791-ccf7-4655-b446-b800598e773c\") " pod="openshift-logging/logging-loki-compactor-0"
Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.935346 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9d075b2c-560b-4726-840e-6d4d33ec8e61\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d075b2c-560b-4726-840e-6d4d33ec8e61\") pod \"logging-loki-compactor-0\" (UID: \"358cd791-ccf7-4655-b446-b800598e773c\") " pod="openshift-logging/logging-loki-compactor-0"
Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.997610 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/07e2709a-6aac-4b21-8fc8-bfc21992aae3-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"07e2709a-6aac-4b21-8fc8-bfc21992aae3\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.997663 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07e2709a-6aac-4b21-8fc8-bfc21992aae3-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"07e2709a-6aac-4b21-8fc8-bfc21992aae3\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.997687 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07e2709a-6aac-4b21-8fc8-bfc21992aae3-config\") pod \"logging-loki-index-gateway-0\" (UID: \"07e2709a-6aac-4b21-8fc8-bfc21992aae3\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.997717 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8c4954da-5463-4989-8b3e-a9d6788b73e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c4954da-5463-4989-8b3e-a9d6788b73e2\") pod \"logging-loki-index-gateway-0\" (UID: \"07e2709a-6aac-4b21-8fc8-bfc21992aae3\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.997770 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/07e2709a-6aac-4b21-8fc8-bfc21992aae3-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"07e2709a-6aac-4b21-8fc8-bfc21992aae3\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.997932 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54ln4\" (UniqueName: \"kubernetes.io/projected/07e2709a-6aac-4b21-8fc8-bfc21992aae3-kube-api-access-54ln4\") pod \"logging-loki-index-gateway-0\" (UID: \"07e2709a-6aac-4b21-8fc8-bfc21992aae3\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 03 11:18:51 crc kubenswrapper[4702]: I1203 11:18:51.997966 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/07e2709a-6aac-4b21-8fc8-bfc21992aae3-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"07e2709a-6aac-4b21-8fc8-bfc21992aae3\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 03 11:18:52 crc kubenswrapper[4702]: I1203 11:18:52.000170 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07e2709a-6aac-4b21-8fc8-bfc21992aae3-config\") pod \"logging-loki-index-gateway-0\" (UID: \"07e2709a-6aac-4b21-8fc8-bfc21992aae3\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 03 11:18:52 crc kubenswrapper[4702]: I1203 11:18:52.000192 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07e2709a-6aac-4b21-8fc8-bfc21992aae3-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"07e2709a-6aac-4b21-8fc8-bfc21992aae3\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 03 11:18:52 crc kubenswrapper[4702]: I1203 11:18:52.003895 4702 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 03 11:18:52 crc kubenswrapper[4702]: I1203 11:18:52.003943 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8c4954da-5463-4989-8b3e-a9d6788b73e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c4954da-5463-4989-8b3e-a9d6788b73e2\") pod \"logging-loki-index-gateway-0\" (UID: \"07e2709a-6aac-4b21-8fc8-bfc21992aae3\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/24adcdce53a16fdfa8ce903f345f89cb044f518368ac6d99cd864a76196a50f5/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0"
Dec 03 11:18:52 crc kubenswrapper[4702]: I1203 11:18:52.008249 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/07e2709a-6aac-4b21-8fc8-bfc21992aae3-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"07e2709a-6aac-4b21-8fc8-bfc21992aae3\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 03 11:18:52 crc kubenswrapper[4702]: I1203 11:18:52.008814 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/07e2709a-6aac-4b21-8fc8-bfc21992aae3-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"07e2709a-6aac-4b21-8fc8-bfc21992aae3\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 03 11:18:52 crc kubenswrapper[4702]: I1203 11:18:52.018594 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/07e2709a-6aac-4b21-8fc8-bfc21992aae3-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"07e2709a-6aac-4b21-8fc8-bfc21992aae3\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 03 11:18:52 crc kubenswrapper[4702]: I1203 11:18:52.021725 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54ln4\" (UniqueName: \"kubernetes.io/projected/07e2709a-6aac-4b21-8fc8-bfc21992aae3-kube-api-access-54ln4\") pod \"logging-loki-index-gateway-0\" (UID: \"07e2709a-6aac-4b21-8fc8-bfc21992aae3\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 03 11:18:52 crc kubenswrapper[4702]: I1203 11:18:52.040909 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8c4954da-5463-4989-8b3e-a9d6788b73e2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c4954da-5463-4989-8b3e-a9d6788b73e2\") pod \"logging-loki-index-gateway-0\" (UID: \"07e2709a-6aac-4b21-8fc8-bfc21992aae3\") " pod="openshift-logging/logging-loki-index-gateway-0"
Dec 03 11:18:52 crc kubenswrapper[4702]: I1203 11:18:52.069001 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0"
Dec 03 11:18:52 crc kubenswrapper[4702]: I1203 11:18:52.084368 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0"
Dec 03 11:18:52 crc kubenswrapper[4702]: I1203 11:18:52.094451 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-76dff8487c-mdlcz"]
Dec 03 11:18:52 crc kubenswrapper[4702]: I1203 11:18:52.156437 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-76dff8487c-68ktn"]
Dec 03 11:18:52 crc kubenswrapper[4702]: W1203 11:18:52.171938 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cbbba51_9166_42cb_917c_7c634351e5c9.slice/crio-826852f17fc7a3654273609abd13623ff5b6a5406065378f0d8b3e601cbc6904 WatchSource:0}: Error finding container 826852f17fc7a3654273609abd13623ff5b6a5406065378f0d8b3e601cbc6904: Status 404 returned error can't find the container with id 826852f17fc7a3654273609abd13623ff5b6a5406065378f0d8b3e601cbc6904
Dec 03 11:18:52 crc kubenswrapper[4702]: I1203 11:18:52.455901 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"]
Dec 03 11:18:52 crc kubenswrapper[4702]: I1203 11:18:52.533933 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"]
Dec 03 11:18:52 crc kubenswrapper[4702]: W1203 11:18:52.536029 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod358cd791_ccf7_4655_b446_b800598e773c.slice/crio-958f4e2db34053c50368ae73684ea7a19e554452f52e07e013c351ac0e7207b5 WatchSource:0}: Error finding container 958f4e2db34053c50368ae73684ea7a19e554452f52e07e013c351ac0e7207b5: Status 404 returned error can't find the container with id 958f4e2db34053c50368ae73684ea7a19e554452f52e07e013c351ac0e7207b5
Dec 03 11:18:52 crc kubenswrapper[4702]: I1203 11:18:52.781463 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"358cd791-ccf7-4655-b446-b800598e773c","Type":"ContainerStarted","Data":"958f4e2db34053c50368ae73684ea7a19e554452f52e07e013c351ac0e7207b5"}
Dec 03 11:18:52 crc kubenswrapper[4702]: I1203 11:18:52.782443 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" event={"ID":"6cbbba51-9166-42cb-917c-7c634351e5c9","Type":"ContainerStarted","Data":"826852f17fc7a3654273609abd13623ff5b6a5406065378f0d8b3e601cbc6904"}
Dec 03 11:18:52 crc kubenswrapper[4702]: I1203 11:18:52.783933 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"6ce47478-68cc-46a9-99c3-cb20947e63c5","Type":"ContainerStarted","Data":"49ebce58dd7a959b01de28f8d17890b903e2819b2b31782a6fb9bd1adf2f6d10"}
Dec 03 11:18:52 crc kubenswrapper[4702]: I1203 11:18:52.785549 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" event={"ID":"042cc406-7960-493a-a19a-cb5590f8ff1f","Type":"ContainerStarted","Data":"05ba315e1b7d5162ef547a1047bfd692bdc0f873237af0139603093e164371f1"}
Dec 03 11:18:52 crc kubenswrapper[4702]: I1203 11:18:52.786786 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" event={"ID":"dbea18cb-2e45-4d86-bc00-17a82f0a78ff","Type":"ContainerStarted","Data":"96d0e35193aaeca2eebb0f77348a8687026f370074abcaf1b9ea8ebf4f45e61c"}
Dec 03 11:18:52 crc kubenswrapper[4702]: I1203 11:18:52.787690 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" event={"ID":"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653","Type":"ContainerStarted","Data":"b9a3179bd366103dd6ca67b5055d55ee3c1337d8b538977361aa935b5cb63ffd"}
Dec 03 11:18:52 crc kubenswrapper[4702]: I1203 11:18:52.874611 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"]
Dec 03 11:18:52 crc kubenswrapper[4702]: W1203 11:18:52.888125 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07e2709a_6aac_4b21_8fc8_bfc21992aae3.slice/crio-28a91a31ccb6654bda12022fed5da84c5cce6ec09cb448deca72df6ce315bcc8 WatchSource:0}: Error finding container 28a91a31ccb6654bda12022fed5da84c5cce6ec09cb448deca72df6ce315bcc8: Status 404 returned error can't find the container with id 28a91a31ccb6654bda12022fed5da84c5cce6ec09cb448deca72df6ce315bcc8
Dec 03 11:18:53 crc kubenswrapper[4702]: I1203 11:18:53.814633 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"07e2709a-6aac-4b21-8fc8-bfc21992aae3","Type":"ContainerStarted","Data":"28a91a31ccb6654bda12022fed5da84c5cce6ec09cb448deca72df6ce315bcc8"}
Dec 03 11:18:57 crc kubenswrapper[4702]: I1203 11:18:57.846121 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" event={"ID":"dbea18cb-2e45-4d86-bc00-17a82f0a78ff","Type":"ContainerStarted","Data":"fd4f4cad9da9d52fbe1d769601a2af82f1d9072aaf6dc7a5dbcc546c9285b8b1"}
Dec 03 11:18:57 crc kubenswrapper[4702]: I1203 11:18:57.847879 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw"
Dec 03 11:18:57 crc kubenswrapper[4702]: I1203 11:18:57.848109 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" event={"ID":"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653","Type":"ContainerStarted","Data":"f691e826f1ed8a51b6090cd9d0c99f510776ffcd4bb9689ac66f789a581de89b"}
Dec 03 11:18:57 crc kubenswrapper[4702]: I1203 11:18:57.850291 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" event={"ID":"508c1eef-dbbc-4c32-8d2e-dbb797c72461","Type":"ContainerStarted","Data":"6c899465f3d11e8704dbbfc54d42d12d5a34b402fec39e85a548863aefff17ec"}
Dec 03 11:18:57 crc kubenswrapper[4702]: I1203 11:18:57.850479 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2"
Dec 03 11:18:57 crc kubenswrapper[4702]: I1203 11:18:57.852196 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"07e2709a-6aac-4b21-8fc8-bfc21992aae3","Type":"ContainerStarted","Data":"ec309cfd03c57333551cd107beb4e19c7ac80b8164e51fabe508b64697608507"}
Dec 03 11:18:57 crc kubenswrapper[4702]: I1203 11:18:57.852529 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0"
Dec 03 11:18:57 crc kubenswrapper[4702]: I1203 11:18:57.854091 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"358cd791-ccf7-4655-b446-b800598e773c","Type":"ContainerStarted","Data":"0c4fcf2840e2a8a85fb0a796a2e485239df3b3ef75786a5f05c853a1772d812f"}
Dec 03 11:18:57 crc kubenswrapper[4702]: I1203 11:18:57.854167 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0"
Dec 03 11:18:57 crc kubenswrapper[4702]: I1203 11:18:57.855522 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" event={"ID":"6cbbba51-9166-42cb-917c-7c634351e5c9","Type":"ContainerStarted","Data":"f7b5ea10c892641a73f0ad8a10c3b4f879ca82f50441d8982f5ddf1d7896b930"}
Dec 03 11:18:57 crc kubenswrapper[4702]: I1203 11:18:57.857345 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"6ce47478-68cc-46a9-99c3-cb20947e63c5","Type":"ContainerStarted","Data":"a72f9aa0860761c9a65277168dfdd3327ce28c8dc30857800332a0d68e927b8c"}
Dec 03 11:18:57 crc kubenswrapper[4702]: I1203 11:18:57.858071 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0"
Dec 03 11:18:57 crc kubenswrapper[4702]: I1203 11:18:57.859571 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" event={"ID":"042cc406-7960-493a-a19a-cb5590f8ff1f","Type":"ContainerStarted","Data":"bce39206f49d00a80783ea1af69c87d9962f16fc3fcf70574b2cf4f363138b74"}
Dec 03 11:18:57 crc kubenswrapper[4702]: I1203 11:18:57.859810 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp"
Dec 03 11:18:57 crc kubenswrapper[4702]: I1203 11:18:57.871909 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" podStartSLOduration=3.012570946 podStartE2EDuration="7.871886785s" podCreationTimestamp="2025-12-03 11:18:50 +0000 UTC" firstStartedPulling="2025-12-03 11:18:51.768484992 +0000 UTC m=+915.604413456" lastFinishedPulling="2025-12-03 11:18:56.627800831 +0000 UTC m=+920.463729295" observedRunningTime="2025-12-03 11:18:57.870412714 +0000 UTC m=+921.706341188" watchObservedRunningTime="2025-12-03 11:18:57.871886785 +0000 UTC m=+921.707815249"
Dec 03 11:18:57 crc kubenswrapper[4702]: I1203 11:18:57.911144 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" podStartSLOduration=3.088604119 podStartE2EDuration="7.911120137s" podCreationTimestamp="2025-12-03 11:18:50 +0000 UTC" firstStartedPulling="2025-12-03 11:18:51.76842784 +0000 UTC m=+915.604356304" lastFinishedPulling="2025-12-03 11:18:56.590943858 +0000 UTC m=+920.426872322" observedRunningTime="2025-12-03 11:18:57.905670922 +0000 UTC m=+921.741599386" watchObservedRunningTime="2025-12-03 11:18:57.911120137 +0000 UTC m=+921.747048601"
Dec 03 11:18:57 crc kubenswrapper[4702]: I1203 11:18:57.933994 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=3.794379627 podStartE2EDuration="7.933971984s" podCreationTimestamp="2025-12-03 11:18:50 +0000 UTC" firstStartedPulling="2025-12-03 11:18:52.488310957 +0000 UTC m=+916.324239421" lastFinishedPulling="2025-12-03 11:18:56.627903314 +0000 UTC m=+920.463831778" observedRunningTime="2025-12-03 11:18:57.931426112 +0000 UTC m=+921.767354576" watchObservedRunningTime="2025-12-03 11:18:57.933971984 +0000 UTC m=+921.769900448"
Dec 03 11:18:57 crc kubenswrapper[4702]: I1203 11:18:57.957373 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=3.857393652 podStartE2EDuration="7.957349866s" podCreationTimestamp="2025-12-03 11:18:50 +0000 UTC" firstStartedPulling="2025-12-03 11:18:52.538398206 +0000 UTC m=+916.374326670" lastFinishedPulling="2025-12-03 11:18:56.63835442 +0000 UTC m=+920.474282884" observedRunningTime="2025-12-03 11:18:57.95219533 +0000 UTC m=+921.788123824" watchObservedRunningTime="2025-12-03 11:18:57.957349866 +0000 UTC m=+921.793278330"
Dec 03 11:18:57 crc kubenswrapper[4702]: I1203 11:18:57.975195 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=4.225684843 podStartE2EDuration="7.975171181s" podCreationTimestamp="2025-12-03 11:18:50 +0000 UTC" firstStartedPulling="2025-12-03 11:18:52.890108347 +0000 UTC m=+916.726036811" lastFinishedPulling="2025-12-03 11:18:56.639594685 +0000 UTC m=+920.475523149" observedRunningTime="2025-12-03 11:18:57.967815512 +0000 UTC m=+921.803743976" watchObservedRunningTime="2025-12-03 11:18:57.975171181 +0000 UTC m=+921.811099645"
Dec 03 11:18:57 crc kubenswrapper[4702]: I1203 11:18:57.999154 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" podStartSLOduration=3.470459373 podStartE2EDuration="8.999127279s" podCreationTimestamp="2025-12-03 11:18:49 +0000 UTC" firstStartedPulling="2025-12-03 11:18:51.105445414 +0000 UTC m=+914.941373878" lastFinishedPulling="2025-12-03 11:18:56.63411332 +0000 UTC m=+920.470041784" observedRunningTime="2025-12-03 11:18:57.986414889 +0000 UTC m=+921.822343373" watchObservedRunningTime="2025-12-03 11:18:57.999127279 +0000 UTC m=+921.835055743"
Dec 03 11:19:00 crc kubenswrapper[4702]: I1203 11:19:00.889241 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" event={"ID":"72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653","Type":"ContainerStarted","Data":"16a95a7503071b01b66661ba4ab5c187641574d20537c7afbee31dbf42adc44c"}
Dec 03 11:19:00 crc kubenswrapper[4702]: I1203 11:19:00.889997 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz"
Dec 03 11:19:00 crc kubenswrapper[4702]: I1203 11:19:00.890018 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz"
Dec 03 11:19:00 crc kubenswrapper[4702]: I1203 11:19:00.892112 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" event={"ID":"6cbbba51-9166-42cb-917c-7c634351e5c9","Type":"ContainerStarted","Data":"44997ba0add3a8d9abf18d1ad6aee9e34b6453ddc893781a913608c55b53d01a"}
Dec 03 11:19:00 crc kubenswrapper[4702]: I1203 11:19:00.892395 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn"
Dec 03 11:19:00 crc kubenswrapper[4702]: I1203 11:19:00.900010 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn"
Dec 03 11:19:00 crc kubenswrapper[4702]: I1203 11:19:00.901474 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz"
Dec 03 11:19:00 crc kubenswrapper[4702]: I1203 11:19:00.902203 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz"
Dec 03 11:19:00 crc kubenswrapper[4702]: I1203 11:19:00.915802 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" podStartSLOduration=3.307891749 podStartE2EDuration="10.915778091s" podCreationTimestamp="2025-12-03 11:18:50 +0000 UTC" firstStartedPulling="2025-12-03 11:18:52.105486116 +0000 UTC m=+915.941414580" lastFinishedPulling="2025-12-03 11:18:59.713372458 +0000 UTC m=+923.549300922" observedRunningTime="2025-12-03 11:19:00.914255348 +0000 UTC m=+924.750183812" watchObservedRunningTime="2025-12-03 11:19:00.915778091 +0000 UTC m=+924.751706555"
Dec 03 11:19:00 crc kubenswrapper[4702]: I1203 11:19:00.957176 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" podStartSLOduration=3.427517818 podStartE2EDuration="10.957148293s" podCreationTimestamp="2025-12-03 11:18:50 +0000 UTC" firstStartedPulling="2025-12-03 11:18:52.188690293 +0000 UTC m=+916.024618757" lastFinishedPulling="2025-12-03 11:18:59.718320768 +0000 UTC m=+923.554249232" observedRunningTime="2025-12-03 11:19:00.948898569 +0000 UTC m=+924.784827033" watchObservedRunningTime="2025-12-03 11:19:00.957148293 +0000 UTC m=+924.793076757"
Dec 03 11:19:00 crc kubenswrapper[4702]: I1203 11:19:00.998425 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn"
Dec 03 11:19:01 crc kubenswrapper[4702]: I1203 11:19:01.028181 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn"
Dec 03 11:19:11 crc kubenswrapper[4702]: I1203 11:19:11.853013 4702 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens
Dec 03 11:19:11 crc kubenswrapper[4702]: I1203 11:19:11.853671 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="6ce47478-68cc-46a9-99c3-cb20947e63c5" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Dec 03 11:19:12 crc kubenswrapper[4702]: I1203 11:19:12.076165 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0"
Dec 03 11:19:12 crc kubenswrapper[4702]: I1203 11:19:12.090849 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0"
Dec 03 11:19:20 crc kubenswrapper[4702]: I1203 11:19:20.343114 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2"
Dec 03 11:19:20 crc kubenswrapper[4702]: I1203 11:19:20.582558 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp"
Dec 03 11:19:20 crc kubenswrapper[4702]: I1203 11:19:20.853929 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw"
Dec 03 11:19:21 crc kubenswrapper[4702]: I1203 11:19:21.848494 4702 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens
Dec 03 11:19:21 crc kubenswrapper[4702]: I1203 11:19:21.848928 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="6ce47478-68cc-46a9-99c3-cb20947e63c5" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Dec 03 11:19:31 crc kubenswrapper[4702]: I1203 11:19:31.853184 4702 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready
Dec 03 11:19:31 crc kubenswrapper[4702]: I1203 11:19:31.854115 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="6ce47478-68cc-46a9-99c3-cb20947e63c5" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Dec 03 11:19:41 crc kubenswrapper[4702]: I1203 11:19:41.848545 4702 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready
Dec 03 11:19:41 crc kubenswrapper[4702]: I1203 11:19:41.850340 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="6ce47478-68cc-46a9-99c3-cb20947e63c5" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Dec 03 11:19:51 crc kubenswrapper[4702]: I1203 11:19:51.848828 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.743992 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-swqcd"]
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.745879 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.748057 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.748720 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.748941 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.749193 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.751459 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-56sjd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.761275 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-swqcd"]
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.765014 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.809242 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/aad522f1-299e-4329-98e7-a2cc664478a8-metrics\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.809332 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xj4j\" (UniqueName: \"kubernetes.io/projected/aad522f1-299e-4329-98e7-a2cc664478a8-kube-api-access-9xj4j\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.809364 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/aad522f1-299e-4329-98e7-a2cc664478a8-collector-token\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.809449 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/aad522f1-299e-4329-98e7-a2cc664478a8-sa-token\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.809495 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/aad522f1-299e-4329-98e7-a2cc664478a8-datadir\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.809569 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-trusted-ca\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.809601 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aad522f1-299e-4329-98e7-a2cc664478a8-tmp\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.809655 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-entrypoint\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.809681 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/aad522f1-299e-4329-98e7-a2cc664478a8-collector-syslog-receiver\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.809731 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-config-openshift-service-cacrt\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.809779 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-config\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.911405 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aad522f1-299e-4329-98e7-a2cc664478a8-tmp\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.911509 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-entrypoint\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.911547 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/aad522f1-299e-4329-98e7-a2cc664478a8-collector-syslog-receiver\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.911588 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-config-openshift-service-cacrt\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.911621 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-config\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.911647 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/aad522f1-299e-4329-98e7-a2cc664478a8-metrics\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.911696 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xj4j\" (UniqueName: \"kubernetes.io/projected/aad522f1-299e-4329-98e7-a2cc664478a8-kube-api-access-9xj4j\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.911724 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/aad522f1-299e-4329-98e7-a2cc664478a8-collector-token\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.911877 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/aad522f1-299e-4329-98e7-a2cc664478a8-sa-token\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.911908 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/aad522f1-299e-4329-98e7-a2cc664478a8-datadir\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.911981 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-trusted-ca\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: E1203 11:20:10.913132 4702 secret.go:188] Couldn't get secret openshift-logging/collector-metrics: secret "collector-metrics" not found
Dec 03 11:20:10 crc kubenswrapper[4702]: E1203 11:20:10.913207 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aad522f1-299e-4329-98e7-a2cc664478a8-metrics podName:aad522f1-299e-4329-98e7-a2cc664478a8 nodeName:}" failed. No retries permitted until 2025-12-03 11:20:11.413179242 +0000 UTC m=+995.249107706 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics" (UniqueName: "kubernetes.io/secret/aad522f1-299e-4329-98e7-a2cc664478a8-metrics") pod "collector-swqcd" (UID: "aad522f1-299e-4329-98e7-a2cc664478a8") : secret "collector-metrics" not found
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.913790 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-trusted-ca\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.914013 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-entrypoint\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.915240 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-config-openshift-service-cacrt\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.915480 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/aad522f1-299e-4329-98e7-a2cc664478a8-datadir\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.916248 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-config\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.921301 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/aad522f1-299e-4329-98e7-a2cc664478a8-collector-token\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.921332 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/aad522f1-299e-4329-98e7-a2cc664478a8-collector-syslog-receiver\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.928163 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aad522f1-299e-4329-98e7-a2cc664478a8-tmp\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.941611 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/aad522f1-299e-4329-98e7-a2cc664478a8-sa-token\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.944475 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xj4j\" (UniqueName: \"kubernetes.io/projected/aad522f1-299e-4329-98e7-a2cc664478a8-kube-api-access-9xj4j\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:10 crc kubenswrapper[4702]: I1203 11:20:10.962255 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-swqcd"]
Dec 03 11:20:10 crc kubenswrapper[4702]: E1203 11:20:10.962973 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-swqcd" podUID="aad522f1-299e-4329-98e7-a2cc664478a8"
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.420919 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/aad522f1-299e-4329-98e7-a2cc664478a8-metrics\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.429482 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/aad522f1-299e-4329-98e7-a2cc664478a8-metrics\") pod \"collector-swqcd\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") " pod="openshift-logging/collector-swqcd"
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.476397 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-swqcd"
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.488145 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-swqcd"
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.624538 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xj4j\" (UniqueName: \"kubernetes.io/projected/aad522f1-299e-4329-98e7-a2cc664478a8-kube-api-access-9xj4j\") pod \"aad522f1-299e-4329-98e7-a2cc664478a8\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") "
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.624948 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-entrypoint\") pod \"aad522f1-299e-4329-98e7-a2cc664478a8\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") "
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.625132 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aad522f1-299e-4329-98e7-a2cc664478a8-tmp\") pod \"aad522f1-299e-4329-98e7-a2cc664478a8\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") "
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.625263 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/aad522f1-299e-4329-98e7-a2cc664478a8-datadir\") pod \"aad522f1-299e-4329-98e7-a2cc664478a8\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") "
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.625374 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/aad522f1-299e-4329-98e7-a2cc664478a8-sa-token\") pod \"aad522f1-299e-4329-98e7-a2cc664478a8\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") "
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.625457 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aad522f1-299e-4329-98e7-a2cc664478a8-datadir" (OuterVolumeSpecName: "datadir") pod "aad522f1-299e-4329-98e7-a2cc664478a8" (UID: "aad522f1-299e-4329-98e7-a2cc664478a8"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.625505 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "aad522f1-299e-4329-98e7-a2cc664478a8" (UID: "aad522f1-299e-4329-98e7-a2cc664478a8"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.625623 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/aad522f1-299e-4329-98e7-a2cc664478a8-metrics\") pod \"aad522f1-299e-4329-98e7-a2cc664478a8\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") "
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.625738 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/aad522f1-299e-4329-98e7-a2cc664478a8-collector-token\") pod \"aad522f1-299e-4329-98e7-a2cc664478a8\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") "
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.625867 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-trusted-ca\") pod \"aad522f1-299e-4329-98e7-a2cc664478a8\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") "
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.625982 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-config-openshift-service-cacrt\") pod \"aad522f1-299e-4329-98e7-a2cc664478a8\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") "
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.626098 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-config\") pod \"aad522f1-299e-4329-98e7-a2cc664478a8\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") "
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.626243 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/aad522f1-299e-4329-98e7-a2cc664478a8-collector-syslog-receiver\") pod \"aad522f1-299e-4329-98e7-a2cc664478a8\" (UID: \"aad522f1-299e-4329-98e7-a2cc664478a8\") "
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.626635 4702 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/aad522f1-299e-4329-98e7-a2cc664478a8-datadir\") on node \"crc\" DevicePath \"\""
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.626736 4702 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-entrypoint\") on node \"crc\" DevicePath \"\""
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.629109 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "aad522f1-299e-4329-98e7-a2cc664478a8" (UID: "aad522f1-299e-4329-98e7-a2cc664478a8"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.629451 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "aad522f1-299e-4329-98e7-a2cc664478a8" (UID: "aad522f1-299e-4329-98e7-a2cc664478a8"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.629744 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-config" (OuterVolumeSpecName: "config") pod "aad522f1-299e-4329-98e7-a2cc664478a8" (UID: "aad522f1-299e-4329-98e7-a2cc664478a8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.632867 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aad522f1-299e-4329-98e7-a2cc664478a8-sa-token" (OuterVolumeSpecName: "sa-token") pod "aad522f1-299e-4329-98e7-a2cc664478a8" (UID: "aad522f1-299e-4329-98e7-a2cc664478a8"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.634873 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aad522f1-299e-4329-98e7-a2cc664478a8-collector-token" (OuterVolumeSpecName: "collector-token") pod "aad522f1-299e-4329-98e7-a2cc664478a8" (UID: "aad522f1-299e-4329-98e7-a2cc664478a8"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.634905 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aad522f1-299e-4329-98e7-a2cc664478a8-metrics" (OuterVolumeSpecName: "metrics") pod "aad522f1-299e-4329-98e7-a2cc664478a8" (UID: "aad522f1-299e-4329-98e7-a2cc664478a8"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.634954 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aad522f1-299e-4329-98e7-a2cc664478a8-tmp" (OuterVolumeSpecName: "tmp") pod "aad522f1-299e-4329-98e7-a2cc664478a8" (UID: "aad522f1-299e-4329-98e7-a2cc664478a8"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.634995 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aad522f1-299e-4329-98e7-a2cc664478a8-kube-api-access-9xj4j" (OuterVolumeSpecName: "kube-api-access-9xj4j") pod "aad522f1-299e-4329-98e7-a2cc664478a8" (UID: "aad522f1-299e-4329-98e7-a2cc664478a8"). InnerVolumeSpecName "kube-api-access-9xj4j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.635229 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aad522f1-299e-4329-98e7-a2cc664478a8-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "aad522f1-299e-4329-98e7-a2cc664478a8" (UID: "aad522f1-299e-4329-98e7-a2cc664478a8"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.728058 4702 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/aad522f1-299e-4329-98e7-a2cc664478a8-sa-token\") on node \"crc\" DevicePath \"\""
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.728096 4702 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/aad522f1-299e-4329-98e7-a2cc664478a8-metrics\") on node \"crc\" DevicePath \"\""
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.728105 4702 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/aad522f1-299e-4329-98e7-a2cc664478a8-collector-token\") on node \"crc\" DevicePath \"\""
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.728115 4702 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.728126 4702 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\""
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.728137 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aad522f1-299e-4329-98e7-a2cc664478a8-config\") on node \"crc\" DevicePath \"\""
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.728146 4702 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/aad522f1-299e-4329-98e7-a2cc664478a8-collector-syslog-receiver\") on node \"crc\" DevicePath \"\""
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.728156 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xj4j\" (UniqueName: \"kubernetes.io/projected/aad522f1-299e-4329-98e7-a2cc664478a8-kube-api-access-9xj4j\") on node \"crc\" DevicePath \"\""
Dec 03 11:20:11 crc kubenswrapper[4702]: I1203 11:20:11.728165 4702 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aad522f1-299e-4329-98e7-a2cc664478a8-tmp\") on node \"crc\" DevicePath \"\""
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.483710 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-swqcd"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.535489 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-swqcd"]
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.541075 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-swqcd"]
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.554789 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-nlphm"]
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.556095 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.557827 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-56sjd"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.558575 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.559307 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.560494 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.561950 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.567688 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.581068 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-nlphm"]
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.742771 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc184786-2931-4cb8-a185-b4f1fb2bcb40-trusted-ca\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.742823 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/cc184786-2931-4cb8-a185-b4f1fb2bcb40-metrics\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.742916 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc184786-2931-4cb8-a185-b4f1fb2bcb40-config\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.742958 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/cc184786-2931-4cb8-a185-b4f1fb2bcb40-sa-token\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.742982 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/cc184786-2931-4cb8-a185-b4f1fb2bcb40-collector-token\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.742999 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/cc184786-2931-4cb8-a185-b4f1fb2bcb40-datadir\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.743062 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/cc184786-2931-4cb8-a185-b4f1fb2bcb40-entrypoint\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.743182 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z6tg\" (UniqueName: \"kubernetes.io/projected/cc184786-2931-4cb8-a185-b4f1fb2bcb40-kube-api-access-6z6tg\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.743208 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/cc184786-2931-4cb8-a185-b4f1fb2bcb40-collector-syslog-receiver\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.743253 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/cc184786-2931-4cb8-a185-b4f1fb2bcb40-config-openshift-service-cacrt\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.743273 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cc184786-2931-4cb8-a185-b4f1fb2bcb40-tmp\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.844691 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/cc184786-2931-4cb8-a185-b4f1fb2bcb40-entrypoint\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.844825 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z6tg\" (UniqueName: \"kubernetes.io/projected/cc184786-2931-4cb8-a185-b4f1fb2bcb40-kube-api-access-6z6tg\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.844855 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/cc184786-2931-4cb8-a185-b4f1fb2bcb40-collector-syslog-receiver\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.844895 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/cc184786-2931-4cb8-a185-b4f1fb2bcb40-config-openshift-service-cacrt\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.844919 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cc184786-2931-4cb8-a185-b4f1fb2bcb40-tmp\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.844973 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc184786-2931-4cb8-a185-b4f1fb2bcb40-trusted-ca\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.844994 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/cc184786-2931-4cb8-a185-b4f1fb2bcb40-metrics\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.845027 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc184786-2931-4cb8-a185-b4f1fb2bcb40-config\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.845049 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/cc184786-2931-4cb8-a185-b4f1fb2bcb40-sa-token\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.845072 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/cc184786-2931-4cb8-a185-b4f1fb2bcb40-collector-token\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.845098 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/cc184786-2931-4cb8-a185-b4f1fb2bcb40-datadir\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.845176 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/cc184786-2931-4cb8-a185-b4f1fb2bcb40-datadir\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.845737 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/cc184786-2931-4cb8-a185-b4f1fb2bcb40-entrypoint\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.845788 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/cc184786-2931-4cb8-a185-b4f1fb2bcb40-config-openshift-service-cacrt\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.846404 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc184786-2931-4cb8-a185-b4f1fb2bcb40-config\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.846565 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc184786-2931-4cb8-a185-b4f1fb2bcb40-trusted-ca\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.848672 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/cc184786-2931-4cb8-a185-b4f1fb2bcb40-collector-syslog-receiver\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.848880 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/cc184786-2931-4cb8-a185-b4f1fb2bcb40-collector-token\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.849746 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/cc184786-2931-4cb8-a185-b4f1fb2bcb40-metrics\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.860209 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cc184786-2931-4cb8-a185-b4f1fb2bcb40-tmp\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.863445 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/cc184786-2931-4cb8-a185-b4f1fb2bcb40-sa-token\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " pod="openshift-logging/collector-nlphm"
Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.863642 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z6tg\" (UniqueName: \"kubernetes.io/projected/cc184786-2931-4cb8-a185-b4f1fb2bcb40-kube-api-access-6z6tg\") pod \"collector-nlphm\" (UID: \"cc184786-2931-4cb8-a185-b4f1fb2bcb40\") " 
pod="openshift-logging/collector-nlphm" Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.876871 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-nlphm" Dec 03 11:20:12 crc kubenswrapper[4702]: I1203 11:20:12.944650 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aad522f1-299e-4329-98e7-a2cc664478a8" path="/var/lib/kubelet/pods/aad522f1-299e-4329-98e7-a2cc664478a8/volumes" Dec 03 11:20:13 crc kubenswrapper[4702]: I1203 11:20:13.449827 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-nlphm"] Dec 03 11:20:13 crc kubenswrapper[4702]: I1203 11:20:13.494046 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-nlphm" event={"ID":"cc184786-2931-4cb8-a185-b4f1fb2bcb40","Type":"ContainerStarted","Data":"dd9039fe633afedcfb0505fe2540321c8c3eab28c7f2676e7d94ff8a3a78ee44"} Dec 03 11:20:24 crc kubenswrapper[4702]: I1203 11:20:24.669937 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-nlphm" event={"ID":"cc184786-2931-4cb8-a185-b4f1fb2bcb40","Type":"ContainerStarted","Data":"12baff095f5aeb44c0666d3e5eb6d3defc50389c35667671f11de966e76cd851"} Dec 03 11:20:24 crc kubenswrapper[4702]: I1203 11:20:24.692489 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-nlphm" podStartSLOduration=1.901114224 podStartE2EDuration="12.692462515s" podCreationTimestamp="2025-12-03 11:20:12 +0000 UTC" firstStartedPulling="2025-12-03 11:20:13.463073244 +0000 UTC m=+997.299001708" lastFinishedPulling="2025-12-03 11:20:24.254421535 +0000 UTC m=+1008.090349999" observedRunningTime="2025-12-03 11:20:24.68873724 +0000 UTC m=+1008.524665724" watchObservedRunningTime="2025-12-03 11:20:24.692462515 +0000 UTC m=+1008.528390979" Dec 03 11:20:25 crc kubenswrapper[4702]: I1203 11:20:25.908781 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:20:25 crc kubenswrapper[4702]: I1203 11:20:25.908878 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:20:51 crc kubenswrapper[4702]: I1203 11:20:51.156178 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d"] Dec 03 11:20:51 crc kubenswrapper[4702]: I1203 11:20:51.159045 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d" Dec 03 11:20:51 crc kubenswrapper[4702]: I1203 11:20:51.163772 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 11:20:51 crc kubenswrapper[4702]: I1203 11:20:51.174217 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d"] Dec 03 11:20:51 crc kubenswrapper[4702]: I1203 11:20:51.188074 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw5cf\" (UniqueName: \"kubernetes.io/projected/2a038246-a8c1-4a5d-8ef4-250eaf126ace-kube-api-access-pw5cf\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d\" (UID: \"2a038246-a8c1-4a5d-8ef4-250eaf126ace\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d" Dec 03 11:20:51 crc kubenswrapper[4702]: I1203 11:20:51.188147 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a038246-a8c1-4a5d-8ef4-250eaf126ace-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d\" (UID: \"2a038246-a8c1-4a5d-8ef4-250eaf126ace\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d" Dec 03 11:20:51 crc kubenswrapper[4702]: I1203 11:20:51.188215 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a038246-a8c1-4a5d-8ef4-250eaf126ace-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d\" (UID: \"2a038246-a8c1-4a5d-8ef4-250eaf126ace\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d" Dec 03 11:20:51 crc kubenswrapper[4702]: I1203 11:20:51.289712 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a038246-a8c1-4a5d-8ef4-250eaf126ace-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d\" (UID: \"2a038246-a8c1-4a5d-8ef4-250eaf126ace\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d" Dec 03 11:20:51 crc kubenswrapper[4702]: I1203 11:20:51.289930 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw5cf\" (UniqueName: \"kubernetes.io/projected/2a038246-a8c1-4a5d-8ef4-250eaf126ace-kube-api-access-pw5cf\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d\" (UID: \"2a038246-a8c1-4a5d-8ef4-250eaf126ace\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d" Dec 03 11:20:51 crc kubenswrapper[4702]: I1203 11:20:51.289982 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a038246-a8c1-4a5d-8ef4-250eaf126ace-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d\" (UID: \"2a038246-a8c1-4a5d-8ef4-250eaf126ace\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d" Dec 03 11:20:51 crc kubenswrapper[4702]: I1203 11:20:51.290361 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/2a038246-a8c1-4a5d-8ef4-250eaf126ace-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d\" (UID: \"2a038246-a8c1-4a5d-8ef4-250eaf126ace\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d" Dec 03 11:20:51 crc kubenswrapper[4702]: I1203 11:20:51.290565 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a038246-a8c1-4a5d-8ef4-250eaf126ace-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d\" (UID: \"2a038246-a8c1-4a5d-8ef4-250eaf126ace\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d" Dec 03 11:20:51 crc kubenswrapper[4702]: I1203 11:20:51.309614 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw5cf\" (UniqueName: \"kubernetes.io/projected/2a038246-a8c1-4a5d-8ef4-250eaf126ace-kube-api-access-pw5cf\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d\" (UID: \"2a038246-a8c1-4a5d-8ef4-250eaf126ace\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d" Dec 03 11:20:51 crc kubenswrapper[4702]: I1203 11:20:51.483350 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d" Dec 03 11:20:51 crc kubenswrapper[4702]: I1203 11:20:51.964941 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d"] Dec 03 11:20:52 crc kubenswrapper[4702]: I1203 11:20:52.892518 4702 generic.go:334] "Generic (PLEG): container finished" podID="2a038246-a8c1-4a5d-8ef4-250eaf126ace" containerID="3a687b455863371532462f481aa207b9da3623d96bcdfbd2b1428d9ba6e4dad2" exitCode=0 Dec 03 11:20:52 crc kubenswrapper[4702]: I1203 11:20:52.892604 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d" event={"ID":"2a038246-a8c1-4a5d-8ef4-250eaf126ace","Type":"ContainerDied","Data":"3a687b455863371532462f481aa207b9da3623d96bcdfbd2b1428d9ba6e4dad2"} Dec 03 11:20:52 crc kubenswrapper[4702]: I1203 11:20:52.892943 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d" event={"ID":"2a038246-a8c1-4a5d-8ef4-250eaf126ace","Type":"ContainerStarted","Data":"e9a8774ce4b148883a2e377aff3ed0df48df18f12b3056ab59c9508250861fc3"} Dec 03 11:20:52 crc kubenswrapper[4702]: I1203 11:20:52.894876 4702 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 11:20:55 crc kubenswrapper[4702]: I1203 11:20:55.908264 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:20:55 crc kubenswrapper[4702]: I1203 11:20:55.908716 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:20:55 
Dec 03 11:20:55 crc kubenswrapper[4702]: I1203 11:20:55.930523 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d" event={"ID":"2a038246-a8c1-4a5d-8ef4-250eaf126ace","Type":"ContainerStarted","Data":"bd69486b6b4257b3e823c8948b76a23d04c77d057b58ed7bd858aa3752b0cc16"}
Dec 03 11:20:56 crc kubenswrapper[4702]: I1203 11:20:56.938642 4702 generic.go:334] "Generic (PLEG): container finished" podID="2a038246-a8c1-4a5d-8ef4-250eaf126ace" containerID="bd69486b6b4257b3e823c8948b76a23d04c77d057b58ed7bd858aa3752b0cc16" exitCode=0
Dec 03 11:20:56 crc kubenswrapper[4702]: I1203 11:20:56.938690 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d" event={"ID":"2a038246-a8c1-4a5d-8ef4-250eaf126ace","Type":"ContainerDied","Data":"bd69486b6b4257b3e823c8948b76a23d04c77d057b58ed7bd858aa3752b0cc16"}
Dec 03 11:20:57 crc kubenswrapper[4702]: I1203 11:20:57.948981 4702 generic.go:334] "Generic (PLEG): container finished" podID="2a038246-a8c1-4a5d-8ef4-250eaf126ace" containerID="cfa17744a05bd248dad4c08ab83fab8a0aab6fd19a5fc01340f8d23224b50f7f" exitCode=0
Dec 03 11:20:57 crc kubenswrapper[4702]: I1203 11:20:57.949095 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d" event={"ID":"2a038246-a8c1-4a5d-8ef4-250eaf126ace","Type":"ContainerDied","Data":"cfa17744a05bd248dad4c08ab83fab8a0aab6fd19a5fc01340f8d23224b50f7f"}
Dec 03 11:20:59 crc kubenswrapper[4702]: I1203 11:20:59.251329 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d"
Dec 03 11:20:59 crc kubenswrapper[4702]: I1203 11:20:59.341460 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a038246-a8c1-4a5d-8ef4-250eaf126ace-bundle\") pod \"2a038246-a8c1-4a5d-8ef4-250eaf126ace\" (UID: \"2a038246-a8c1-4a5d-8ef4-250eaf126ace\") "
Dec 03 11:20:59 crc kubenswrapper[4702]: I1203 11:20:59.341558 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a038246-a8c1-4a5d-8ef4-250eaf126ace-util\") pod \"2a038246-a8c1-4a5d-8ef4-250eaf126ace\" (UID: \"2a038246-a8c1-4a5d-8ef4-250eaf126ace\") "
Dec 03 11:20:59 crc kubenswrapper[4702]: I1203 11:20:59.341607 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw5cf\" (UniqueName: \"kubernetes.io/projected/2a038246-a8c1-4a5d-8ef4-250eaf126ace-kube-api-access-pw5cf\") pod \"2a038246-a8c1-4a5d-8ef4-250eaf126ace\" (UID: \"2a038246-a8c1-4a5d-8ef4-250eaf126ace\") "
Dec 03 11:20:59 crc kubenswrapper[4702]: I1203 11:20:59.342341 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a038246-a8c1-4a5d-8ef4-250eaf126ace-bundle" (OuterVolumeSpecName: "bundle") pod "2a038246-a8c1-4a5d-8ef4-250eaf126ace" (UID: "2a038246-a8c1-4a5d-8ef4-250eaf126ace"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:20:59 crc kubenswrapper[4702]: I1203 11:20:59.348356 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a038246-a8c1-4a5d-8ef4-250eaf126ace-kube-api-access-pw5cf" (OuterVolumeSpecName: "kube-api-access-pw5cf") pod "2a038246-a8c1-4a5d-8ef4-250eaf126ace" (UID: "2a038246-a8c1-4a5d-8ef4-250eaf126ace"). InnerVolumeSpecName "kube-api-access-pw5cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:20:59 crc kubenswrapper[4702]: I1203 11:20:59.352978 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a038246-a8c1-4a5d-8ef4-250eaf126ace-util" (OuterVolumeSpecName: "util") pod "2a038246-a8c1-4a5d-8ef4-250eaf126ace" (UID: "2a038246-a8c1-4a5d-8ef4-250eaf126ace"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:20:59 crc kubenswrapper[4702]: I1203 11:20:59.444170 4702 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2a038246-a8c1-4a5d-8ef4-250eaf126ace-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:20:59 crc kubenswrapper[4702]: I1203 11:20:59.444208 4702 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2a038246-a8c1-4a5d-8ef4-250eaf126ace-util\") on node \"crc\" DevicePath \"\"" Dec 03 11:20:59 crc kubenswrapper[4702]: I1203 11:20:59.444222 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw5cf\" (UniqueName: \"kubernetes.io/projected/2a038246-a8c1-4a5d-8ef4-250eaf126ace-kube-api-access-pw5cf\") on node \"crc\" DevicePath \"\"" Dec 03 11:20:59 crc kubenswrapper[4702]: I1203 11:20:59.970031 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d" event={"ID":"2a038246-a8c1-4a5d-8ef4-250eaf126ace","Type":"ContainerDied","Data":"e9a8774ce4b148883a2e377aff3ed0df48df18f12b3056ab59c9508250861fc3"} Dec 03 11:20:59 crc kubenswrapper[4702]: I1203 11:20:59.970079 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9a8774ce4b148883a2e377aff3ed0df48df18f12b3056ab59c9508250861fc3" Dec 03 11:20:59 crc kubenswrapper[4702]: I1203 11:20:59.970130 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d" Dec 03 11:21:00 crc kubenswrapper[4702]: E1203 11:21:00.120326 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a038246_a8c1_4a5d_8ef4_250eaf126ace.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a038246_a8c1_4a5d_8ef4_250eaf126ace.slice/crio-e9a8774ce4b148883a2e377aff3ed0df48df18f12b3056ab59c9508250861fc3\": RecentStats: unable to find data in memory cache]" Dec 03 11:21:03 crc kubenswrapper[4702]: I1203 11:21:03.186141 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-pc794"] Dec 03 11:21:03 crc kubenswrapper[4702]: E1203 11:21:03.187312 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a038246-a8c1-4a5d-8ef4-250eaf126ace" containerName="util" Dec 03 11:21:03 crc kubenswrapper[4702]: I1203 11:21:03.187327 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a038246-a8c1-4a5d-8ef4-250eaf126ace" containerName="util" Dec 03 11:21:03 crc kubenswrapper[4702]: E1203 11:21:03.187342 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a038246-a8c1-4a5d-8ef4-250eaf126ace" containerName="extract" Dec 03 11:21:03 crc kubenswrapper[4702]: I1203 11:21:03.187351 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a038246-a8c1-4a5d-8ef4-250eaf126ace" containerName="extract" Dec 03 11:21:03 crc kubenswrapper[4702]: E1203 11:21:03.187370 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a038246-a8c1-4a5d-8ef4-250eaf126ace" containerName="pull" Dec 03 11:21:03 crc kubenswrapper[4702]: I1203 11:21:03.187380 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a038246-a8c1-4a5d-8ef4-250eaf126ace" containerName="pull" Dec 03 11:21:03 crc kubenswrapper[4702]: I1203 11:21:03.187536 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a038246-a8c1-4a5d-8ef4-250eaf126ace" containerName="extract" Dec 03 11:21:03 crc kubenswrapper[4702]: I1203 11:21:03.188452 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pc794" Dec 03 11:21:03 crc kubenswrapper[4702]: I1203 11:21:03.192262 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-xdkqt" Dec 03 11:21:03 crc kubenswrapper[4702]: I1203 11:21:03.192589 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 03 11:21:03 crc kubenswrapper[4702]: I1203 11:21:03.193138 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 03 11:21:03 crc kubenswrapper[4702]: I1203 11:21:03.195704 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-pc794"] Dec 03 11:21:03 crc kubenswrapper[4702]: I1203 11:21:03.325340 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xsh5\" (UniqueName: \"kubernetes.io/projected/d1a5b826-9b8e-4400-8e6e-06824af9bd4c-kube-api-access-9xsh5\") pod \"nmstate-operator-5b5b58f5c8-pc794\" (UID: \"d1a5b826-9b8e-4400-8e6e-06824af9bd4c\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pc794" Dec 03 11:21:03 crc kubenswrapper[4702]: I1203 11:21:03.429215 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xsh5\" (UniqueName: \"kubernetes.io/projected/d1a5b826-9b8e-4400-8e6e-06824af9bd4c-kube-api-access-9xsh5\") pod \"nmstate-operator-5b5b58f5c8-pc794\" (UID: \"d1a5b826-9b8e-4400-8e6e-06824af9bd4c\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pc794" Dec 03 11:21:03 crc kubenswrapper[4702]: I1203 11:21:03.465685 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xsh5\" (UniqueName: \"kubernetes.io/projected/d1a5b826-9b8e-4400-8e6e-06824af9bd4c-kube-api-access-9xsh5\") pod \"nmstate-operator-5b5b58f5c8-pc794\" (UID: \"d1a5b826-9b8e-4400-8e6e-06824af9bd4c\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pc794" Dec 03 11:21:03 crc kubenswrapper[4702]: I1203 11:21:03.523196 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pc794" Dec 03 11:21:04 crc kubenswrapper[4702]: I1203 11:21:04.016850 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-pc794"] Dec 03 11:21:05 crc kubenswrapper[4702]: I1203 11:21:05.010026 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pc794" event={"ID":"d1a5b826-9b8e-4400-8e6e-06824af9bd4c","Type":"ContainerStarted","Data":"80fdfa5feb1fd12ec3aa051219029016f3c9610cd0f86992c5b744acb1a8f0e8"} Dec 03 11:21:07 crc kubenswrapper[4702]: I1203 11:21:07.030007 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pc794" event={"ID":"d1a5b826-9b8e-4400-8e6e-06824af9bd4c","Type":"ContainerStarted","Data":"e77b765281ead7e4c0b1b2c45a72b96226533ced6e18808d1d315d093bc94585"} Dec 03 11:21:07 crc kubenswrapper[4702]: I1203 11:21:07.057572 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-pc794" podStartSLOduration=1.6492947820000001 podStartE2EDuration="4.057539657s" podCreationTimestamp="2025-12-03 11:21:03 +0000 UTC" firstStartedPulling="2025-12-03 11:21:04.024719577 +0000 UTC m=+1047.860648081" lastFinishedPulling="2025-12-03 11:21:06.432964492 +0000 UTC m=+1050.268892956" observedRunningTime="2025-12-03 11:21:07.056790666 +0000 UTC m=+1050.892719150" watchObservedRunningTime="2025-12-03 11:21:07.057539657 +0000 UTC m=+1050.893468121" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.355997 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-6jgh5"] Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.358438 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6jgh5" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.365167 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-vqxwl" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.365366 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6"] Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.366545 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.369847 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.372468 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-6jgh5"] Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.379831 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-zxwqw"] Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.404421 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lw77\" (UniqueName: \"kubernetes.io/projected/cf40bd24-301e-4eb1-bbdb-84a55cd53cc9-kube-api-access-6lw77\") pod \"nmstate-metrics-7f946cbc9-6jgh5\" (UID: \"cf40bd24-301e-4eb1-bbdb-84a55cd53cc9\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6jgh5" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.404498 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9370f81f-7868-4a16-9cec-7786257cdcbd-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-r6jd6\" (UID: \"9370f81f-7868-4a16-9cec-7786257cdcbd\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.404832 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twnt5\" (UniqueName: \"kubernetes.io/projected/9370f81f-7868-4a16-9cec-7786257cdcbd-kube-api-access-twnt5\") pod \"nmstate-webhook-5f6d4c5ccb-r6jd6\" (UID: \"9370f81f-7868-4a16-9cec-7786257cdcbd\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.409713 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6"] Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.409877 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-zxwqw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.506990 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6htm\" (UniqueName: \"kubernetes.io/projected/9ec2138c-eb31-401f-b62d-d2823fe0523f-kube-api-access-h6htm\") pod \"nmstate-handler-zxwqw\" (UID: \"9ec2138c-eb31-401f-b62d-d2823fe0523f\") " pod="openshift-nmstate/nmstate-handler-zxwqw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.507048 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9ec2138c-eb31-401f-b62d-d2823fe0523f-ovs-socket\") pod \"nmstate-handler-zxwqw\" (UID: \"9ec2138c-eb31-401f-b62d-d2823fe0523f\") " pod="openshift-nmstate/nmstate-handler-zxwqw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.507079 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lw77\" (UniqueName: \"kubernetes.io/projected/cf40bd24-301e-4eb1-bbdb-84a55cd53cc9-kube-api-access-6lw77\") pod \"nmstate-metrics-7f946cbc9-6jgh5\" (UID: \"cf40bd24-301e-4eb1-bbdb-84a55cd53cc9\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6jgh5" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.507108 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9370f81f-7868-4a16-9cec-7786257cdcbd-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-r6jd6\" (UID: \"9370f81f-7868-4a16-9cec-7786257cdcbd\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6" Dec 03 11:21:12 crc kubenswrapper[4702]: E1203 11:21:12.507209 4702 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.507237 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9ec2138c-eb31-401f-b62d-d2823fe0523f-nmstate-lock\") pod \"nmstate-handler-zxwqw\" (UID: \"9ec2138c-eb31-401f-b62d-d2823fe0523f\") " pod="openshift-nmstate/nmstate-handler-zxwqw" Dec 03 11:21:12 crc kubenswrapper[4702]: E1203 11:21:12.507258 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9370f81f-7868-4a16-9cec-7786257cdcbd-tls-key-pair podName:9370f81f-7868-4a16-9cec-7786257cdcbd nodeName:}" failed. No retries permitted until 2025-12-03 11:21:13.007239985 +0000 UTC m=+1056.843168449 (durationBeforeRetry 500ms). 
Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.507322 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9ec2138c-eb31-401f-b62d-d2823fe0523f-dbus-socket\") pod \"nmstate-handler-zxwqw\" (UID: \"9ec2138c-eb31-401f-b62d-d2823fe0523f\") " pod="openshift-nmstate/nmstate-handler-zxwqw"
Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.507437 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twnt5\" (UniqueName: \"kubernetes.io/projected/9370f81f-7868-4a16-9cec-7786257cdcbd-kube-api-access-twnt5\") pod \"nmstate-webhook-5f6d4c5ccb-r6jd6\" (UID: \"9370f81f-7868-4a16-9cec-7786257cdcbd\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6"
Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.522553 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dmrtz"]
Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.524036 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dmrtz"
Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.527189 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.527748 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-bw6hj"
Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.530944 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.553623 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dmrtz"]
Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.555158 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twnt5\" (UniqueName: \"kubernetes.io/projected/9370f81f-7868-4a16-9cec-7786257cdcbd-kube-api-access-twnt5\") pod \"nmstate-webhook-5f6d4c5ccb-r6jd6\" (UID: \"9370f81f-7868-4a16-9cec-7786257cdcbd\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6"
Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.557853 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lw77\" (UniqueName: \"kubernetes.io/projected/cf40bd24-301e-4eb1-bbdb-84a55cd53cc9-kube-api-access-6lw77\") pod \"nmstate-metrics-7f946cbc9-6jgh5\" (UID: \"cf40bd24-301e-4eb1-bbdb-84a55cd53cc9\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6jgh5"
Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.608592 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n9ch\" (UniqueName: \"kubernetes.io/projected/908a9238-0a36-40c1-a7c0-c0c0789f29ae-kube-api-access-4n9ch\") pod \"nmstate-console-plugin-7fbb5f6569-dmrtz\" (UID: \"908a9238-0a36-40c1-a7c0-c0c0789f29ae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dmrtz"
Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.608649 4702 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9ec2138c-eb31-401f-b62d-d2823fe0523f-nmstate-lock\") pod \"nmstate-handler-zxwqw\" (UID: \"9ec2138c-eb31-401f-b62d-d2823fe0523f\") " pod="openshift-nmstate/nmstate-handler-zxwqw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.608713 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9ec2138c-eb31-401f-b62d-d2823fe0523f-dbus-socket\") pod \"nmstate-handler-zxwqw\" (UID: \"9ec2138c-eb31-401f-b62d-d2823fe0523f\") " pod="openshift-nmstate/nmstate-handler-zxwqw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.608749 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/908a9238-0a36-40c1-a7c0-c0c0789f29ae-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-dmrtz\" (UID: \"908a9238-0a36-40c1-a7c0-c0c0789f29ae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dmrtz" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.608778 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/9ec2138c-eb31-401f-b62d-d2823fe0523f-nmstate-lock\") pod \"nmstate-handler-zxwqw\" (UID: \"9ec2138c-eb31-401f-b62d-d2823fe0523f\") " pod="openshift-nmstate/nmstate-handler-zxwqw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.608793 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/908a9238-0a36-40c1-a7c0-c0c0789f29ae-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-dmrtz\" (UID: \"908a9238-0a36-40c1-a7c0-c0c0789f29ae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dmrtz" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.608997 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6htm\" (UniqueName: \"kubernetes.io/projected/9ec2138c-eb31-401f-b62d-d2823fe0523f-kube-api-access-h6htm\") pod \"nmstate-handler-zxwqw\" (UID: \"9ec2138c-eb31-401f-b62d-d2823fe0523f\") " pod="openshift-nmstate/nmstate-handler-zxwqw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.609052 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9ec2138c-eb31-401f-b62d-d2823fe0523f-ovs-socket\") pod \"nmstate-handler-zxwqw\" (UID: \"9ec2138c-eb31-401f-b62d-d2823fe0523f\") " pod="openshift-nmstate/nmstate-handler-zxwqw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.609064 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/9ec2138c-eb31-401f-b62d-d2823fe0523f-dbus-socket\") pod \"nmstate-handler-zxwqw\" (UID: \"9ec2138c-eb31-401f-b62d-d2823fe0523f\") " pod="openshift-nmstate/nmstate-handler-zxwqw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.609319 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/9ec2138c-eb31-401f-b62d-d2823fe0523f-ovs-socket\") pod \"nmstate-handler-zxwqw\" (UID: \"9ec2138c-eb31-401f-b62d-d2823fe0523f\") " pod="openshift-nmstate/nmstate-handler-zxwqw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.631493 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-h6htm\" (UniqueName: \"kubernetes.io/projected/9ec2138c-eb31-401f-b62d-d2823fe0523f-kube-api-access-h6htm\") pod \"nmstate-handler-zxwqw\" (UID: \"9ec2138c-eb31-401f-b62d-d2823fe0523f\") " pod="openshift-nmstate/nmstate-handler-zxwqw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.710161 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/908a9238-0a36-40c1-a7c0-c0c0789f29ae-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-dmrtz\" (UID: \"908a9238-0a36-40c1-a7c0-c0c0789f29ae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dmrtz" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.710608 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/908a9238-0a36-40c1-a7c0-c0c0789f29ae-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-dmrtz\" (UID: \"908a9238-0a36-40c1-a7c0-c0c0789f29ae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dmrtz" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.710797 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n9ch\" (UniqueName: \"kubernetes.io/projected/908a9238-0a36-40c1-a7c0-c0c0789f29ae-kube-api-access-4n9ch\") pod \"nmstate-console-plugin-7fbb5f6569-dmrtz\" (UID: \"908a9238-0a36-40c1-a7c0-c0c0789f29ae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dmrtz" Dec 03 11:21:12 crc kubenswrapper[4702]: E1203 11:21:12.710813 4702 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 03 11:21:12 crc kubenswrapper[4702]: E1203 11:21:12.710907 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/908a9238-0a36-40c1-a7c0-c0c0789f29ae-plugin-serving-cert podName:908a9238-0a36-40c1-a7c0-c0c0789f29ae nodeName:}" failed. No retries permitted until 2025-12-03 11:21:13.210884471 +0000 UTC m=+1057.046812935 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/908a9238-0a36-40c1-a7c0-c0c0789f29ae-plugin-serving-cert") pod "nmstate-console-plugin-7fbb5f6569-dmrtz" (UID: "908a9238-0a36-40c1-a7c0-c0c0789f29ae") : secret "plugin-serving-cert" not found Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.711384 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/908a9238-0a36-40c1-a7c0-c0c0789f29ae-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-dmrtz\" (UID: \"908a9238-0a36-40c1-a7c0-c0c0789f29ae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dmrtz" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.711935 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6jgh5" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.725240 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-769cf4984d-hvjnw"] Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.726499 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.755508 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-zxwqw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.763888 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-769cf4984d-hvjnw"] Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.784523 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n9ch\" (UniqueName: \"kubernetes.io/projected/908a9238-0a36-40c1-a7c0-c0c0789f29ae-kube-api-access-4n9ch\") pod \"nmstate-console-plugin-7fbb5f6569-dmrtz\" (UID: \"908a9238-0a36-40c1-a7c0-c0c0789f29ae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dmrtz" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.811795 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96a2e67f-2c70-4ac9-992f-b19afb498a1f-console-oauth-config\") pod \"console-769cf4984d-hvjnw\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.811872 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-service-ca\") pod \"console-769cf4984d-hvjnw\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.811948 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-trusted-ca-bundle\") pod \"console-769cf4984d-hvjnw\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.812006 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-oauth-serving-cert\") pod \"console-769cf4984d-hvjnw\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.812038 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk49h\" (UniqueName: \"kubernetes.io/projected/96a2e67f-2c70-4ac9-992f-b19afb498a1f-kube-api-access-tk49h\") pod \"console-769cf4984d-hvjnw\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.812086 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96a2e67f-2c70-4ac9-992f-b19afb498a1f-console-serving-cert\") pod \"console-769cf4984d-hvjnw\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.812109 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-console-config\") pod \"console-769cf4984d-hvjnw\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " 
pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.913943 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk49h\" (UniqueName: \"kubernetes.io/projected/96a2e67f-2c70-4ac9-992f-b19afb498a1f-kube-api-access-tk49h\") pod \"console-769cf4984d-hvjnw\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.914028 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96a2e67f-2c70-4ac9-992f-b19afb498a1f-console-serving-cert\") pod \"console-769cf4984d-hvjnw\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.914053 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-console-config\") pod \"console-769cf4984d-hvjnw\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.914097 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96a2e67f-2c70-4ac9-992f-b19afb498a1f-console-oauth-config\") pod \"console-769cf4984d-hvjnw\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.914148 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-service-ca\") pod \"console-769cf4984d-hvjnw\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.914203 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-trusted-ca-bundle\") pod \"console-769cf4984d-hvjnw\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.914236 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-oauth-serving-cert\") pod \"console-769cf4984d-hvjnw\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.915335 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-oauth-serving-cert\") pod \"console-769cf4984d-hvjnw\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.915575 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-console-config\") pod \"console-769cf4984d-hvjnw\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " 
pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.916047 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-service-ca\") pod \"console-769cf4984d-hvjnw\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.916642 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-trusted-ca-bundle\") pod \"console-769cf4984d-hvjnw\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.920327 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96a2e67f-2c70-4ac9-992f-b19afb498a1f-console-oauth-config\") pod \"console-769cf4984d-hvjnw\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.921434 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96a2e67f-2c70-4ac9-992f-b19afb498a1f-console-serving-cert\") pod \"console-769cf4984d-hvjnw\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:12 crc kubenswrapper[4702]: I1203 11:21:12.936580 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk49h\" (UniqueName: \"kubernetes.io/projected/96a2e67f-2c70-4ac9-992f-b19afb498a1f-kube-api-access-tk49h\") pod \"console-769cf4984d-hvjnw\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:13 crc kubenswrapper[4702]: I1203 11:21:13.015695 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9370f81f-7868-4a16-9cec-7786257cdcbd-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-r6jd6\" (UID: \"9370f81f-7868-4a16-9cec-7786257cdcbd\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6" Dec 03 11:21:13 crc kubenswrapper[4702]: I1203 11:21:13.023608 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9370f81f-7868-4a16-9cec-7786257cdcbd-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-r6jd6\" (UID: \"9370f81f-7868-4a16-9cec-7786257cdcbd\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6" Dec 03 11:21:13 crc kubenswrapper[4702]: I1203 11:21:13.025381 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6" Dec 03 11:21:13 crc kubenswrapper[4702]: I1203 11:21:13.076071 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zxwqw" event={"ID":"9ec2138c-eb31-401f-b62d-d2823fe0523f","Type":"ContainerStarted","Data":"76890a1baeb7b735a7607fe0f6f7f8c667abe3fd6bf002043843de5b81230e48"} Dec 03 11:21:13 crc kubenswrapper[4702]: I1203 11:21:13.105233 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:13 crc kubenswrapper[4702]: I1203 11:21:13.219313 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/908a9238-0a36-40c1-a7c0-c0c0789f29ae-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-dmrtz\" (UID: \"908a9238-0a36-40c1-a7c0-c0c0789f29ae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dmrtz" Dec 03 11:21:13 crc kubenswrapper[4702]: I1203 11:21:13.223492 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/908a9238-0a36-40c1-a7c0-c0c0789f29ae-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-dmrtz\" (UID: \"908a9238-0a36-40c1-a7c0-c0c0789f29ae\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dmrtz" Dec 03 11:21:13 crc kubenswrapper[4702]: I1203 11:21:13.293030 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-6jgh5"] Dec 03 11:21:13 crc kubenswrapper[4702]: W1203 11:21:13.316818 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf40bd24_301e_4eb1_bbdb_84a55cd53cc9.slice/crio-3912ae84da79ed0619b28a90f9c8816b1f85a8c9c74ebe73be06897ab30020ca WatchSource:0}: Error finding container 3912ae84da79ed0619b28a90f9c8816b1f85a8c9c74ebe73be06897ab30020ca: Status 404 returned error can't find the container with id 3912ae84da79ed0619b28a90f9c8816b1f85a8c9c74ebe73be06897ab30020ca Dec 03 11:21:13 crc kubenswrapper[4702]: I1203 11:21:13.441895 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dmrtz" Dec 03 11:21:13 crc kubenswrapper[4702]: W1203 11:21:13.526738 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96a2e67f_2c70_4ac9_992f_b19afb498a1f.slice/crio-161c87e3123360094b0f36b55228092535fdf10a8c53fdf02d94c214d53d3864 WatchSource:0}: Error finding container 161c87e3123360094b0f36b55228092535fdf10a8c53fdf02d94c214d53d3864: Status 404 returned error can't find the container with id 161c87e3123360094b0f36b55228092535fdf10a8c53fdf02d94c214d53d3864 Dec 03 11:21:13 crc kubenswrapper[4702]: I1203 11:21:13.527180 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-769cf4984d-hvjnw"] Dec 03 11:21:13 crc kubenswrapper[4702]: I1203 11:21:13.579409 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6"] Dec 03 11:21:13 crc kubenswrapper[4702]: W1203 11:21:13.587360 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9370f81f_7868_4a16_9cec_7786257cdcbd.slice/crio-5c90595377fb0d42f25b5267a0fbaa82c1a9d4fd4ecb9c3d7e06c87ac9a19bbf WatchSource:0}: Error finding container 5c90595377fb0d42f25b5267a0fbaa82c1a9d4fd4ecb9c3d7e06c87ac9a19bbf: Status 404 returned error can't find the container with id 5c90595377fb0d42f25b5267a0fbaa82c1a9d4fd4ecb9c3d7e06c87ac9a19bbf Dec 03 11:21:13 crc kubenswrapper[4702]: I1203 11:21:13.884328 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dmrtz"] Dec 03 11:21:14 crc kubenswrapper[4702]: I1203 11:21:14.084939 4702 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6jgh5" event={"ID":"cf40bd24-301e-4eb1-bbdb-84a55cd53cc9","Type":"ContainerStarted","Data":"3912ae84da79ed0619b28a90f9c8816b1f85a8c9c74ebe73be06897ab30020ca"} Dec 03 11:21:14 crc kubenswrapper[4702]: I1203 11:21:14.086253 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dmrtz" event={"ID":"908a9238-0a36-40c1-a7c0-c0c0789f29ae","Type":"ContainerStarted","Data":"3248e95a64cc2b4210d9a4fb9a1f7de08d0f495e18148d2d1a31d46fd63196f8"} Dec 03 11:21:14 crc kubenswrapper[4702]: I1203 11:21:14.092573 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6" event={"ID":"9370f81f-7868-4a16-9cec-7786257cdcbd","Type":"ContainerStarted","Data":"5c90595377fb0d42f25b5267a0fbaa82c1a9d4fd4ecb9c3d7e06c87ac9a19bbf"} Dec 03 11:21:14 crc kubenswrapper[4702]: I1203 11:21:14.094601 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-769cf4984d-hvjnw" event={"ID":"96a2e67f-2c70-4ac9-992f-b19afb498a1f","Type":"ContainerStarted","Data":"8f133bb4e9ce8c7b68661c304c99af01d55c46b4502326c9667f89fc48aa1724"} Dec 03 11:21:14 crc kubenswrapper[4702]: I1203 11:21:14.094633 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-769cf4984d-hvjnw" event={"ID":"96a2e67f-2c70-4ac9-992f-b19afb498a1f","Type":"ContainerStarted","Data":"161c87e3123360094b0f36b55228092535fdf10a8c53fdf02d94c214d53d3864"} Dec 03 11:21:14 crc kubenswrapper[4702]: I1203 11:21:14.118552 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-769cf4984d-hvjnw" podStartSLOduration=2.118533192 podStartE2EDuration="2.118533192s" podCreationTimestamp="2025-12-03 11:21:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:21:14.114863699 +0000 UTC m=+1057.950792163" watchObservedRunningTime="2025-12-03 11:21:14.118533192 +0000 UTC m=+1057.954461656" Dec 03 11:21:16 crc kubenswrapper[4702]: I1203 11:21:16.125322 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6jgh5" event={"ID":"cf40bd24-301e-4eb1-bbdb-84a55cd53cc9","Type":"ContainerStarted","Data":"e3d360bba509c281b10466f349e9f565435dfacbdf9bd59bb9d4dfed7dab6dc9"} Dec 03 11:21:16 crc kubenswrapper[4702]: I1203 11:21:16.129892 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6" event={"ID":"9370f81f-7868-4a16-9cec-7786257cdcbd","Type":"ContainerStarted","Data":"b3e470aaf63eaf81b48db19d2ec0aa9a2691a367ba65efceb4f8dbd88b89c195"} Dec 03 11:21:16 crc kubenswrapper[4702]: I1203 11:21:16.130357 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6" Dec 03 11:21:16 crc kubenswrapper[4702]: I1203 11:21:16.132101 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-zxwqw" event={"ID":"9ec2138c-eb31-401f-b62d-d2823fe0523f","Type":"ContainerStarted","Data":"5eecfd7c3f57439e9195d6ed0594fe4aa9caa7f00d820214e96f7ba37c2c77ed"} Dec 03 11:21:16 crc kubenswrapper[4702]: I1203 11:21:16.133158 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-zxwqw" Dec 03 11:21:16 crc kubenswrapper[4702]: I1203 11:21:16.157320 4702 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6" podStartSLOduration=2.429107576 podStartE2EDuration="4.157292272s" podCreationTimestamp="2025-12-03 11:21:12 +0000 UTC" firstStartedPulling="2025-12-03 11:21:13.589948677 +0000 UTC m=+1057.425877141" lastFinishedPulling="2025-12-03 11:21:15.318133373 +0000 UTC m=+1059.154061837" observedRunningTime="2025-12-03 11:21:16.150318115 +0000 UTC m=+1059.986246599" watchObservedRunningTime="2025-12-03 11:21:16.157292272 +0000 UTC m=+1059.993220736" Dec 03 11:21:16 crc kubenswrapper[4702]: I1203 11:21:16.959193 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-zxwqw" podStartSLOduration=2.477944073 podStartE2EDuration="4.959171009s" podCreationTimestamp="2025-12-03 11:21:12 +0000 UTC" firstStartedPulling="2025-12-03 11:21:12.8252961 +0000 UTC m=+1056.661224564" lastFinishedPulling="2025-12-03 11:21:15.306523046 +0000 UTC m=+1059.142451500" observedRunningTime="2025-12-03 11:21:16.168635162 +0000 UTC m=+1060.004563636" watchObservedRunningTime="2025-12-03 11:21:16.959171009 +0000 UTC m=+1060.795099483" Dec 03 11:21:17 crc kubenswrapper[4702]: I1203 11:21:17.145748 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dmrtz" event={"ID":"908a9238-0a36-40c1-a7c0-c0c0789f29ae","Type":"ContainerStarted","Data":"9bfffb779f0e0a1f0216bace9c101ba10750b381264bd4e9362be604667f71a7"} Dec 03 11:21:17 crc kubenswrapper[4702]: I1203 11:21:17.165908 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-dmrtz" podStartSLOduration=2.337589213 podStartE2EDuration="5.165886702s" podCreationTimestamp="2025-12-03 11:21:12 +0000 UTC" firstStartedPulling="2025-12-03 11:21:13.897397282 +0000 UTC m=+1057.733325746" lastFinishedPulling="2025-12-03 11:21:16.725694771 +0000 UTC m=+1060.561623235" observedRunningTime="2025-12-03 11:21:17.162824256 +0000 UTC m=+1060.998752720" watchObservedRunningTime="2025-12-03 11:21:17.165886702 +0000 UTC m=+1061.001815166" Dec 03 11:21:19 crc kubenswrapper[4702]: I1203 11:21:19.167499 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6jgh5" event={"ID":"cf40bd24-301e-4eb1-bbdb-84a55cd53cc9","Type":"ContainerStarted","Data":"625e4880065986e55a5b9e25898f320a6d8fbe81aa2775648ca4420e65d83efd"} Dec 03 11:21:22 crc kubenswrapper[4702]: I1203 11:21:22.781668 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-zxwqw" Dec 03 11:21:22 crc kubenswrapper[4702]: I1203 11:21:22.801585 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-6jgh5" podStartSLOduration=5.882852622 podStartE2EDuration="10.801558138s" podCreationTimestamp="2025-12-03 11:21:12 +0000 UTC" firstStartedPulling="2025-12-03 11:21:13.320238676 +0000 UTC m=+1057.156167140" lastFinishedPulling="2025-12-03 11:21:18.238944192 +0000 UTC m=+1062.074872656" observedRunningTime="2025-12-03 11:21:19.193305552 +0000 UTC m=+1063.029234016" watchObservedRunningTime="2025-12-03 11:21:22.801558138 +0000 UTC m=+1066.637486602" Dec 03 11:21:23 crc kubenswrapper[4702]: I1203 11:21:23.107167 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:23 crc kubenswrapper[4702]: I1203 11:21:23.107249 4702 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:23 crc kubenswrapper[4702]: I1203 11:21:23.113595 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:23 crc kubenswrapper[4702]: I1203 11:21:23.203310 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:21:23 crc kubenswrapper[4702]: I1203 11:21:23.261923 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64d5fd7569-jhhw8"] Dec 03 11:21:25 crc kubenswrapper[4702]: I1203 11:21:25.908030 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:21:25 crc kubenswrapper[4702]: I1203 11:21:25.908472 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:21:25 crc kubenswrapper[4702]: I1203 11:21:25.908546 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:21:25 crc kubenswrapper[4702]: I1203 11:21:25.909511 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1331b116dc6549b4ff553b30c2ad8688204f150ccd437e40dc657307e56d7aa3"} pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 11:21:25 crc kubenswrapper[4702]: I1203 11:21:25.909589 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" containerID="cri-o://1331b116dc6549b4ff553b30c2ad8688204f150ccd437e40dc657307e56d7aa3" gracePeriod=600 Dec 03 11:21:27 crc kubenswrapper[4702]: I1203 11:21:27.232887 4702 generic.go:334] "Generic (PLEG): container finished" podID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerID="1331b116dc6549b4ff553b30c2ad8688204f150ccd437e40dc657307e56d7aa3" exitCode=0 Dec 03 11:21:27 crc kubenswrapper[4702]: I1203 11:21:27.232943 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerDied","Data":"1331b116dc6549b4ff553b30c2ad8688204f150ccd437e40dc657307e56d7aa3"} Dec 03 11:21:27 crc kubenswrapper[4702]: I1203 11:21:27.233318 4702 scope.go:117] "RemoveContainer" containerID="77511b1fe86d0b6a8fc57584220c3eed3f6f18f178f53eb56a078f9c63c86b1e" Dec 03 11:21:28 crc kubenswrapper[4702]: I1203 11:21:28.245107 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"3ceadc6ee9df8857f4d601383951a406af62e876eb34d70ea87811eb3741ae2d"} Dec 03 11:21:33 crc kubenswrapper[4702]: 
I1203 11:21:33.034586 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6" Dec 03 11:21:48 crc kubenswrapper[4702]: I1203 11:21:48.312216 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-64d5fd7569-jhhw8" podUID="f940bfbe-1e27-433f-836b-7b542814b39d" containerName="console" containerID="cri-o://f6fd35002b7ef3f6fe805635190adc561b3c2692c9a3aae9c1f52177e206dda6" gracePeriod=15 Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.345279 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj"] Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.350173 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.354049 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.358076 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj"] Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.422825 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64d5fd7569-jhhw8_f940bfbe-1e27-433f-836b-7b542814b39d/console/0.log" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.422933 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.438293 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3638051c-7f0b-4e32-81c4-e6e327fc5a8b-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj\" (UID: \"3638051c-7f0b-4e32-81c4-e6e327fc5a8b\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.438434 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3638051c-7f0b-4e32-81c4-e6e327fc5a8b-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj\" (UID: \"3638051c-7f0b-4e32-81c4-e6e327fc5a8b\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.438466 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l5jv\" (UniqueName: \"kubernetes.io/projected/3638051c-7f0b-4e32-81c4-e6e327fc5a8b-kube-api-access-6l5jv\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj\" (UID: \"3638051c-7f0b-4e32-81c4-e6e327fc5a8b\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.456343 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64d5fd7569-jhhw8_f940bfbe-1e27-433f-836b-7b542814b39d/console/0.log" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.456387 4702 generic.go:334] "Generic (PLEG): container finished" 
podID="f940bfbe-1e27-433f-836b-7b542814b39d" containerID="f6fd35002b7ef3f6fe805635190adc561b3c2692c9a3aae9c1f52177e206dda6" exitCode=2 Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.456440 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d5fd7569-jhhw8" event={"ID":"f940bfbe-1e27-433f-836b-7b542814b39d","Type":"ContainerDied","Data":"f6fd35002b7ef3f6fe805635190adc561b3c2692c9a3aae9c1f52177e206dda6"} Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.456473 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d5fd7569-jhhw8" event={"ID":"f940bfbe-1e27-433f-836b-7b542814b39d","Type":"ContainerDied","Data":"5da2e7d2ef24badc1d4011e3eac9c7e3516f7b909434426cc0ac867740089c96"} Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.456534 4702 scope.go:117] "RemoveContainer" containerID="f6fd35002b7ef3f6fe805635190adc561b3c2692c9a3aae9c1f52177e206dda6" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.456725 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d5fd7569-jhhw8" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.480957 4702 scope.go:117] "RemoveContainer" containerID="f6fd35002b7ef3f6fe805635190adc561b3c2692c9a3aae9c1f52177e206dda6" Dec 03 11:21:49 crc kubenswrapper[4702]: E1203 11:21:49.481695 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6fd35002b7ef3f6fe805635190adc561b3c2692c9a3aae9c1f52177e206dda6\": container with ID starting with f6fd35002b7ef3f6fe805635190adc561b3c2692c9a3aae9c1f52177e206dda6 not found: ID does not exist" containerID="f6fd35002b7ef3f6fe805635190adc561b3c2692c9a3aae9c1f52177e206dda6" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.481767 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6fd35002b7ef3f6fe805635190adc561b3c2692c9a3aae9c1f52177e206dda6"} err="failed to get container status \"f6fd35002b7ef3f6fe805635190adc561b3c2692c9a3aae9c1f52177e206dda6\": rpc error: code = NotFound desc = could not find container \"f6fd35002b7ef3f6fe805635190adc561b3c2692c9a3aae9c1f52177e206dda6\": container with ID starting with f6fd35002b7ef3f6fe805635190adc561b3c2692c9a3aae9c1f52177e206dda6 not found: ID does not exist" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.539403 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-console-config\") pod \"f940bfbe-1e27-433f-836b-7b542814b39d\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.539529 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-oauth-serving-cert\") pod \"f940bfbe-1e27-433f-836b-7b542814b39d\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.539593 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f940bfbe-1e27-433f-836b-7b542814b39d-console-serving-cert\") pod \"f940bfbe-1e27-433f-836b-7b542814b39d\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.539691 4702 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-f8ffq\" (UniqueName: \"kubernetes.io/projected/f940bfbe-1e27-433f-836b-7b542814b39d-kube-api-access-f8ffq\") pod \"f940bfbe-1e27-433f-836b-7b542814b39d\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.539749 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-trusted-ca-bundle\") pod \"f940bfbe-1e27-433f-836b-7b542814b39d\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.539864 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f940bfbe-1e27-433f-836b-7b542814b39d-console-oauth-config\") pod \"f940bfbe-1e27-433f-836b-7b542814b39d\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.539978 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-service-ca\") pod \"f940bfbe-1e27-433f-836b-7b542814b39d\" (UID: \"f940bfbe-1e27-433f-836b-7b542814b39d\") " Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.540252 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3638051c-7f0b-4e32-81c4-e6e327fc5a8b-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj\" (UID: \"3638051c-7f0b-4e32-81c4-e6e327fc5a8b\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.540328 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3638051c-7f0b-4e32-81c4-e6e327fc5a8b-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj\" (UID: \"3638051c-7f0b-4e32-81c4-e6e327fc5a8b\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.540358 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l5jv\" (UniqueName: \"kubernetes.io/projected/3638051c-7f0b-4e32-81c4-e6e327fc5a8b-kube-api-access-6l5jv\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj\" (UID: \"3638051c-7f0b-4e32-81c4-e6e327fc5a8b\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.541808 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3638051c-7f0b-4e32-81c4-e6e327fc5a8b-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj\" (UID: \"3638051c-7f0b-4e32-81c4-e6e327fc5a8b\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.542011 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3638051c-7f0b-4e32-81c4-e6e327fc5a8b-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj\" (UID: \"3638051c-7f0b-4e32-81c4-e6e327fc5a8b\") " 
pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.542686 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-service-ca" (OuterVolumeSpecName: "service-ca") pod "f940bfbe-1e27-433f-836b-7b542814b39d" (UID: "f940bfbe-1e27-433f-836b-7b542814b39d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.542736 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f940bfbe-1e27-433f-836b-7b542814b39d" (UID: "f940bfbe-1e27-433f-836b-7b542814b39d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.542954 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f940bfbe-1e27-433f-836b-7b542814b39d" (UID: "f940bfbe-1e27-433f-836b-7b542814b39d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.543121 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-console-config" (OuterVolumeSpecName: "console-config") pod "f940bfbe-1e27-433f-836b-7b542814b39d" (UID: "f940bfbe-1e27-433f-836b-7b542814b39d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.549940 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f940bfbe-1e27-433f-836b-7b542814b39d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f940bfbe-1e27-433f-836b-7b542814b39d" (UID: "f940bfbe-1e27-433f-836b-7b542814b39d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.550096 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f940bfbe-1e27-433f-836b-7b542814b39d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f940bfbe-1e27-433f-836b-7b542814b39d" (UID: "f940bfbe-1e27-433f-836b-7b542814b39d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.557263 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f940bfbe-1e27-433f-836b-7b542814b39d-kube-api-access-f8ffq" (OuterVolumeSpecName: "kube-api-access-f8ffq") pod "f940bfbe-1e27-433f-836b-7b542814b39d" (UID: "f940bfbe-1e27-433f-836b-7b542814b39d"). InnerVolumeSpecName "kube-api-access-f8ffq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.560806 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l5jv\" (UniqueName: \"kubernetes.io/projected/3638051c-7f0b-4e32-81c4-e6e327fc5a8b-kube-api-access-6l5jv\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj\" (UID: \"3638051c-7f0b-4e32-81c4-e6e327fc5a8b\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.768849 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.784858 4702 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f940bfbe-1e27-433f-836b-7b542814b39d-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.784926 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8ffq\" (UniqueName: \"kubernetes.io/projected/f940bfbe-1e27-433f-836b-7b542814b39d-kube-api-access-f8ffq\") on node \"crc\" DevicePath \"\"" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.784941 4702 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.784954 4702 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f940bfbe-1e27-433f-836b-7b542814b39d-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.784966 4702 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.784988 4702 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.785017 4702 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f940bfbe-1e27-433f-836b-7b542814b39d-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.945358 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64d5fd7569-jhhw8"] Dec 03 11:21:49 crc kubenswrapper[4702]: I1203 11:21:49.951637 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-64d5fd7569-jhhw8"] Dec 03 11:21:50 crc kubenswrapper[4702]: I1203 11:21:50.345461 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj"] Dec 03 11:21:50 crc kubenswrapper[4702]: I1203 11:21:50.560365 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj" 
event={"ID":"3638051c-7f0b-4e32-81c4-e6e327fc5a8b","Type":"ContainerStarted","Data":"ba6d24eb7857826533b3a3d1665375b7a3ccd0556bc4ab9ba9e1478e24da5288"} Dec 03 11:21:50 crc kubenswrapper[4702]: I1203 11:21:50.940769 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f940bfbe-1e27-433f-836b-7b542814b39d" path="/var/lib/kubelet/pods/f940bfbe-1e27-433f-836b-7b542814b39d/volumes" Dec 03 11:21:51 crc kubenswrapper[4702]: I1203 11:21:51.588329 4702 generic.go:334] "Generic (PLEG): container finished" podID="3638051c-7f0b-4e32-81c4-e6e327fc5a8b" containerID="2ad9db7e2e473cf9d2d3bde548c63b6b384152d661cddbfcf901bae632bc7d70" exitCode=0 Dec 03 11:21:51 crc kubenswrapper[4702]: I1203 11:21:51.588422 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj" event={"ID":"3638051c-7f0b-4e32-81c4-e6e327fc5a8b","Type":"ContainerDied","Data":"2ad9db7e2e473cf9d2d3bde548c63b6b384152d661cddbfcf901bae632bc7d70"} Dec 03 11:21:54 crc kubenswrapper[4702]: I1203 11:21:54.692231 4702 generic.go:334] "Generic (PLEG): container finished" podID="3638051c-7f0b-4e32-81c4-e6e327fc5a8b" containerID="b8ca9567ff77d514aaf3a34b346a290b3b82757727f7b1abda054f792797fdb5" exitCode=0 Dec 03 11:21:54 crc kubenswrapper[4702]: I1203 11:21:54.692316 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj" event={"ID":"3638051c-7f0b-4e32-81c4-e6e327fc5a8b","Type":"ContainerDied","Data":"b8ca9567ff77d514aaf3a34b346a290b3b82757727f7b1abda054f792797fdb5"} Dec 03 11:21:55 crc kubenswrapper[4702]: I1203 11:21:55.704059 4702 generic.go:334] "Generic (PLEG): container finished" podID="3638051c-7f0b-4e32-81c4-e6e327fc5a8b" containerID="0e0062c3885497b58b72969a7a6e92dc6cbbf3b0251da24c6579cd485df9098f" exitCode=0 Dec 03 11:21:55 crc kubenswrapper[4702]: I1203 11:21:55.704120 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj" event={"ID":"3638051c-7f0b-4e32-81c4-e6e327fc5a8b","Type":"ContainerDied","Data":"0e0062c3885497b58b72969a7a6e92dc6cbbf3b0251da24c6579cd485df9098f"} Dec 03 11:21:57 crc kubenswrapper[4702]: I1203 11:21:57.111115 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj" Dec 03 11:21:57 crc kubenswrapper[4702]: I1203 11:21:57.268202 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3638051c-7f0b-4e32-81c4-e6e327fc5a8b-bundle\") pod \"3638051c-7f0b-4e32-81c4-e6e327fc5a8b\" (UID: \"3638051c-7f0b-4e32-81c4-e6e327fc5a8b\") " Dec 03 11:21:57 crc kubenswrapper[4702]: I1203 11:21:57.268866 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3638051c-7f0b-4e32-81c4-e6e327fc5a8b-util\") pod \"3638051c-7f0b-4e32-81c4-e6e327fc5a8b\" (UID: \"3638051c-7f0b-4e32-81c4-e6e327fc5a8b\") " Dec 03 11:21:57 crc kubenswrapper[4702]: I1203 11:21:57.269048 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l5jv\" (UniqueName: \"kubernetes.io/projected/3638051c-7f0b-4e32-81c4-e6e327fc5a8b-kube-api-access-6l5jv\") pod \"3638051c-7f0b-4e32-81c4-e6e327fc5a8b\" (UID: \"3638051c-7f0b-4e32-81c4-e6e327fc5a8b\") " Dec 03 11:21:57 crc kubenswrapper[4702]: I1203 11:21:57.270459 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3638051c-7f0b-4e32-81c4-e6e327fc5a8b-bundle" (OuterVolumeSpecName: "bundle") pod "3638051c-7f0b-4e32-81c4-e6e327fc5a8b" (UID: "3638051c-7f0b-4e32-81c4-e6e327fc5a8b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:21:57 crc kubenswrapper[4702]: I1203 11:21:57.276403 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3638051c-7f0b-4e32-81c4-e6e327fc5a8b-kube-api-access-6l5jv" (OuterVolumeSpecName: "kube-api-access-6l5jv") pod "3638051c-7f0b-4e32-81c4-e6e327fc5a8b" (UID: "3638051c-7f0b-4e32-81c4-e6e327fc5a8b"). InnerVolumeSpecName "kube-api-access-6l5jv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:21:57 crc kubenswrapper[4702]: I1203 11:21:57.282395 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3638051c-7f0b-4e32-81c4-e6e327fc5a8b-util" (OuterVolumeSpecName: "util") pod "3638051c-7f0b-4e32-81c4-e6e327fc5a8b" (UID: "3638051c-7f0b-4e32-81c4-e6e327fc5a8b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:21:57 crc kubenswrapper[4702]: I1203 11:21:57.371100 4702 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3638051c-7f0b-4e32-81c4-e6e327fc5a8b-util\") on node \"crc\" DevicePath \"\"" Dec 03 11:21:57 crc kubenswrapper[4702]: I1203 11:21:57.371149 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6l5jv\" (UniqueName: \"kubernetes.io/projected/3638051c-7f0b-4e32-81c4-e6e327fc5a8b-kube-api-access-6l5jv\") on node \"crc\" DevicePath \"\"" Dec 03 11:21:57 crc kubenswrapper[4702]: I1203 11:21:57.371166 4702 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3638051c-7f0b-4e32-81c4-e6e327fc5a8b-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:21:57 crc kubenswrapper[4702]: I1203 11:21:57.723840 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj" event={"ID":"3638051c-7f0b-4e32-81c4-e6e327fc5a8b","Type":"ContainerDied","Data":"ba6d24eb7857826533b3a3d1665375b7a3ccd0556bc4ab9ba9e1478e24da5288"} Dec 03 11:21:57 crc kubenswrapper[4702]: I1203 11:21:57.723904 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba6d24eb7857826533b3a3d1665375b7a3ccd0556bc4ab9ba9e1478e24da5288" Dec 03 11:21:57 crc kubenswrapper[4702]: I1203 11:21:57.723921 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj" Dec 03 11:22:07 crc kubenswrapper[4702]: I1203 11:22:07.895733 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t"] Dec 03 11:22:07 crc kubenswrapper[4702]: E1203 11:22:07.896993 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3638051c-7f0b-4e32-81c4-e6e327fc5a8b" containerName="extract" Dec 03 11:22:07 crc kubenswrapper[4702]: I1203 11:22:07.897012 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="3638051c-7f0b-4e32-81c4-e6e327fc5a8b" containerName="extract" Dec 03 11:22:07 crc kubenswrapper[4702]: E1203 11:22:07.897035 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3638051c-7f0b-4e32-81c4-e6e327fc5a8b" containerName="util" Dec 03 11:22:07 crc kubenswrapper[4702]: I1203 11:22:07.897043 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="3638051c-7f0b-4e32-81c4-e6e327fc5a8b" containerName="util" Dec 03 11:22:07 crc kubenswrapper[4702]: E1203 11:22:07.897059 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3638051c-7f0b-4e32-81c4-e6e327fc5a8b" containerName="pull" Dec 03 11:22:07 crc kubenswrapper[4702]: I1203 11:22:07.897066 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="3638051c-7f0b-4e32-81c4-e6e327fc5a8b" containerName="pull" Dec 03 11:22:07 crc kubenswrapper[4702]: E1203 11:22:07.897075 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f940bfbe-1e27-433f-836b-7b542814b39d" containerName="console" Dec 03 11:22:07 crc kubenswrapper[4702]: I1203 11:22:07.897082 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="f940bfbe-1e27-433f-836b-7b542814b39d" containerName="console" Dec 03 11:22:07 crc kubenswrapper[4702]: I1203 11:22:07.897259 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="3638051c-7f0b-4e32-81c4-e6e327fc5a8b" containerName="extract" Dec 
03 11:22:07 crc kubenswrapper[4702]: I1203 11:22:07.897292 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="f940bfbe-1e27-433f-836b-7b542814b39d" containerName="console" Dec 03 11:22:07 crc kubenswrapper[4702]: I1203 11:22:07.898264 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" Dec 03 11:22:07 crc kubenswrapper[4702]: I1203 11:22:07.904367 4702 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 03 11:22:07 crc kubenswrapper[4702]: I1203 11:22:07.904688 4702 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-snfxk" Dec 03 11:22:07 crc kubenswrapper[4702]: I1203 11:22:07.904863 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 03 11:22:07 crc kubenswrapper[4702]: I1203 11:22:07.906111 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 03 11:22:07 crc kubenswrapper[4702]: I1203 11:22:07.911934 4702 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 03 11:22:07 crc kubenswrapper[4702]: I1203 11:22:07.912893 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t"] Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.039638 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6e99cffd-b82e-46c9-8cbd-fe8c24507385-apiservice-cert\") pod \"metallb-operator-controller-manager-85d7874b49-jvs5t\" (UID: \"6e99cffd-b82e-46c9-8cbd-fe8c24507385\") " pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.040046 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6e99cffd-b82e-46c9-8cbd-fe8c24507385-webhook-cert\") pod \"metallb-operator-controller-manager-85d7874b49-jvs5t\" (UID: \"6e99cffd-b82e-46c9-8cbd-fe8c24507385\") " pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.040090 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fp79\" (UniqueName: \"kubernetes.io/projected/6e99cffd-b82e-46c9-8cbd-fe8c24507385-kube-api-access-8fp79\") pod \"metallb-operator-controller-manager-85d7874b49-jvs5t\" (UID: \"6e99cffd-b82e-46c9-8cbd-fe8c24507385\") " pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.142053 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6e99cffd-b82e-46c9-8cbd-fe8c24507385-apiservice-cert\") pod \"metallb-operator-controller-manager-85d7874b49-jvs5t\" (UID: \"6e99cffd-b82e-46c9-8cbd-fe8c24507385\") " pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.142210 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/6e99cffd-b82e-46c9-8cbd-fe8c24507385-webhook-cert\") pod \"metallb-operator-controller-manager-85d7874b49-jvs5t\" (UID: \"6e99cffd-b82e-46c9-8cbd-fe8c24507385\") " pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.142288 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fp79\" (UniqueName: \"kubernetes.io/projected/6e99cffd-b82e-46c9-8cbd-fe8c24507385-kube-api-access-8fp79\") pod \"metallb-operator-controller-manager-85d7874b49-jvs5t\" (UID: \"6e99cffd-b82e-46c9-8cbd-fe8c24507385\") " pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.150600 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6e99cffd-b82e-46c9-8cbd-fe8c24507385-webhook-cert\") pod \"metallb-operator-controller-manager-85d7874b49-jvs5t\" (UID: \"6e99cffd-b82e-46c9-8cbd-fe8c24507385\") " pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.151272 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6e99cffd-b82e-46c9-8cbd-fe8c24507385-apiservice-cert\") pod \"metallb-operator-controller-manager-85d7874b49-jvs5t\" (UID: \"6e99cffd-b82e-46c9-8cbd-fe8c24507385\") " pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.180932 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fp79\" (UniqueName: \"kubernetes.io/projected/6e99cffd-b82e-46c9-8cbd-fe8c24507385-kube-api-access-8fp79\") pod \"metallb-operator-controller-manager-85d7874b49-jvs5t\" (UID: \"6e99cffd-b82e-46c9-8cbd-fe8c24507385\") " pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.233865 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq"] Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.233909 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.235995 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.239237 4702 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.239269 4702 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-p7xkq" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.239292 4702 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.243527 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2cb93136-1d69-4bc8-9c42-aee1f6638aa6-webhook-cert\") pod \"metallb-operator-webhook-server-66c548d864-tr7qq\" (UID: \"2cb93136-1d69-4bc8-9c42-aee1f6638aa6\") " pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.243577 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg9kc\" (UniqueName: \"kubernetes.io/projected/2cb93136-1d69-4bc8-9c42-aee1f6638aa6-kube-api-access-wg9kc\") pod \"metallb-operator-webhook-server-66c548d864-tr7qq\" (UID: \"2cb93136-1d69-4bc8-9c42-aee1f6638aa6\") " pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.243664 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2cb93136-1d69-4bc8-9c42-aee1f6638aa6-apiservice-cert\") pod \"metallb-operator-webhook-server-66c548d864-tr7qq\" (UID: \"2cb93136-1d69-4bc8-9c42-aee1f6638aa6\") " pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.261788 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq"] Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.414947 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2cb93136-1d69-4bc8-9c42-aee1f6638aa6-webhook-cert\") pod \"metallb-operator-webhook-server-66c548d864-tr7qq\" (UID: \"2cb93136-1d69-4bc8-9c42-aee1f6638aa6\") " pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.415018 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg9kc\" (UniqueName: \"kubernetes.io/projected/2cb93136-1d69-4bc8-9c42-aee1f6638aa6-kube-api-access-wg9kc\") pod \"metallb-operator-webhook-server-66c548d864-tr7qq\" (UID: \"2cb93136-1d69-4bc8-9c42-aee1f6638aa6\") " pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.415078 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2cb93136-1d69-4bc8-9c42-aee1f6638aa6-apiservice-cert\") pod \"metallb-operator-webhook-server-66c548d864-tr7qq\" (UID: \"2cb93136-1d69-4bc8-9c42-aee1f6638aa6\") " pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 
11:22:08.436457 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2cb93136-1d69-4bc8-9c42-aee1f6638aa6-apiservice-cert\") pod \"metallb-operator-webhook-server-66c548d864-tr7qq\" (UID: \"2cb93136-1d69-4bc8-9c42-aee1f6638aa6\") " pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.458588 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2cb93136-1d69-4bc8-9c42-aee1f6638aa6-webhook-cert\") pod \"metallb-operator-webhook-server-66c548d864-tr7qq\" (UID: \"2cb93136-1d69-4bc8-9c42-aee1f6638aa6\") " pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.504383 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg9kc\" (UniqueName: \"kubernetes.io/projected/2cb93136-1d69-4bc8-9c42-aee1f6638aa6-kube-api-access-wg9kc\") pod \"metallb-operator-webhook-server-66c548d864-tr7qq\" (UID: \"2cb93136-1d69-4bc8-9c42-aee1f6638aa6\") " pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" Dec 03 11:22:08 crc kubenswrapper[4702]: I1203 11:22:08.507733 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" Dec 03 11:22:09 crc kubenswrapper[4702]: I1203 11:22:09.353341 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t"] Dec 03 11:22:09 crc kubenswrapper[4702]: I1203 11:22:09.401891 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq"] Dec 03 11:22:09 crc kubenswrapper[4702]: W1203 11:22:09.425902 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cb93136_1d69_4bc8_9c42_aee1f6638aa6.slice/crio-637aacef5e0d2808a37e6f589759e84ea3f7a51b028c59743bfc09724b74c207 WatchSource:0}: Error finding container 637aacef5e0d2808a37e6f589759e84ea3f7a51b028c59743bfc09724b74c207: Status 404 returned error can't find the container with id 637aacef5e0d2808a37e6f589759e84ea3f7a51b028c59743bfc09724b74c207 Dec 03 11:22:09 crc kubenswrapper[4702]: I1203 11:22:09.940985 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" event={"ID":"2cb93136-1d69-4bc8-9c42-aee1f6638aa6","Type":"ContainerStarted","Data":"637aacef5e0d2808a37e6f589759e84ea3f7a51b028c59743bfc09724b74c207"} Dec 03 11:22:09 crc kubenswrapper[4702]: I1203 11:22:09.946979 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" event={"ID":"6e99cffd-b82e-46c9-8cbd-fe8c24507385","Type":"ContainerStarted","Data":"af0d8b7cf674a63adc3a58c48c1f42e44370aa053407cd8fbf73d47176f6e0e7"} Dec 03 11:22:27 crc kubenswrapper[4702]: E1203 11:22:27.683780 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/metallb-rhel9@sha256:afa5a50746f3d69cef22c41c612ce3e7fe91e1da1d1d1566dee42ee304132379" Dec 03 11:22:27 crc kubenswrapper[4702]: E1203 11:22:27.685234 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:webhook-server,Image:registry.redhat.io/openshift4/metallb-rhel9@sha256:afa5a50746f3d69cef22c41c612ce3e7fe91e1da1d1d1566dee42ee304132379,Command:[/controller],Args:[--disable-cert-rotation=true --port=7472 --log-level=info --webhook-mode=onlywebhook],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:monitoring,HostPort:0,ContainerPort:7472,Protocol:TCP,HostIP:,},ContainerPort{Name:webhook-server,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:METALLB_BGP_TYPE,Value:frr,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:metallb-operator.v4.18.0-202511181540,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wg9kc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/metrics,Port:{1 0 monitoring},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/metrics,Port:{1 0 monitoring},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000730000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod metallb-operator-webhook-server-66c548d864-tr7qq_metallb-system(2cb93136-1d69-4bc8-9c42-aee1f6638aa6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 11:22:27 crc kubenswrapper[4702]: E1203 11:22:27.686940 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" podUID="2cb93136-1d69-4bc8-9c42-aee1f6638aa6" Dec 03 11:22:27 crc kubenswrapper[4702]: E1203 11:22:27.904838 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"webhook-server\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/metallb-rhel9@sha256:afa5a50746f3d69cef22c41c612ce3e7fe91e1da1d1d1566dee42ee304132379\\\"\"" 
pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" podUID="2cb93136-1d69-4bc8-9c42-aee1f6638aa6" Dec 03 11:22:29 crc kubenswrapper[4702]: I1203 11:22:29.135333 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" event={"ID":"6e99cffd-b82e-46c9-8cbd-fe8c24507385","Type":"ContainerStarted","Data":"1a4aee4e3ee6bb158f231c543ab42ab038db6c887dc5b836c478eebbc236650b"} Dec 03 11:22:29 crc kubenswrapper[4702]: I1203 11:22:29.135968 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" Dec 03 11:22:29 crc kubenswrapper[4702]: I1203 11:22:29.226031 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" podStartSLOduration=3.8469115350000003 podStartE2EDuration="22.225967898s" podCreationTimestamp="2025-12-03 11:22:07 +0000 UTC" firstStartedPulling="2025-12-03 11:22:09.33547674 +0000 UTC m=+1113.171405204" lastFinishedPulling="2025-12-03 11:22:27.714533103 +0000 UTC m=+1131.550461567" observedRunningTime="2025-12-03 11:22:29.216561501 +0000 UTC m=+1133.052489975" watchObservedRunningTime="2025-12-03 11:22:29.225967898 +0000 UTC m=+1133.061896362" Dec 03 11:22:42 crc kubenswrapper[4702]: I1203 11:22:42.254121 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" event={"ID":"2cb93136-1d69-4bc8-9c42-aee1f6638aa6","Type":"ContainerStarted","Data":"958529641adcb16dde835961452da3fb825961812b39cf59f9126e93469f5643"} Dec 03 11:22:42 crc kubenswrapper[4702]: I1203 11:22:42.254877 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" Dec 03 11:22:42 crc kubenswrapper[4702]: I1203 11:22:42.280645 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" podStartSLOduration=1.729065195 podStartE2EDuration="34.280625707s" podCreationTimestamp="2025-12-03 11:22:08 +0000 UTC" firstStartedPulling="2025-12-03 11:22:09.430235231 +0000 UTC m=+1113.266163695" lastFinishedPulling="2025-12-03 11:22:41.981795743 +0000 UTC m=+1145.817724207" observedRunningTime="2025-12-03 11:22:42.277133978 +0000 UTC m=+1146.113062442" watchObservedRunningTime="2025-12-03 11:22:42.280625707 +0000 UTC m=+1146.116554171" Dec 03 11:22:58 crc kubenswrapper[4702]: I1203 11:22:58.237396 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.112217 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.519784 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-wh75l"] Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.523936 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.526146 4702 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-549gl" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.526805 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.527110 4702 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.530284 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-slqp5"] Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.534256 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-slqp5" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.537126 4702 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.551810 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-slqp5"] Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.620735 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-bv8pf"] Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.622429 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-bv8pf" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.625445 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.625467 4702 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-cphv4" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.625752 4702 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.625949 4702 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.647063 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-2npsf"] Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.648654 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-2npsf" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.650853 4702 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.676328 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-2npsf"] Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.700961 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7643d370-6497-4a94-b0e7-2db66b56b687-metrics\") pod \"frr-k8s-wh75l\" (UID: \"7643d370-6497-4a94-b0e7-2db66b56b687\") " pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.701401 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7643d370-6497-4a94-b0e7-2db66b56b687-frr-sockets\") pod \"frr-k8s-wh75l\" (UID: \"7643d370-6497-4a94-b0e7-2db66b56b687\") " pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.701570 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7643d370-6497-4a94-b0e7-2db66b56b687-frr-startup\") pod \"frr-k8s-wh75l\" (UID: \"7643d370-6497-4a94-b0e7-2db66b56b687\") " pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.701775 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3a5cd30-f098-4e9c-bbb0-f45305893017-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-slqp5\" (UID: \"b3a5cd30-f098-4e9c-bbb0-f45305893017\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-slqp5" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.701934 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7643d370-6497-4a94-b0e7-2db66b56b687-frr-conf\") pod \"frr-k8s-wh75l\" (UID: \"7643d370-6497-4a94-b0e7-2db66b56b687\") " pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.702094 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7643d370-6497-4a94-b0e7-2db66b56b687-reloader\") pod \"frr-k8s-wh75l\" (UID: \"7643d370-6497-4a94-b0e7-2db66b56b687\") " pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.702960 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmxj7\" (UniqueName: \"kubernetes.io/projected/7643d370-6497-4a94-b0e7-2db66b56b687-kube-api-access-wmxj7\") pod \"frr-k8s-wh75l\" (UID: \"7643d370-6497-4a94-b0e7-2db66b56b687\") " pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.703167 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxjhw\" (UniqueName: \"kubernetes.io/projected/b3a5cd30-f098-4e9c-bbb0-f45305893017-kube-api-access-mxjhw\") pod \"frr-k8s-webhook-server-7fcb986d4-slqp5\" (UID: \"b3a5cd30-f098-4e9c-bbb0-f45305893017\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-slqp5" Dec 03 11:22:59 crc 
kubenswrapper[4702]: I1203 11:22:59.703233 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7643d370-6497-4a94-b0e7-2db66b56b687-metrics-certs\") pod \"frr-k8s-wh75l\" (UID: \"7643d370-6497-4a94-b0e7-2db66b56b687\") " pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.805045 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7643d370-6497-4a94-b0e7-2db66b56b687-frr-startup\") pod \"frr-k8s-wh75l\" (UID: \"7643d370-6497-4a94-b0e7-2db66b56b687\") " pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.805096 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3a5cd30-f098-4e9c-bbb0-f45305893017-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-slqp5\" (UID: \"b3a5cd30-f098-4e9c-bbb0-f45305893017\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-slqp5" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.805119 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7643d370-6497-4a94-b0e7-2db66b56b687-frr-conf\") pod \"frr-k8s-wh75l\" (UID: \"7643d370-6497-4a94-b0e7-2db66b56b687\") " pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.805141 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7643d370-6497-4a94-b0e7-2db66b56b687-reloader\") pod \"frr-k8s-wh75l\" (UID: \"7643d370-6497-4a94-b0e7-2db66b56b687\") " pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.805164 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c-cert\") pod \"controller-f8648f98b-2npsf\" (UID: \"d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c\") " pod="metallb-system/controller-f8648f98b-2npsf" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.805197 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmxj7\" (UniqueName: \"kubernetes.io/projected/7643d370-6497-4a94-b0e7-2db66b56b687-kube-api-access-wmxj7\") pod \"frr-k8s-wh75l\" (UID: \"7643d370-6497-4a94-b0e7-2db66b56b687\") " pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.805241 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4be204bf-b480-4d77-9ced-34c6668afa14-memberlist\") pod \"speaker-bv8pf\" (UID: \"4be204bf-b480-4d77-9ced-34c6668afa14\") " pod="metallb-system/speaker-bv8pf" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.805281 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxjhw\" (UniqueName: \"kubernetes.io/projected/b3a5cd30-f098-4e9c-bbb0-f45305893017-kube-api-access-mxjhw\") pod \"frr-k8s-webhook-server-7fcb986d4-slqp5\" (UID: \"b3a5cd30-f098-4e9c-bbb0-f45305893017\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-slqp5" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.805321 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tpc5d\" (UniqueName: \"kubernetes.io/projected/4be204bf-b480-4d77-9ced-34c6668afa14-kube-api-access-tpc5d\") pod \"speaker-bv8pf\" (UID: \"4be204bf-b480-4d77-9ced-34c6668afa14\") " pod="metallb-system/speaker-bv8pf" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.805344 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7643d370-6497-4a94-b0e7-2db66b56b687-metrics-certs\") pod \"frr-k8s-wh75l\" (UID: \"7643d370-6497-4a94-b0e7-2db66b56b687\") " pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.805362 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hzbj\" (UniqueName: \"kubernetes.io/projected/d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c-kube-api-access-9hzbj\") pod \"controller-f8648f98b-2npsf\" (UID: \"d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c\") " pod="metallb-system/controller-f8648f98b-2npsf" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.805389 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c-metrics-certs\") pod \"controller-f8648f98b-2npsf\" (UID: \"d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c\") " pod="metallb-system/controller-f8648f98b-2npsf" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.805491 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4be204bf-b480-4d77-9ced-34c6668afa14-metrics-certs\") pod \"speaker-bv8pf\" (UID: \"4be204bf-b480-4d77-9ced-34c6668afa14\") " pod="metallb-system/speaker-bv8pf" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.805515 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4be204bf-b480-4d77-9ced-34c6668afa14-metallb-excludel2\") pod \"speaker-bv8pf\" (UID: \"4be204bf-b480-4d77-9ced-34c6668afa14\") " pod="metallb-system/speaker-bv8pf" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.805563 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7643d370-6497-4a94-b0e7-2db66b56b687-metrics\") pod \"frr-k8s-wh75l\" (UID: \"7643d370-6497-4a94-b0e7-2db66b56b687\") " pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.805589 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7643d370-6497-4a94-b0e7-2db66b56b687-frr-sockets\") pod \"frr-k8s-wh75l\" (UID: \"7643d370-6497-4a94-b0e7-2db66b56b687\") " pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.805975 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7643d370-6497-4a94-b0e7-2db66b56b687-frr-sockets\") pod \"frr-k8s-wh75l\" (UID: \"7643d370-6497-4a94-b0e7-2db66b56b687\") " pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.806285 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7643d370-6497-4a94-b0e7-2db66b56b687-frr-conf\") pod \"frr-k8s-wh75l\" (UID: \"7643d370-6497-4a94-b0e7-2db66b56b687\") " 
pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.806441 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7643d370-6497-4a94-b0e7-2db66b56b687-frr-startup\") pod \"frr-k8s-wh75l\" (UID: \"7643d370-6497-4a94-b0e7-2db66b56b687\") " pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.806538 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7643d370-6497-4a94-b0e7-2db66b56b687-reloader\") pod \"frr-k8s-wh75l\" (UID: \"7643d370-6497-4a94-b0e7-2db66b56b687\") " pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.806915 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7643d370-6497-4a94-b0e7-2db66b56b687-metrics\") pod \"frr-k8s-wh75l\" (UID: \"7643d370-6497-4a94-b0e7-2db66b56b687\") " pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.811997 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7643d370-6497-4a94-b0e7-2db66b56b687-metrics-certs\") pod \"frr-k8s-wh75l\" (UID: \"7643d370-6497-4a94-b0e7-2db66b56b687\") " pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.814372 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b3a5cd30-f098-4e9c-bbb0-f45305893017-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-slqp5\" (UID: \"b3a5cd30-f098-4e9c-bbb0-f45305893017\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-slqp5" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.827363 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmxj7\" (UniqueName: \"kubernetes.io/projected/7643d370-6497-4a94-b0e7-2db66b56b687-kube-api-access-wmxj7\") pod \"frr-k8s-wh75l\" (UID: \"7643d370-6497-4a94-b0e7-2db66b56b687\") " pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.831970 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxjhw\" (UniqueName: \"kubernetes.io/projected/b3a5cd30-f098-4e9c-bbb0-f45305893017-kube-api-access-mxjhw\") pod \"frr-k8s-webhook-server-7fcb986d4-slqp5\" (UID: \"b3a5cd30-f098-4e9c-bbb0-f45305893017\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-slqp5" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.852710 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-wh75l" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.861653 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-slqp5" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.907030 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4be204bf-b480-4d77-9ced-34c6668afa14-memberlist\") pod \"speaker-bv8pf\" (UID: \"4be204bf-b480-4d77-9ced-34c6668afa14\") " pod="metallb-system/speaker-bv8pf" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.907077 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpc5d\" (UniqueName: \"kubernetes.io/projected/4be204bf-b480-4d77-9ced-34c6668afa14-kube-api-access-tpc5d\") pod \"speaker-bv8pf\" (UID: \"4be204bf-b480-4d77-9ced-34c6668afa14\") " pod="metallb-system/speaker-bv8pf" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.907100 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hzbj\" (UniqueName: \"kubernetes.io/projected/d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c-kube-api-access-9hzbj\") pod \"controller-f8648f98b-2npsf\" (UID: \"d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c\") " pod="metallb-system/controller-f8648f98b-2npsf" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.907127 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c-metrics-certs\") pod \"controller-f8648f98b-2npsf\" (UID: \"d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c\") " pod="metallb-system/controller-f8648f98b-2npsf" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.907184 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4be204bf-b480-4d77-9ced-34c6668afa14-metrics-certs\") pod \"speaker-bv8pf\" (UID: \"4be204bf-b480-4d77-9ced-34c6668afa14\") " pod="metallb-system/speaker-bv8pf" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.907198 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4be204bf-b480-4d77-9ced-34c6668afa14-metallb-excludel2\") pod \"speaker-bv8pf\" (UID: \"4be204bf-b480-4d77-9ced-34c6668afa14\") " pod="metallb-system/speaker-bv8pf" Dec 03 11:22:59 crc kubenswrapper[4702]: E1203 11:22:59.907200 4702 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.907241 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c-cert\") pod \"controller-f8648f98b-2npsf\" (UID: \"d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c\") " pod="metallb-system/controller-f8648f98b-2npsf" Dec 03 11:22:59 crc kubenswrapper[4702]: E1203 11:22:59.907277 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4be204bf-b480-4d77-9ced-34c6668afa14-memberlist podName:4be204bf-b480-4d77-9ced-34c6668afa14 nodeName:}" failed. No retries permitted until 2025-12-03 11:23:00.407252877 +0000 UTC m=+1164.243181341 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4be204bf-b480-4d77-9ced-34c6668afa14-memberlist") pod "speaker-bv8pf" (UID: "4be204bf-b480-4d77-9ced-34c6668afa14") : secret "metallb-memberlist" not found Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.908456 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4be204bf-b480-4d77-9ced-34c6668afa14-metallb-excludel2\") pod \"speaker-bv8pf\" (UID: \"4be204bf-b480-4d77-9ced-34c6668afa14\") " pod="metallb-system/speaker-bv8pf" Dec 03 11:22:59 crc kubenswrapper[4702]: E1203 11:22:59.908524 4702 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 03 11:22:59 crc kubenswrapper[4702]: E1203 11:22:59.908552 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4be204bf-b480-4d77-9ced-34c6668afa14-metrics-certs podName:4be204bf-b480-4d77-9ced-34c6668afa14 nodeName:}" failed. No retries permitted until 2025-12-03 11:23:00.408543324 +0000 UTC m=+1164.244471788 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4be204bf-b480-4d77-9ced-34c6668afa14-metrics-certs") pod "speaker-bv8pf" (UID: "4be204bf-b480-4d77-9ced-34c6668afa14") : secret "speaker-certs-secret" not found Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.910315 4702 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.912469 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c-metrics-certs\") pod \"controller-f8648f98b-2npsf\" (UID: \"d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c\") " pod="metallb-system/controller-f8648f98b-2npsf" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.927179 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c-cert\") pod \"controller-f8648f98b-2npsf\" (UID: \"d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c\") " pod="metallb-system/controller-f8648f98b-2npsf" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.928915 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hzbj\" (UniqueName: \"kubernetes.io/projected/d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c-kube-api-access-9hzbj\") pod \"controller-f8648f98b-2npsf\" (UID: \"d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c\") " pod="metallb-system/controller-f8648f98b-2npsf" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.932064 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpc5d\" (UniqueName: \"kubernetes.io/projected/4be204bf-b480-4d77-9ced-34c6668afa14-kube-api-access-tpc5d\") pod \"speaker-bv8pf\" (UID: \"4be204bf-b480-4d77-9ced-34c6668afa14\") " pod="metallb-system/speaker-bv8pf" Dec 03 11:22:59 crc kubenswrapper[4702]: I1203 11:22:59.970937 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-2npsf" Dec 03 11:23:00 crc kubenswrapper[4702]: I1203 11:23:00.117657 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wh75l" event={"ID":"7643d370-6497-4a94-b0e7-2db66b56b687","Type":"ContainerStarted","Data":"119c21e7dd476c9177d2ac5e4fde6f2508868ac2302c2a440f3f1e5b402171c9"} Dec 03 11:23:00 crc kubenswrapper[4702]: I1203 11:23:00.356695 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-slqp5"] Dec 03 11:23:00 crc kubenswrapper[4702]: W1203 11:23:00.388960 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3a5cd30_f098_4e9c_bbb0_f45305893017.slice/crio-22bddc86a8c8f1c4f9a6d553b57d71a93e03d265185d37dc933c5ca0a2f72661 WatchSource:0}: Error finding container 22bddc86a8c8f1c4f9a6d553b57d71a93e03d265185d37dc933c5ca0a2f72661: Status 404 returned error can't find the container with id 22bddc86a8c8f1c4f9a6d553b57d71a93e03d265185d37dc933c5ca0a2f72661 Dec 03 11:23:00 crc kubenswrapper[4702]: I1203 11:23:00.427790 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4be204bf-b480-4d77-9ced-34c6668afa14-memberlist\") pod \"speaker-bv8pf\" (UID: \"4be204bf-b480-4d77-9ced-34c6668afa14\") " pod="metallb-system/speaker-bv8pf" Dec 03 11:23:00 crc kubenswrapper[4702]: I1203 11:23:00.427930 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4be204bf-b480-4d77-9ced-34c6668afa14-metrics-certs\") pod \"speaker-bv8pf\" (UID: \"4be204bf-b480-4d77-9ced-34c6668afa14\") " pod="metallb-system/speaker-bv8pf" Dec 03 11:23:00 crc kubenswrapper[4702]: E1203 11:23:00.427959 4702 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 11:23:00 crc kubenswrapper[4702]: E1203 11:23:00.428049 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4be204bf-b480-4d77-9ced-34c6668afa14-memberlist podName:4be204bf-b480-4d77-9ced-34c6668afa14 nodeName:}" failed. No retries permitted until 2025-12-03 11:23:01.428028079 +0000 UTC m=+1165.263956603 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4be204bf-b480-4d77-9ced-34c6668afa14-memberlist") pod "speaker-bv8pf" (UID: "4be204bf-b480-4d77-9ced-34c6668afa14") : secret "metallb-memberlist" not found Dec 03 11:23:00 crc kubenswrapper[4702]: I1203 11:23:00.436045 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4be204bf-b480-4d77-9ced-34c6668afa14-metrics-certs\") pod \"speaker-bv8pf\" (UID: \"4be204bf-b480-4d77-9ced-34c6668afa14\") " pod="metallb-system/speaker-bv8pf" Dec 03 11:23:00 crc kubenswrapper[4702]: I1203 11:23:00.565586 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-2npsf"] Dec 03 11:23:00 crc kubenswrapper[4702]: W1203 11:23:00.589179 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6c1b66f_fa02_4889_a7fe_f7fe0d467c7c.slice/crio-878f75ca2ee34d55fcb85082e6b687ae9c242ee707737f50f01d517c7f0e3a78 WatchSource:0}: Error finding container 878f75ca2ee34d55fcb85082e6b687ae9c242ee707737f50f01d517c7f0e3a78: Status 404 returned error can't find the container with id 878f75ca2ee34d55fcb85082e6b687ae9c242ee707737f50f01d517c7f0e3a78 Dec 03 11:23:01 crc kubenswrapper[4702]: I1203 11:23:01.143457 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-2npsf" event={"ID":"d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c","Type":"ContainerStarted","Data":"878f75ca2ee34d55fcb85082e6b687ae9c242ee707737f50f01d517c7f0e3a78"} Dec 03 11:23:01 crc kubenswrapper[4702]: I1203 11:23:01.144569 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-slqp5" event={"ID":"b3a5cd30-f098-4e9c-bbb0-f45305893017","Type":"ContainerStarted","Data":"22bddc86a8c8f1c4f9a6d553b57d71a93e03d265185d37dc933c5ca0a2f72661"} Dec 03 11:23:01 crc kubenswrapper[4702]: I1203 11:23:01.448037 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4be204bf-b480-4d77-9ced-34c6668afa14-memberlist\") pod \"speaker-bv8pf\" (UID: \"4be204bf-b480-4d77-9ced-34c6668afa14\") " pod="metallb-system/speaker-bv8pf" Dec 03 11:23:01 crc kubenswrapper[4702]: I1203 11:23:01.472021 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4be204bf-b480-4d77-9ced-34c6668afa14-memberlist\") pod \"speaker-bv8pf\" (UID: \"4be204bf-b480-4d77-9ced-34c6668afa14\") " pod="metallb-system/speaker-bv8pf" Dec 03 11:23:01 crc kubenswrapper[4702]: I1203 11:23:01.765772 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-bv8pf" Dec 03 11:23:01 crc kubenswrapper[4702]: W1203 11:23:01.855970 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4be204bf_b480_4d77_9ced_34c6668afa14.slice/crio-f7241732f36789029f1199dc9e9421ab05afa03487cbca23c8234887d98c87a5 WatchSource:0}: Error finding container f7241732f36789029f1199dc9e9421ab05afa03487cbca23c8234887d98c87a5: Status 404 returned error can't find the container with id f7241732f36789029f1199dc9e9421ab05afa03487cbca23c8234887d98c87a5 Dec 03 11:23:02 crc kubenswrapper[4702]: I1203 11:23:02.165641 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-2npsf" event={"ID":"d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c","Type":"ContainerStarted","Data":"bdcd8574feb56a6495eb08b794c09ea1df6c5476ef29b0e3b699a4b304d8d3d2"} Dec 03 11:23:02 crc kubenswrapper[4702]: I1203 11:23:02.166090 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-2npsf" event={"ID":"d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c","Type":"ContainerStarted","Data":"e1bc297d19a6929404aa86b33b8e2ccef3114dd77f28c5099de3bc066165fec0"} Dec 03 11:23:02 crc kubenswrapper[4702]: I1203 11:23:02.166110 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-2npsf" Dec 03 11:23:02 crc kubenswrapper[4702]: I1203 11:23:02.170106 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bv8pf" event={"ID":"4be204bf-b480-4d77-9ced-34c6668afa14","Type":"ContainerStarted","Data":"9b6f385cb92b755ea3c8cd87f0ee9d9d2ee381047ee19125e24be8ca6e76a3d6"} Dec 03 11:23:02 crc kubenswrapper[4702]: I1203 11:23:02.170167 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bv8pf" event={"ID":"4be204bf-b480-4d77-9ced-34c6668afa14","Type":"ContainerStarted","Data":"f7241732f36789029f1199dc9e9421ab05afa03487cbca23c8234887d98c87a5"} Dec 03 11:23:02 crc kubenswrapper[4702]: I1203 11:23:02.199011 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-2npsf" podStartSLOduration=3.198989545 podStartE2EDuration="3.198989545s" podCreationTimestamp="2025-12-03 11:22:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:23:02.187595972 +0000 UTC m=+1166.023524436" watchObservedRunningTime="2025-12-03 11:23:02.198989545 +0000 UTC m=+1166.034918009" Dec 03 11:23:03 crc kubenswrapper[4702]: I1203 11:23:03.198966 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bv8pf" event={"ID":"4be204bf-b480-4d77-9ced-34c6668afa14","Type":"ContainerStarted","Data":"6166d10f735a273e008b2b4bfada9f9bbfd760b88f39efe7b2517cb5cade2f94"} Dec 03 11:23:03 crc kubenswrapper[4702]: I1203 11:23:03.199379 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-bv8pf" Dec 03 11:23:03 crc kubenswrapper[4702]: I1203 11:23:03.216467 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-bv8pf" podStartSLOduration=4.216443955 podStartE2EDuration="4.216443955s" podCreationTimestamp="2025-12-03 11:22:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:23:03.215103767 +0000 UTC m=+1167.051032231" 
watchObservedRunningTime="2025-12-03 11:23:03.216443955 +0000 UTC m=+1167.052372419" Dec 03 11:23:11 crc kubenswrapper[4702]: I1203 11:23:11.364567 4702 generic.go:334] "Generic (PLEG): container finished" podID="7643d370-6497-4a94-b0e7-2db66b56b687" containerID="4e7d1d2a50f6f14e59bdb96a4d387e2f11183e1c5f4d855fd3e12d3509649275" exitCode=0 Dec 03 11:23:11 crc kubenswrapper[4702]: I1203 11:23:11.365212 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wh75l" event={"ID":"7643d370-6497-4a94-b0e7-2db66b56b687","Type":"ContainerDied","Data":"4e7d1d2a50f6f14e59bdb96a4d387e2f11183e1c5f4d855fd3e12d3509649275"} Dec 03 11:23:11 crc kubenswrapper[4702]: I1203 11:23:11.374040 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-slqp5" event={"ID":"b3a5cd30-f098-4e9c-bbb0-f45305893017","Type":"ContainerStarted","Data":"3932ae1e798d342df8eed400f6f500e6812790f9e559a4bb970c698a7e95080b"} Dec 03 11:23:11 crc kubenswrapper[4702]: I1203 11:23:11.375162 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-slqp5" Dec 03 11:23:11 crc kubenswrapper[4702]: I1203 11:23:11.432216 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-slqp5" podStartSLOduration=2.709780479 podStartE2EDuration="12.432172158s" podCreationTimestamp="2025-12-03 11:22:59 +0000 UTC" firstStartedPulling="2025-12-03 11:23:00.391476025 +0000 UTC m=+1164.227404489" lastFinishedPulling="2025-12-03 11:23:10.113867704 +0000 UTC m=+1173.949796168" observedRunningTime="2025-12-03 11:23:11.424124741 +0000 UTC m=+1175.260053215" watchObservedRunningTime="2025-12-03 11:23:11.432172158 +0000 UTC m=+1175.268100622" Dec 03 11:23:12 crc kubenswrapper[4702]: I1203 11:23:12.387031 4702 generic.go:334] "Generic (PLEG): container finished" podID="7643d370-6497-4a94-b0e7-2db66b56b687" containerID="ac153f2a22e96a577b35ad74b8f025a0b9e36435713c768a2de797e12c0bbec0" exitCode=0 Dec 03 11:23:12 crc kubenswrapper[4702]: I1203 11:23:12.387119 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wh75l" event={"ID":"7643d370-6497-4a94-b0e7-2db66b56b687","Type":"ContainerDied","Data":"ac153f2a22e96a577b35ad74b8f025a0b9e36435713c768a2de797e12c0bbec0"} Dec 03 11:23:12 crc kubenswrapper[4702]: E1203 11:23:12.836748 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7643d370_6497_4a94_b0e7_2db66b56b687.slice/crio-conmon-df3cd57546e8ad033b91ccbcb3ff7a30631ec4d24e0dea664f4c7c186f09f31e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7643d370_6497_4a94_b0e7_2db66b56b687.slice/crio-df3cd57546e8ad033b91ccbcb3ff7a30631ec4d24e0dea664f4c7c186f09f31e.scope\": RecentStats: unable to find data in memory cache]" Dec 03 11:23:13 crc kubenswrapper[4702]: I1203 11:23:13.432586 4702 generic.go:334] "Generic (PLEG): container finished" podID="7643d370-6497-4a94-b0e7-2db66b56b687" containerID="df3cd57546e8ad033b91ccbcb3ff7a30631ec4d24e0dea664f4c7c186f09f31e" exitCode=0 Dec 03 11:23:13 crc kubenswrapper[4702]: I1203 11:23:13.432672 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wh75l" 
event={"ID":"7643d370-6497-4a94-b0e7-2db66b56b687","Type":"ContainerDied","Data":"df3cd57546e8ad033b91ccbcb3ff7a30631ec4d24e0dea664f4c7c186f09f31e"} Dec 03 11:23:14 crc kubenswrapper[4702]: I1203 11:23:14.448902 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wh75l" event={"ID":"7643d370-6497-4a94-b0e7-2db66b56b687","Type":"ContainerStarted","Data":"b691c66c23cbac60c20ec2eeff46603a5d0144413a28372e37d3963f2c7f7f54"} Dec 03 11:23:14 crc kubenswrapper[4702]: I1203 11:23:14.449285 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wh75l" event={"ID":"7643d370-6497-4a94-b0e7-2db66b56b687","Type":"ContainerStarted","Data":"6c3456118d7ad02121ee6dd402a92fdc6fd333bbb01682676c3c4f9b5e0e024b"} Dec 03 11:23:14 crc kubenswrapper[4702]: I1203 11:23:14.449303 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wh75l" event={"ID":"7643d370-6497-4a94-b0e7-2db66b56b687","Type":"ContainerStarted","Data":"2e9901bec8566076e75f91059a59d717d36b63bac1c4bbb77850591b24d67bb2"} Dec 03 11:23:14 crc kubenswrapper[4702]: I1203 11:23:14.449314 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wh75l" event={"ID":"7643d370-6497-4a94-b0e7-2db66b56b687","Type":"ContainerStarted","Data":"77a4a7a90097f8f7d0f8a0ddd8172be0f7509652371e99e0849363629ca7e9d7"} Dec 03 11:23:15 crc kubenswrapper[4702]: I1203 11:23:15.676554 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wh75l" event={"ID":"7643d370-6497-4a94-b0e7-2db66b56b687","Type":"ContainerStarted","Data":"2f186982bb28c2e6531c4fc209239a28c655f3c070966445086d9277ceb58d3c"} Dec 03 11:23:15 crc kubenswrapper[4702]: I1203 11:23:15.677220 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-wh75l" Dec 03 11:23:15 crc kubenswrapper[4702]: I1203 11:23:15.677259 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wh75l" event={"ID":"7643d370-6497-4a94-b0e7-2db66b56b687","Type":"ContainerStarted","Data":"13b4bbeaea4d7e73e1d244172df92e1381f068383e3d8c335b49265f0454964a"} Dec 03 11:23:15 crc kubenswrapper[4702]: I1203 11:23:15.716928 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-wh75l" podStartSLOduration=6.691135217 podStartE2EDuration="16.716897658s" podCreationTimestamp="2025-12-03 11:22:59 +0000 UTC" firstStartedPulling="2025-12-03 11:23:00.089142483 +0000 UTC m=+1163.925070957" lastFinishedPulling="2025-12-03 11:23:10.114904934 +0000 UTC m=+1173.950833398" observedRunningTime="2025-12-03 11:23:15.711252648 +0000 UTC m=+1179.547181112" watchObservedRunningTime="2025-12-03 11:23:15.716897658 +0000 UTC m=+1179.552826132" Dec 03 11:23:19 crc kubenswrapper[4702]: I1203 11:23:19.853367 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-wh75l" Dec 03 11:23:19 crc kubenswrapper[4702]: I1203 11:23:19.897408 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-wh75l" Dec 03 11:23:19 crc kubenswrapper[4702]: I1203 11:23:19.979482 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-2npsf" Dec 03 11:23:21 crc kubenswrapper[4702]: I1203 11:23:21.772985 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-bv8pf" Dec 03 11:23:24 crc kubenswrapper[4702]: I1203 11:23:24.894833 4702 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-5fq6l"] Dec 03 11:23:24 crc kubenswrapper[4702]: I1203 11:23:24.897189 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5fq6l" Dec 03 11:23:24 crc kubenswrapper[4702]: I1203 11:23:24.906523 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-rdzh6" Dec 03 11:23:24 crc kubenswrapper[4702]: I1203 11:23:24.906875 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 03 11:23:24 crc kubenswrapper[4702]: I1203 11:23:24.907155 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 03 11:23:24 crc kubenswrapper[4702]: I1203 11:23:24.910700 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5fq6l"] Dec 03 11:23:24 crc kubenswrapper[4702]: I1203 11:23:24.948917 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx8nx\" (UniqueName: \"kubernetes.io/projected/98df982f-4154-487c-b324-d8c63e616127-kube-api-access-vx8nx\") pod \"openstack-operator-index-5fq6l\" (UID: \"98df982f-4154-487c-b324-d8c63e616127\") " pod="openstack-operators/openstack-operator-index-5fq6l" Dec 03 11:23:25 crc kubenswrapper[4702]: I1203 11:23:25.064316 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx8nx\" (UniqueName: \"kubernetes.io/projected/98df982f-4154-487c-b324-d8c63e616127-kube-api-access-vx8nx\") pod \"openstack-operator-index-5fq6l\" (UID: \"98df982f-4154-487c-b324-d8c63e616127\") " pod="openstack-operators/openstack-operator-index-5fq6l" Dec 03 11:23:25 crc kubenswrapper[4702]: I1203 11:23:25.090714 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx8nx\" (UniqueName: \"kubernetes.io/projected/98df982f-4154-487c-b324-d8c63e616127-kube-api-access-vx8nx\") pod \"openstack-operator-index-5fq6l\" (UID: \"98df982f-4154-487c-b324-d8c63e616127\") " pod="openstack-operators/openstack-operator-index-5fq6l" Dec 03 11:23:25 crc kubenswrapper[4702]: I1203 11:23:25.250177 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5fq6l" Dec 03 11:23:25 crc kubenswrapper[4702]: I1203 11:23:25.717408 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5fq6l"] Dec 03 11:23:25 crc kubenswrapper[4702]: I1203 11:23:25.761365 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5fq6l" event={"ID":"98df982f-4154-487c-b324-d8c63e616127","Type":"ContainerStarted","Data":"b500c4e2531bb427b6054b6e96a95e79e24764e2e5d9c67b18125d186db7ea94"} Dec 03 11:23:28 crc kubenswrapper[4702]: I1203 11:23:28.065866 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5fq6l"] Dec 03 11:23:28 crc kubenswrapper[4702]: I1203 11:23:28.676852 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-ds4ss"] Dec 03 11:23:28 crc kubenswrapper[4702]: I1203 11:23:28.678223 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-ds4ss" Dec 03 11:23:28 crc kubenswrapper[4702]: I1203 11:23:28.700346 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ds4ss"] Dec 03 11:23:28 crc kubenswrapper[4702]: I1203 11:23:28.758382 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj7tm\" (UniqueName: \"kubernetes.io/projected/9432a2a8-8932-4734-a69d-8976764f1dab-kube-api-access-mj7tm\") pod \"openstack-operator-index-ds4ss\" (UID: \"9432a2a8-8932-4734-a69d-8976764f1dab\") " pod="openstack-operators/openstack-operator-index-ds4ss" Dec 03 11:23:28 crc kubenswrapper[4702]: I1203 11:23:28.860078 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj7tm\" (UniqueName: \"kubernetes.io/projected/9432a2a8-8932-4734-a69d-8976764f1dab-kube-api-access-mj7tm\") pod \"openstack-operator-index-ds4ss\" (UID: \"9432a2a8-8932-4734-a69d-8976764f1dab\") " pod="openstack-operators/openstack-operator-index-ds4ss" Dec 03 11:23:28 crc kubenswrapper[4702]: I1203 11:23:28.880872 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj7tm\" (UniqueName: \"kubernetes.io/projected/9432a2a8-8932-4734-a69d-8976764f1dab-kube-api-access-mj7tm\") pod \"openstack-operator-index-ds4ss\" (UID: \"9432a2a8-8932-4734-a69d-8976764f1dab\") " pod="openstack-operators/openstack-operator-index-ds4ss" Dec 03 11:23:29 crc kubenswrapper[4702]: I1203 11:23:29.003663 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ds4ss" Dec 03 11:23:29 crc kubenswrapper[4702]: I1203 11:23:29.861379 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-wh75l" Dec 03 11:23:29 crc kubenswrapper[4702]: I1203 11:23:29.877826 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-slqp5" Dec 03 11:23:30 crc kubenswrapper[4702]: W1203 11:23:30.047008 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9432a2a8_8932_4734_a69d_8976764f1dab.slice/crio-6f42c2b77dfda7327cde8db498da58f411fecd66d819559ea5c6cf60e623dc20 WatchSource:0}: Error finding container 6f42c2b77dfda7327cde8db498da58f411fecd66d819559ea5c6cf60e623dc20: Status 404 returned error can't find the container with id 6f42c2b77dfda7327cde8db498da58f411fecd66d819559ea5c6cf60e623dc20 Dec 03 11:23:30 crc kubenswrapper[4702]: I1203 11:23:30.050209 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ds4ss"] Dec 03 11:23:30 crc kubenswrapper[4702]: I1203 11:23:30.825034 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5fq6l" event={"ID":"98df982f-4154-487c-b324-d8c63e616127","Type":"ContainerStarted","Data":"38ee6f39b6f235410ce8d63e3a0e1be50b38569dc1b84f50f66bdc78731a25db"} Dec 03 11:23:30 crc kubenswrapper[4702]: I1203 11:23:30.825726 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-5fq6l" podUID="98df982f-4154-487c-b324-d8c63e616127" containerName="registry-server" containerID="cri-o://38ee6f39b6f235410ce8d63e3a0e1be50b38569dc1b84f50f66bdc78731a25db" gracePeriod=2 Dec 03 11:23:30 crc kubenswrapper[4702]: I1203 
11:23:30.829656 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ds4ss" event={"ID":"9432a2a8-8932-4734-a69d-8976764f1dab","Type":"ContainerStarted","Data":"c61463db247d9a3ed644ddbb258dd0073cef8188651aafd572c73a0ead900b89"} Dec 03 11:23:30 crc kubenswrapper[4702]: I1203 11:23:30.829726 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ds4ss" event={"ID":"9432a2a8-8932-4734-a69d-8976764f1dab","Type":"ContainerStarted","Data":"6f42c2b77dfda7327cde8db498da58f411fecd66d819559ea5c6cf60e623dc20"} Dec 03 11:23:30 crc kubenswrapper[4702]: I1203 11:23:30.846337 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-5fq6l" podStartSLOduration=2.900462212 podStartE2EDuration="6.84631706s" podCreationTimestamp="2025-12-03 11:23:24 +0000 UTC" firstStartedPulling="2025-12-03 11:23:25.726219425 +0000 UTC m=+1189.562147889" lastFinishedPulling="2025-12-03 11:23:29.672074273 +0000 UTC m=+1193.508002737" observedRunningTime="2025-12-03 11:23:30.84171567 +0000 UTC m=+1194.677644134" watchObservedRunningTime="2025-12-03 11:23:30.84631706 +0000 UTC m=+1194.682245524" Dec 03 11:23:30 crc kubenswrapper[4702]: I1203 11:23:30.865921 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-ds4ss" podStartSLOduration=2.792542878 podStartE2EDuration="2.865895353s" podCreationTimestamp="2025-12-03 11:23:28 +0000 UTC" firstStartedPulling="2025-12-03 11:23:30.052018431 +0000 UTC m=+1193.887946895" lastFinishedPulling="2025-12-03 11:23:30.125370906 +0000 UTC m=+1193.961299370" observedRunningTime="2025-12-03 11:23:30.864121753 +0000 UTC m=+1194.700050227" watchObservedRunningTime="2025-12-03 11:23:30.865895353 +0000 UTC m=+1194.701823807" Dec 03 11:23:31 crc kubenswrapper[4702]: I1203 11:23:31.263984 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5fq6l" Dec 03 11:23:31 crc kubenswrapper[4702]: I1203 11:23:31.453671 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx8nx\" (UniqueName: \"kubernetes.io/projected/98df982f-4154-487c-b324-d8c63e616127-kube-api-access-vx8nx\") pod \"98df982f-4154-487c-b324-d8c63e616127\" (UID: \"98df982f-4154-487c-b324-d8c63e616127\") " Dec 03 11:23:31 crc kubenswrapper[4702]: I1203 11:23:31.461820 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98df982f-4154-487c-b324-d8c63e616127-kube-api-access-vx8nx" (OuterVolumeSpecName: "kube-api-access-vx8nx") pod "98df982f-4154-487c-b324-d8c63e616127" (UID: "98df982f-4154-487c-b324-d8c63e616127"). InnerVolumeSpecName "kube-api-access-vx8nx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:23:31 crc kubenswrapper[4702]: I1203 11:23:31.555432 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx8nx\" (UniqueName: \"kubernetes.io/projected/98df982f-4154-487c-b324-d8c63e616127-kube-api-access-vx8nx\") on node \"crc\" DevicePath \"\"" Dec 03 11:23:31 crc kubenswrapper[4702]: I1203 11:23:31.838498 4702 generic.go:334] "Generic (PLEG): container finished" podID="98df982f-4154-487c-b324-d8c63e616127" containerID="38ee6f39b6f235410ce8d63e3a0e1be50b38569dc1b84f50f66bdc78731a25db" exitCode=0 Dec 03 11:23:31 crc kubenswrapper[4702]: I1203 11:23:31.838567 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-5fq6l" Dec 03 11:23:31 crc kubenswrapper[4702]: I1203 11:23:31.838601 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5fq6l" event={"ID":"98df982f-4154-487c-b324-d8c63e616127","Type":"ContainerDied","Data":"38ee6f39b6f235410ce8d63e3a0e1be50b38569dc1b84f50f66bdc78731a25db"} Dec 03 11:23:31 crc kubenswrapper[4702]: I1203 11:23:31.838643 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5fq6l" event={"ID":"98df982f-4154-487c-b324-d8c63e616127","Type":"ContainerDied","Data":"b500c4e2531bb427b6054b6e96a95e79e24764e2e5d9c67b18125d186db7ea94"} Dec 03 11:23:31 crc kubenswrapper[4702]: I1203 11:23:31.838692 4702 scope.go:117] "RemoveContainer" containerID="38ee6f39b6f235410ce8d63e3a0e1be50b38569dc1b84f50f66bdc78731a25db" Dec 03 11:23:31 crc kubenswrapper[4702]: I1203 11:23:31.864082 4702 scope.go:117] "RemoveContainer" containerID="38ee6f39b6f235410ce8d63e3a0e1be50b38569dc1b84f50f66bdc78731a25db" Dec 03 11:23:31 crc kubenswrapper[4702]: E1203 11:23:31.865851 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38ee6f39b6f235410ce8d63e3a0e1be50b38569dc1b84f50f66bdc78731a25db\": container with ID starting with 38ee6f39b6f235410ce8d63e3a0e1be50b38569dc1b84f50f66bdc78731a25db not found: ID does not exist" containerID="38ee6f39b6f235410ce8d63e3a0e1be50b38569dc1b84f50f66bdc78731a25db" Dec 03 11:23:31 crc kubenswrapper[4702]: I1203 11:23:31.865913 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38ee6f39b6f235410ce8d63e3a0e1be50b38569dc1b84f50f66bdc78731a25db"} err="failed to get container status \"38ee6f39b6f235410ce8d63e3a0e1be50b38569dc1b84f50f66bdc78731a25db\": rpc error: code = NotFound desc = could not find container \"38ee6f39b6f235410ce8d63e3a0e1be50b38569dc1b84f50f66bdc78731a25db\": container with ID starting with 38ee6f39b6f235410ce8d63e3a0e1be50b38569dc1b84f50f66bdc78731a25db not found: ID does not exist" Dec 03 11:23:31 crc kubenswrapper[4702]: I1203 11:23:31.870069 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5fq6l"] Dec 03 11:23:31 crc kubenswrapper[4702]: I1203 11:23:31.876622 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-5fq6l"] Dec 03 11:23:32 crc kubenswrapper[4702]: I1203 11:23:32.940225 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98df982f-4154-487c-b324-d8c63e616127" path="/var/lib/kubelet/pods/98df982f-4154-487c-b324-d8c63e616127/volumes" Dec 03 11:23:39 crc kubenswrapper[4702]: I1203 11:23:39.004748 4702 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-ds4ss" Dec 03 11:23:39 crc kubenswrapper[4702]: I1203 11:23:39.005618 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-ds4ss" Dec 03 11:23:39 crc kubenswrapper[4702]: I1203 11:23:39.044859 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-ds4ss" Dec 03 11:23:39 crc kubenswrapper[4702]: I1203 11:23:39.960451 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-ds4ss" Dec 03 11:23:45 crc kubenswrapper[4702]: I1203 11:23:45.264102 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9"] Dec 03 11:23:45 crc kubenswrapper[4702]: E1203 11:23:45.265143 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98df982f-4154-487c-b324-d8c63e616127" containerName="registry-server" Dec 03 11:23:45 crc kubenswrapper[4702]: I1203 11:23:45.265161 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="98df982f-4154-487c-b324-d8c63e616127" containerName="registry-server" Dec 03 11:23:45 crc kubenswrapper[4702]: I1203 11:23:45.265395 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="98df982f-4154-487c-b324-d8c63e616127" containerName="registry-server" Dec 03 11:23:45 crc kubenswrapper[4702]: I1203 11:23:45.266955 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9" Dec 03 11:23:45 crc kubenswrapper[4702]: I1203 11:23:45.273282 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-5phnh" Dec 03 11:23:45 crc kubenswrapper[4702]: I1203 11:23:45.276609 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9"] Dec 03 11:23:45 crc kubenswrapper[4702]: I1203 11:23:45.403454 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b452d48a-0cf2-4958-8c23-32ed4d808c7b-bundle\") pod \"389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9\" (UID: \"b452d48a-0cf2-4958-8c23-32ed4d808c7b\") " pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9" Dec 03 11:23:45 crc kubenswrapper[4702]: I1203 11:23:45.403548 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b452d48a-0cf2-4958-8c23-32ed4d808c7b-util\") pod \"389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9\" (UID: \"b452d48a-0cf2-4958-8c23-32ed4d808c7b\") " pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9" Dec 03 11:23:45 crc kubenswrapper[4702]: I1203 11:23:45.403599 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jntvs\" (UniqueName: \"kubernetes.io/projected/b452d48a-0cf2-4958-8c23-32ed4d808c7b-kube-api-access-jntvs\") pod \"389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9\" (UID: \"b452d48a-0cf2-4958-8c23-32ed4d808c7b\") " pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9" 
Dec 03 11:23:45 crc kubenswrapper[4702]: I1203 11:23:45.505227 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b452d48a-0cf2-4958-8c23-32ed4d808c7b-bundle\") pod \"389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9\" (UID: \"b452d48a-0cf2-4958-8c23-32ed4d808c7b\") " pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9"
Dec 03 11:23:45 crc kubenswrapper[4702]: I1203 11:23:45.505305 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b452d48a-0cf2-4958-8c23-32ed4d808c7b-util\") pod \"389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9\" (UID: \"b452d48a-0cf2-4958-8c23-32ed4d808c7b\") " pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9"
Dec 03 11:23:45 crc kubenswrapper[4702]: I1203 11:23:45.505370 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jntvs\" (UniqueName: \"kubernetes.io/projected/b452d48a-0cf2-4958-8c23-32ed4d808c7b-kube-api-access-jntvs\") pod \"389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9\" (UID: \"b452d48a-0cf2-4958-8c23-32ed4d808c7b\") " pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9"
Dec 03 11:23:45 crc kubenswrapper[4702]: I1203 11:23:45.505919 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b452d48a-0cf2-4958-8c23-32ed4d808c7b-bundle\") pod \"389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9\" (UID: \"b452d48a-0cf2-4958-8c23-32ed4d808c7b\") " pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9"
Dec 03 11:23:45 crc kubenswrapper[4702]: I1203 11:23:45.506472 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b452d48a-0cf2-4958-8c23-32ed4d808c7b-util\") pod \"389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9\" (UID: \"b452d48a-0cf2-4958-8c23-32ed4d808c7b\") " pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9"
Dec 03 11:23:45 crc kubenswrapper[4702]: I1203 11:23:45.537153 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jntvs\" (UniqueName: \"kubernetes.io/projected/b452d48a-0cf2-4958-8c23-32ed4d808c7b-kube-api-access-jntvs\") pod \"389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9\" (UID: \"b452d48a-0cf2-4958-8c23-32ed4d808c7b\") " pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9"
Dec 03 11:23:45 crc kubenswrapper[4702]: I1203 11:23:45.594173 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9"
Dec 03 11:23:46 crc kubenswrapper[4702]: I1203 11:23:46.098987 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9"]
Dec 03 11:23:46 crc kubenswrapper[4702]: W1203 11:23:46.103563 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb452d48a_0cf2_4958_8c23_32ed4d808c7b.slice/crio-56615f4bde166433b9bb893cbf55624b51eb75139cc47e206022f793184d16e1 WatchSource:0}: Error finding container 56615f4bde166433b9bb893cbf55624b51eb75139cc47e206022f793184d16e1: Status 404 returned error can't find the container with id 56615f4bde166433b9bb893cbf55624b51eb75139cc47e206022f793184d16e1
Dec 03 11:23:46 crc kubenswrapper[4702]: I1203 11:23:46.996510 4702 generic.go:334] "Generic (PLEG): container finished" podID="b452d48a-0cf2-4958-8c23-32ed4d808c7b" containerID="cd4a23774327bc0592418adae88ed630ce9e088b75f21d64ec0c9b2cfe36a931" exitCode=0
Dec 03 11:23:46 crc kubenswrapper[4702]: I1203 11:23:46.996627 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9" event={"ID":"b452d48a-0cf2-4958-8c23-32ed4d808c7b","Type":"ContainerDied","Data":"cd4a23774327bc0592418adae88ed630ce9e088b75f21d64ec0c9b2cfe36a931"}
Dec 03 11:23:46 crc kubenswrapper[4702]: I1203 11:23:46.996911 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9" event={"ID":"b452d48a-0cf2-4958-8c23-32ed4d808c7b","Type":"ContainerStarted","Data":"56615f4bde166433b9bb893cbf55624b51eb75139cc47e206022f793184d16e1"}
Dec 03 11:23:48 crc kubenswrapper[4702]: I1203 11:23:48.010357 4702 generic.go:334] "Generic (PLEG): container finished" podID="b452d48a-0cf2-4958-8c23-32ed4d808c7b" containerID="1200dff21cb947ceed8e7c0d578f0d4cb11316459fb4123ae2aa1d8e9e6a3d91" exitCode=0
Dec 03 11:23:48 crc kubenswrapper[4702]: I1203 11:23:48.010411 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9" event={"ID":"b452d48a-0cf2-4958-8c23-32ed4d808c7b","Type":"ContainerDied","Data":"1200dff21cb947ceed8e7c0d578f0d4cb11316459fb4123ae2aa1d8e9e6a3d91"}
Dec 03 11:23:49 crc kubenswrapper[4702]: I1203 11:23:49.019952 4702 generic.go:334] "Generic (PLEG): container finished" podID="b452d48a-0cf2-4958-8c23-32ed4d808c7b" containerID="255344951be83e6929a1e7bb6a34194b12ccfa4ee142f5b68ea4f2361832ebe4" exitCode=0
Dec 03 11:23:49 crc kubenswrapper[4702]: I1203 11:23:49.020279 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9" event={"ID":"b452d48a-0cf2-4958-8c23-32ed4d808c7b","Type":"ContainerDied","Data":"255344951be83e6929a1e7bb6a34194b12ccfa4ee142f5b68ea4f2361832ebe4"}
Dec 03 11:23:50 crc kubenswrapper[4702]: I1203 11:23:50.377740 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9"
Dec 03 11:23:50 crc kubenswrapper[4702]: I1203 11:23:50.504837 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b452d48a-0cf2-4958-8c23-32ed4d808c7b-bundle\") pod \"b452d48a-0cf2-4958-8c23-32ed4d808c7b\" (UID: \"b452d48a-0cf2-4958-8c23-32ed4d808c7b\") "
Dec 03 11:23:50 crc kubenswrapper[4702]: I1203 11:23:50.505054 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jntvs\" (UniqueName: \"kubernetes.io/projected/b452d48a-0cf2-4958-8c23-32ed4d808c7b-kube-api-access-jntvs\") pod \"b452d48a-0cf2-4958-8c23-32ed4d808c7b\" (UID: \"b452d48a-0cf2-4958-8c23-32ed4d808c7b\") "
Dec 03 11:23:50 crc kubenswrapper[4702]: I1203 11:23:50.505168 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b452d48a-0cf2-4958-8c23-32ed4d808c7b-util\") pod \"b452d48a-0cf2-4958-8c23-32ed4d808c7b\" (UID: \"b452d48a-0cf2-4958-8c23-32ed4d808c7b\") "
Dec 03 11:23:50 crc kubenswrapper[4702]: I1203 11:23:50.505886 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b452d48a-0cf2-4958-8c23-32ed4d808c7b-bundle" (OuterVolumeSpecName: "bundle") pod "b452d48a-0cf2-4958-8c23-32ed4d808c7b" (UID: "b452d48a-0cf2-4958-8c23-32ed4d808c7b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 11:23:50 crc kubenswrapper[4702]: I1203 11:23:50.507613 4702 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b452d48a-0cf2-4958-8c23-32ed4d808c7b-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 11:23:50 crc kubenswrapper[4702]: I1203 11:23:50.511014 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b452d48a-0cf2-4958-8c23-32ed4d808c7b-kube-api-access-jntvs" (OuterVolumeSpecName: "kube-api-access-jntvs") pod "b452d48a-0cf2-4958-8c23-32ed4d808c7b" (UID: "b452d48a-0cf2-4958-8c23-32ed4d808c7b"). InnerVolumeSpecName "kube-api-access-jntvs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:23:50 crc kubenswrapper[4702]: I1203 11:23:50.519789 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b452d48a-0cf2-4958-8c23-32ed4d808c7b-util" (OuterVolumeSpecName: "util") pod "b452d48a-0cf2-4958-8c23-32ed4d808c7b" (UID: "b452d48a-0cf2-4958-8c23-32ed4d808c7b"). InnerVolumeSpecName "util".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:23:50 crc kubenswrapper[4702]: I1203 11:23:50.609508 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jntvs\" (UniqueName: \"kubernetes.io/projected/b452d48a-0cf2-4958-8c23-32ed4d808c7b-kube-api-access-jntvs\") on node \"crc\" DevicePath \"\"" Dec 03 11:23:50 crc kubenswrapper[4702]: I1203 11:23:50.609549 4702 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b452d48a-0cf2-4958-8c23-32ed4d808c7b-util\") on node \"crc\" DevicePath \"\"" Dec 03 11:23:51 crc kubenswrapper[4702]: I1203 11:23:51.042361 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9" event={"ID":"b452d48a-0cf2-4958-8c23-32ed4d808c7b","Type":"ContainerDied","Data":"56615f4bde166433b9bb893cbf55624b51eb75139cc47e206022f793184d16e1"} Dec 03 11:23:51 crc kubenswrapper[4702]: I1203 11:23:51.042429 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56615f4bde166433b9bb893cbf55624b51eb75139cc47e206022f793184d16e1" Dec 03 11:23:51 crc kubenswrapper[4702]: I1203 11:23:51.042564 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9" Dec 03 11:23:55 crc kubenswrapper[4702]: I1203 11:23:55.912360 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:23:55 crc kubenswrapper[4702]: I1203 11:23:55.913108 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:23:57 crc kubenswrapper[4702]: I1203 11:23:57.941345 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92"] Dec 03 11:23:57 crc kubenswrapper[4702]: E1203 11:23:57.942099 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b452d48a-0cf2-4958-8c23-32ed4d808c7b" containerName="extract" Dec 03 11:23:57 crc kubenswrapper[4702]: I1203 11:23:57.942113 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="b452d48a-0cf2-4958-8c23-32ed4d808c7b" containerName="extract" Dec 03 11:23:57 crc kubenswrapper[4702]: E1203 11:23:57.942143 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b452d48a-0cf2-4958-8c23-32ed4d808c7b" containerName="util" Dec 03 11:23:57 crc kubenswrapper[4702]: I1203 11:23:57.942150 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="b452d48a-0cf2-4958-8c23-32ed4d808c7b" containerName="util" Dec 03 11:23:57 crc kubenswrapper[4702]: E1203 11:23:57.942160 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b452d48a-0cf2-4958-8c23-32ed4d808c7b" containerName="pull" Dec 03 11:23:57 crc kubenswrapper[4702]: I1203 11:23:57.942167 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="b452d48a-0cf2-4958-8c23-32ed4d808c7b" containerName="pull" Dec 03 11:23:57 crc kubenswrapper[4702]: I1203 11:23:57.942385 4702 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b452d48a-0cf2-4958-8c23-32ed4d808c7b" containerName="extract" Dec 03 11:23:57 crc kubenswrapper[4702]: I1203 11:23:57.943506 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92" Dec 03 11:23:57 crc kubenswrapper[4702]: I1203 11:23:57.948218 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-88qv5" Dec 03 11:23:57 crc kubenswrapper[4702]: I1203 11:23:57.954448 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfpk2\" (UniqueName: \"kubernetes.io/projected/7d79c3e6-cab8-4e77-84da-85ae2a1eb3b3-kube-api-access-wfpk2\") pod \"openstack-operator-controller-operator-75b4565ff4-4pl92\" (UID: \"7d79c3e6-cab8-4e77-84da-85ae2a1eb3b3\") " pod="openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92" Dec 03 11:23:57 crc kubenswrapper[4702]: I1203 11:23:57.959188 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92"] Dec 03 11:23:58 crc kubenswrapper[4702]: I1203 11:23:58.056774 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfpk2\" (UniqueName: \"kubernetes.io/projected/7d79c3e6-cab8-4e77-84da-85ae2a1eb3b3-kube-api-access-wfpk2\") pod \"openstack-operator-controller-operator-75b4565ff4-4pl92\" (UID: \"7d79c3e6-cab8-4e77-84da-85ae2a1eb3b3\") " pod="openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92" Dec 03 11:23:58 crc kubenswrapper[4702]: I1203 11:23:58.081536 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfpk2\" (UniqueName: \"kubernetes.io/projected/7d79c3e6-cab8-4e77-84da-85ae2a1eb3b3-kube-api-access-wfpk2\") pod \"openstack-operator-controller-operator-75b4565ff4-4pl92\" (UID: \"7d79c3e6-cab8-4e77-84da-85ae2a1eb3b3\") " pod="openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92" Dec 03 11:23:58 crc kubenswrapper[4702]: I1203 11:23:58.276597 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92" Dec 03 11:23:58 crc kubenswrapper[4702]: I1203 11:23:58.777987 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92"] Dec 03 11:23:59 crc kubenswrapper[4702]: I1203 11:23:59.125383 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92" event={"ID":"7d79c3e6-cab8-4e77-84da-85ae2a1eb3b3","Type":"ContainerStarted","Data":"2a5bb016b750da263562648f6fc3b9649ddbe8e83c49d327fb47056d6e617212"} Dec 03 11:24:06 crc kubenswrapper[4702]: I1203 11:24:06.216330 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92" event={"ID":"7d79c3e6-cab8-4e77-84da-85ae2a1eb3b3","Type":"ContainerStarted","Data":"36284dcfeefe99a9bfddd6167ac936637f43293c085fecce5a3fb65e4f6c9a6d"} Dec 03 11:24:06 crc kubenswrapper[4702]: I1203 11:24:06.217149 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92" Dec 03 11:24:06 crc kubenswrapper[4702]: I1203 11:24:06.258564 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92" podStartSLOduration=2.8516544809999997 podStartE2EDuration="9.258543534s" podCreationTimestamp="2025-12-03 11:23:57 +0000 UTC" firstStartedPulling="2025-12-03 11:23:58.804698125 +0000 UTC m=+1222.640626589" lastFinishedPulling="2025-12-03 11:24:05.211587178 +0000 UTC m=+1229.047515642" observedRunningTime="2025-12-03 11:24:06.2435507 +0000 UTC m=+1230.079479184" watchObservedRunningTime="2025-12-03 11:24:06.258543534 +0000 UTC m=+1230.094471998" Dec 03 11:24:18 crc kubenswrapper[4702]: I1203 11:24:18.279739 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92" Dec 03 11:24:25 crc kubenswrapper[4702]: I1203 11:24:25.908120 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:24:25 crc kubenswrapper[4702]: I1203 11:24:25.908993 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:24:37 crc kubenswrapper[4702]: I1203 11:24:37.509732 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg"] Dec 03 11:24:37 crc kubenswrapper[4702]: I1203 11:24:37.518042 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg" Dec 03 11:24:37 crc kubenswrapper[4702]: I1203 11:24:37.525361 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-ch847" Dec 03 11:24:37 crc kubenswrapper[4702]: I1203 11:24:37.528673 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-w2vmt"] Dec 03 11:24:37 crc kubenswrapper[4702]: I1203 11:24:37.530069 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w2vmt" Dec 03 11:24:37 crc kubenswrapper[4702]: I1203 11:24:37.536504 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-ktfk2" Dec 03 11:24:37 crc kubenswrapper[4702]: I1203 11:24:37.545318 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg"] Dec 03 11:24:37 crc kubenswrapper[4702]: I1203 11:24:37.609331 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-w2vmt"] Dec 03 11:24:37 crc kubenswrapper[4702]: I1203 11:24:37.747632 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9s4t\" (UniqueName: \"kubernetes.io/projected/182ca1cb-9499-4cf7-aeae-c35c7038814c-kube-api-access-r9s4t\") pod \"cinder-operator-controller-manager-859b6ccc6-w2vmt\" (UID: \"182ca1cb-9499-4cf7-aeae-c35c7038814c\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w2vmt" Dec 03 11:24:37 crc kubenswrapper[4702]: I1203 11:24:37.747857 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l497k\" (UniqueName: \"kubernetes.io/projected/8f6320ff-4661-46be-80e1-8d97f09fe789-kube-api-access-l497k\") pod \"barbican-operator-controller-manager-7d9dfd778-m5trg\" (UID: \"8f6320ff-4661-46be-80e1-8d97f09fe789\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg" Dec 03 11:24:37 crc kubenswrapper[4702]: I1203 11:24:37.834873 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-htxmz"] Dec 03 11:24:37 crc kubenswrapper[4702]: I1203 11:24:37.836897 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-htxmz" Dec 03 11:24:37 crc kubenswrapper[4702]: I1203 11:24:37.842685 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-24djk" Dec 03 11:24:37 crc kubenswrapper[4702]: I1203 11:24:37.849582 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l497k\" (UniqueName: \"kubernetes.io/projected/8f6320ff-4661-46be-80e1-8d97f09fe789-kube-api-access-l497k\") pod \"barbican-operator-controller-manager-7d9dfd778-m5trg\" (UID: \"8f6320ff-4661-46be-80e1-8d97f09fe789\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg" Dec 03 11:24:37 crc kubenswrapper[4702]: I1203 11:24:37.849971 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9s4t\" (UniqueName: \"kubernetes.io/projected/182ca1cb-9499-4cf7-aeae-c35c7038814c-kube-api-access-r9s4t\") pod \"cinder-operator-controller-manager-859b6ccc6-w2vmt\" (UID: \"182ca1cb-9499-4cf7-aeae-c35c7038814c\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w2vmt" Dec 03 11:24:37 crc kubenswrapper[4702]: I1203 11:24:37.894395 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l497k\" (UniqueName: \"kubernetes.io/projected/8f6320ff-4661-46be-80e1-8d97f09fe789-kube-api-access-l497k\") pod \"barbican-operator-controller-manager-7d9dfd778-m5trg\" (UID: \"8f6320ff-4661-46be-80e1-8d97f09fe789\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg" Dec 03 11:24:37 crc kubenswrapper[4702]: I1203 11:24:37.895897 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c"] Dec 03 11:24:37 crc kubenswrapper[4702]: I1203 11:24:37.897615 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c" Dec 03 11:24:37 crc kubenswrapper[4702]: I1203 11:24:37.903422 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-6xx5b" Dec 03 11:24:37 crc kubenswrapper[4702]: I1203 11:24:37.915537 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9s4t\" (UniqueName: \"kubernetes.io/projected/182ca1cb-9499-4cf7-aeae-c35c7038814c-kube-api-access-r9s4t\") pod \"cinder-operator-controller-manager-859b6ccc6-w2vmt\" (UID: \"182ca1cb-9499-4cf7-aeae-c35c7038814c\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w2vmt" Dec 03 11:24:37 crc kubenswrapper[4702]: I1203 11:24:37.929874 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-htxmz"] Dec 03 11:24:37 crc kubenswrapper[4702]: I1203 11:24:37.951419 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td7jl\" (UniqueName: \"kubernetes.io/projected/530ef793-9485-4c45-86ba-531906f2085a-kube-api-access-td7jl\") pod \"designate-operator-controller-manager-78b4bc895b-htxmz\" (UID: \"530ef793-9485-4c45-86ba-531906f2085a\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-htxmz" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:37.992237 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:37.994665 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:37.997239 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-6cn5v" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.012441 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.012508 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.021224 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.023367 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.027024 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-w5b84" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.039493 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-98mxd"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.041229 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.046202 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-f48jn" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.046296 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.046418 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.047823 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.049677 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-9llzh" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.053913 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vltk8\" (UniqueName: \"kubernetes.io/projected/62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d-kube-api-access-vltk8\") pod \"glance-operator-controller-manager-77987cd8cd-lp88c\" (UID: \"62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.054140 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td7jl\" (UniqueName: \"kubernetes.io/projected/530ef793-9485-4c45-86ba-531906f2085a-kube-api-access-td7jl\") pod \"designate-operator-controller-manager-78b4bc895b-htxmz\" (UID: \"530ef793-9485-4c45-86ba-531906f2085a\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-htxmz" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.055794 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.077524 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-98mxd"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.090561 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td7jl\" (UniqueName: \"kubernetes.io/projected/530ef793-9485-4c45-86ba-531906f2085a-kube-api-access-td7jl\") pod \"designate-operator-controller-manager-78b4bc895b-htxmz\" (UID: \"530ef793-9485-4c45-86ba-531906f2085a\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-htxmz" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.091027 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.094253 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.095828 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.107408 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.116587 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-bjnb4" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.151236 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-hxvr6"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.154393 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-hxvr6" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.156129 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vltk8\" (UniqueName: \"kubernetes.io/projected/62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d-kube-api-access-vltk8\") pod \"glance-operator-controller-manager-77987cd8cd-lp88c\" (UID: \"62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.158087 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-9s7ks" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.167084 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.183994 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-htxmz" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.245611 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w2vmt" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.262211 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hqhw\" (UniqueName: \"kubernetes.io/projected/a7faac4b-b558-4106-af27-4daf6a1db1af-kube-api-access-5hqhw\") pod \"infra-operator-controller-manager-57548d458d-98mxd\" (UID: \"a7faac4b-b558-4106-af27-4daf6a1db1af\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.262266 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfghx\" (UniqueName: \"kubernetes.io/projected/4b90477f-d1b5-4f03-ab08-2476d44a9cff-kube-api-access-cfghx\") pod \"horizon-operator-controller-manager-68c6d99b8f-kg6p7\" (UID: \"4b90477f-d1b5-4f03-ab08-2476d44a9cff\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.262307 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mhk8\" (UniqueName: \"kubernetes.io/projected/224e5de0-3f58-4243-80e5-212cf016ea46-kube-api-access-4mhk8\") pod \"heat-operator-controller-manager-5f64f6f8bb-4pkkr\" (UID: \"224e5de0-3f58-4243-80e5-212cf016ea46\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.262359 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpthf\" (UniqueName: \"kubernetes.io/projected/1a7e4f08-8a48-44d5-944b-4eaf9d9518b5-kube-api-access-mpthf\") pod \"ironic-operator-controller-manager-6c548fd776-gqqgw\" (UID: \"1a7e4f08-8a48-44d5-944b-4eaf9d9518b5\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.262440 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl47f\" (UniqueName: \"kubernetes.io/projected/9b295e92-630f-4544-b741-50ece5e79f4c-kube-api-access-hl47f\") pod \"keystone-operator-controller-manager-7765d96ddf-hpf6t\" (UID: \"9b295e92-630f-4544-b741-50ece5e79f4c\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.262479 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7faac4b-b558-4106-af27-4daf6a1db1af-cert\") pod \"infra-operator-controller-manager-57548d458d-98mxd\" (UID: \"a7faac4b-b558-4106-af27-4daf6a1db1af\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.262571 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkwnz\" (UniqueName: \"kubernetes.io/projected/e3c1b694-60b8-4b5d-b8d5-40418e60aa4b-kube-api-access-bkwnz\") pod \"manila-operator-controller-manager-7c79b5df47-hxvr6\" (UID: \"e3c1b694-60b8-4b5d-b8d5-40418e60aa4b\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-hxvr6" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.277922 4702 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.280972 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.297245 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-hxvr6"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.301749 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-dxjst" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.312860 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vltk8\" (UniqueName: \"kubernetes.io/projected/62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d-kube-api-access-vltk8\") pod \"glance-operator-controller-manager-77987cd8cd-lp88c\" (UID: \"62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.323103 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.335002 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.349083 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-zbnkm" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.364726 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpthf\" (UniqueName: \"kubernetes.io/projected/1a7e4f08-8a48-44d5-944b-4eaf9d9518b5-kube-api-access-mpthf\") pod \"ironic-operator-controller-manager-6c548fd776-gqqgw\" (UID: \"1a7e4f08-8a48-44d5-944b-4eaf9d9518b5\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.364885 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsnmb\" (UniqueName: \"kubernetes.io/projected/84fc908a-9418-4e6e-ac17-9e725524f9ce-kube-api-access-tsnmb\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-vz7gf\" (UID: \"84fc908a-9418-4e6e-ac17-9e725524f9ce\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.364982 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl47f\" (UniqueName: \"kubernetes.io/projected/9b295e92-630f-4544-b741-50ece5e79f4c-kube-api-access-hl47f\") pod \"keystone-operator-controller-manager-7765d96ddf-hpf6t\" (UID: \"9b295e92-630f-4544-b741-50ece5e79f4c\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.365047 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7faac4b-b558-4106-af27-4daf6a1db1af-cert\") pod \"infra-operator-controller-manager-57548d458d-98mxd\" (UID: \"a7faac4b-b558-4106-af27-4daf6a1db1af\") " 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.365128 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkwnz\" (UniqueName: \"kubernetes.io/projected/e3c1b694-60b8-4b5d-b8d5-40418e60aa4b-kube-api-access-bkwnz\") pod \"manila-operator-controller-manager-7c79b5df47-hxvr6\" (UID: \"e3c1b694-60b8-4b5d-b8d5-40418e60aa4b\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-hxvr6" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.365188 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hqhw\" (UniqueName: \"kubernetes.io/projected/a7faac4b-b558-4106-af27-4daf6a1db1af-kube-api-access-5hqhw\") pod \"infra-operator-controller-manager-57548d458d-98mxd\" (UID: \"a7faac4b-b558-4106-af27-4daf6a1db1af\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.365222 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfghx\" (UniqueName: \"kubernetes.io/projected/4b90477f-d1b5-4f03-ab08-2476d44a9cff-kube-api-access-cfghx\") pod \"horizon-operator-controller-manager-68c6d99b8f-kg6p7\" (UID: \"4b90477f-d1b5-4f03-ab08-2476d44a9cff\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.365270 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-759v8\" (UniqueName: \"kubernetes.io/projected/5cecb29f-7ef9-4177-8e01-a776b70bbb03-kube-api-access-759v8\") pod \"mariadb-operator-controller-manager-56bbcc9d85-2pcqv\" (UID: \"5cecb29f-7ef9-4177-8e01-a776b70bbb03\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.365297 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mhk8\" (UniqueName: \"kubernetes.io/projected/224e5de0-3f58-4243-80e5-212cf016ea46-kube-api-access-4mhk8\") pod \"heat-operator-controller-manager-5f64f6f8bb-4pkkr\" (UID: \"224e5de0-3f58-4243-80e5-212cf016ea46\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr" Dec 03 11:24:38 crc kubenswrapper[4702]: E1203 11:24:38.366153 4702 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 11:24:38 crc kubenswrapper[4702]: E1203 11:24:38.366244 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7faac4b-b558-4106-af27-4daf6a1db1af-cert podName:a7faac4b-b558-4106-af27-4daf6a1db1af nodeName:}" failed. No retries permitted until 2025-12-03 11:24:38.86621022 +0000 UTC m=+1262.702138684 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7faac4b-b558-4106-af27-4daf6a1db1af-cert") pod "infra-operator-controller-manager-57548d458d-98mxd" (UID: "a7faac4b-b558-4106-af27-4daf6a1db1af") : secret "infra-operator-webhook-server-cert" not found Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.371535 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.412565 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkwnz\" (UniqueName: \"kubernetes.io/projected/e3c1b694-60b8-4b5d-b8d5-40418e60aa4b-kube-api-access-bkwnz\") pod \"manila-operator-controller-manager-7c79b5df47-hxvr6\" (UID: \"e3c1b694-60b8-4b5d-b8d5-40418e60aa4b\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-hxvr6" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.459483 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.460870 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.462675 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mhk8\" (UniqueName: \"kubernetes.io/projected/224e5de0-3f58-4243-80e5-212cf016ea46-kube-api-access-4mhk8\") pod \"heat-operator-controller-manager-5f64f6f8bb-4pkkr\" (UID: \"224e5de0-3f58-4243-80e5-212cf016ea46\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.469874 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpthf\" (UniqueName: \"kubernetes.io/projected/1a7e4f08-8a48-44d5-944b-4eaf9d9518b5-kube-api-access-mpthf\") pod \"ironic-operator-controller-manager-6c548fd776-gqqgw\" (UID: \"1a7e4f08-8a48-44d5-944b-4eaf9d9518b5\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.470999 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hqhw\" (UniqueName: \"kubernetes.io/projected/a7faac4b-b558-4106-af27-4daf6a1db1af-kube-api-access-5hqhw\") pod \"infra-operator-controller-manager-57548d458d-98mxd\" (UID: \"a7faac4b-b558-4106-af27-4daf6a1db1af\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.568620 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-759v8\" (UniqueName: \"kubernetes.io/projected/5cecb29f-7ef9-4177-8e01-a776b70bbb03-kube-api-access-759v8\") pod \"mariadb-operator-controller-manager-56bbcc9d85-2pcqv\" (UID: \"5cecb29f-7ef9-4177-8e01-a776b70bbb03\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.568739 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsnmb\" (UniqueName: \"kubernetes.io/projected/84fc908a-9418-4e6e-ac17-9e725524f9ce-kube-api-access-tsnmb\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-vz7gf\" (UID: \"84fc908a-9418-4e6e-ac17-9e725524f9ce\") " 
pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.572533 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.574861 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.609050 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-m2bfb"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.611531 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-m2bfb" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.617951 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.620177 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.624123 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl47f\" (UniqueName: \"kubernetes.io/projected/9b295e92-630f-4544-b741-50ece5e79f4c-kube-api-access-hl47f\") pod \"keystone-operator-controller-manager-7765d96ddf-hpf6t\" (UID: \"9b295e92-630f-4544-b741-50ece5e79f4c\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.627113 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfghx\" (UniqueName: \"kubernetes.io/projected/4b90477f-d1b5-4f03-ab08-2476d44a9cff-kube-api-access-cfghx\") pod \"horizon-operator-controller-manager-68c6d99b8f-kg6p7\" (UID: \"4b90477f-d1b5-4f03-ab08-2476d44a9cff\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.634154 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.644221 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.659110 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-hxvr6" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.663575 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-m2bfb"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.671612 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmf62\" (UniqueName: \"kubernetes.io/projected/523c06cc-9816-4252-ac00-dc7928dae009-kube-api-access-qmf62\") pod \"octavia-operator-controller-manager-998648c74-7xg4t\" (UID: \"523c06cc-9816-4252-ac00-dc7928dae009\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.672030 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fj2tn\" (UniqueName: \"kubernetes.io/projected/5e7b4134-2b34-4b36-ad61-8e681df197df-kube-api-access-fj2tn\") pod \"nova-operator-controller-manager-697bc559fc-m2bfb\" (UID: \"5e7b4134-2b34-4b36-ad61-8e681df197df\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-m2bfb" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.790871 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.799522 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmf62\" (UniqueName: \"kubernetes.io/projected/523c06cc-9816-4252-ac00-dc7928dae009-kube-api-access-qmf62\") pod \"octavia-operator-controller-manager-998648c74-7xg4t\" (UID: \"523c06cc-9816-4252-ac00-dc7928dae009\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.799908 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fj2tn\" (UniqueName: \"kubernetes.io/projected/5e7b4134-2b34-4b36-ad61-8e681df197df-kube-api-access-fj2tn\") pod \"nova-operator-controller-manager-697bc559fc-m2bfb\" (UID: \"5e7b4134-2b34-4b36-ad61-8e681df197df\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-m2bfb" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.852252 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.855126 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.861269 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.863531 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.875542 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.888538 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-ntzds"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.890769 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.903488 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx6vj\" (UniqueName: \"kubernetes.io/projected/1d60d4ab-7bac-4fd1-9aad-c07ba1513d41-kube-api-access-lx6vj\") pod \"ovn-operator-controller-manager-b6456fdb6-t27c4\" (UID: \"1d60d4ab-7bac-4fd1-9aad-c07ba1513d41\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.903709 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7faac4b-b558-4106-af27-4daf6a1db1af-cert\") pod \"infra-operator-controller-manager-57548d458d-98mxd\" (UID: \"a7faac4b-b558-4106-af27-4daf6a1db1af\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.903850 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt5sx\" (UniqueName: \"kubernetes.io/projected/1d86df9d-86a7-4980-abd0-488d98f6b2fb-kube-api-access-vt5sx\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9\" (UID: \"1d86df9d-86a7-4980-abd0-488d98f6b2fb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.903922 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd2zz\" (UniqueName: \"kubernetes.io/projected/8de75640-5551-4d04-830d-64f0fbb7847a-kube-api-access-dd2zz\") pod \"placement-operator-controller-manager-78f8948974-ntzds\" (UID: \"8de75640-5551-4d04-830d-64f0fbb7847a\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds" Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.903991 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d86df9d-86a7-4980-abd0-488d98f6b2fb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9\" (UID: \"1d86df9d-86a7-4980-abd0-488d98f6b2fb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" Dec 03 11:24:38 crc kubenswrapper[4702]: E1203 11:24:38.904295 4702 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 11:24:38 crc kubenswrapper[4702]: E1203 11:24:38.904427 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7faac4b-b558-4106-af27-4daf6a1db1af-cert podName:a7faac4b-b558-4106-af27-4daf6a1db1af nodeName:}" failed. 
No retries permitted until 2025-12-03 11:24:39.904406383 +0000 UTC m=+1263.740334917 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7faac4b-b558-4106-af27-4daf6a1db1af-cert") pod "infra-operator-controller-manager-57548d458d-98mxd" (UID: "a7faac4b-b558-4106-af27-4daf6a1db1af") : secret "infra-operator-webhook-server-cert" not found Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.907025 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp"] Dec 03 11:24:38 crc kubenswrapper[4702]: I1203 11:24:38.908805 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp" Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.006266 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx6vj\" (UniqueName: \"kubernetes.io/projected/1d60d4ab-7bac-4fd1-9aad-c07ba1513d41-kube-api-access-lx6vj\") pod \"ovn-operator-controller-manager-b6456fdb6-t27c4\" (UID: \"1d60d4ab-7bac-4fd1-9aad-c07ba1513d41\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.006626 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt5sx\" (UniqueName: \"kubernetes.io/projected/1d86df9d-86a7-4980-abd0-488d98f6b2fb-kube-api-access-vt5sx\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9\" (UID: \"1d86df9d-86a7-4980-abd0-488d98f6b2fb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.006667 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd2zz\" (UniqueName: \"kubernetes.io/projected/8de75640-5551-4d04-830d-64f0fbb7847a-kube-api-access-dd2zz\") pod \"placement-operator-controller-manager-78f8948974-ntzds\" (UID: \"8de75640-5551-4d04-830d-64f0fbb7847a\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds" Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.006723 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d86df9d-86a7-4980-abd0-488d98f6b2fb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9\" (UID: \"1d86df9d-86a7-4980-abd0-488d98f6b2fb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.006880 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vds2r\" (UniqueName: \"kubernetes.io/projected/b6faaca6-f017-42ac-95e4-d73ae3e8e519-kube-api-access-vds2r\") pod \"swift-operator-controller-manager-5f8c65bbfc-psnhp\" (UID: \"b6faaca6-f017-42ac-95e4-d73ae3e8e519\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp" Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.132375 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-7vkvv" Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.133020 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 03 11:24:39 crc 
kubenswrapper[4702]: I1203 11:24:39.133159 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-d8nvw"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.133300 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-gqrsk"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.133418 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-25942"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.140479 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vds2r\" (UniqueName: \"kubernetes.io/projected/b6faaca6-f017-42ac-95e4-d73ae3e8e519-kube-api-access-vds2r\") pod \"swift-operator-controller-manager-5f8c65bbfc-psnhp\" (UID: \"b6faaca6-f017-42ac-95e4-d73ae3e8e519\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp"
Dec 03 11:24:39 crc kubenswrapper[4702]: E1203 11:24:39.150239 4702 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 03 11:24:39 crc kubenswrapper[4702]: E1203 11:24:39.150347 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d86df9d-86a7-4980-abd0-488d98f6b2fb-cert podName:1d86df9d-86a7-4980-abd0-488d98f6b2fb nodeName:}" failed. No retries permitted until 2025-12-03 11:24:39.650322393 +0000 UTC m=+1263.486250867 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d86df9d-86a7-4980-abd0-488d98f6b2fb-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" (UID: "1d86df9d-86a7-4980-abd0-488d98f6b2fb") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.154254 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-2b4b7"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.154463 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-6cllk"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.352013 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fj2tn\" (UniqueName: \"kubernetes.io/projected/5e7b4134-2b34-4b36-ad61-8e681df197df-kube-api-access-fj2tn\") pod \"nova-operator-controller-manager-697bc559fc-m2bfb\" (UID: \"5e7b4134-2b34-4b36-ad61-8e681df197df\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-m2bfb"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.372582 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-ntzds"]
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.372633 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9"]
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.393689 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd2zz\" (UniqueName: \"kubernetes.io/projected/8de75640-5551-4d04-830d-64f0fbb7847a-kube-api-access-dd2zz\") pod \"placement-operator-controller-manager-78f8948974-ntzds\" (UID: \"8de75640-5551-4d04-830d-64f0fbb7847a\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.395611 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt5sx\" (UniqueName: \"kubernetes.io/projected/1d86df9d-86a7-4980-abd0-488d98f6b2fb-kube-api-access-vt5sx\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9\" (UID: \"1d86df9d-86a7-4980-abd0-488d98f6b2fb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.407160 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmf62\" (UniqueName: \"kubernetes.io/projected/523c06cc-9816-4252-ac00-dc7928dae009-kube-api-access-qmf62\") pod \"octavia-operator-controller-manager-998648c74-7xg4t\" (UID: \"523c06cc-9816-4252-ac00-dc7928dae009\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.408642 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx6vj\" (UniqueName: \"kubernetes.io/projected/1d60d4ab-7bac-4fd1-9aad-c07ba1513d41-kube-api-access-lx6vj\") pod \"ovn-operator-controller-manager-b6456fdb6-t27c4\" (UID: \"1d60d4ab-7bac-4fd1-9aad-c07ba1513d41\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.416453 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-759v8\" (UniqueName: \"kubernetes.io/projected/5cecb29f-7ef9-4177-8e01-a776b70bbb03-kube-api-access-759v8\") pod \"mariadb-operator-controller-manager-56bbcc9d85-2pcqv\" (UID: \"5cecb29f-7ef9-4177-8e01-a776b70bbb03\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.435469 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp"]
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.495036 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vds2r\" (UniqueName: \"kubernetes.io/projected/b6faaca6-f017-42ac-95e4-d73ae3e8e519-kube-api-access-vds2r\") pod \"swift-operator-controller-manager-5f8c65bbfc-psnhp\" (UID: \"b6faaca6-f017-42ac-95e4-d73ae3e8e519\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.505747 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-m2bfb"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.506591 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.508156 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.523553 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n"]
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.524646 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsnmb\" (UniqueName: \"kubernetes.io/projected/84fc908a-9418-4e6e-ac17-9e725524f9ce-kube-api-access-tsnmb\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-vz7gf\" (UID: \"84fc908a-9418-4e6e-ac17-9e725524f9ce\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.526179 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.548318 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-z82dm"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.556414 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.572224 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.669837 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n"]
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.694789 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn"]
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.695506 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.696415 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.699152 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq"]
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.700417 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.704710 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-9mhjt"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.710581 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbmhw\" (UniqueName: \"kubernetes.io/projected/5edf270b-74cb-42d2-82dc-7953f243c6dc-kube-api-access-nbmhw\") pod \"telemetry-operator-controller-manager-f8bdcbf7f-4tp6n\" (UID: \"5edf270b-74cb-42d2-82dc-7953f243c6dc\") " pod="openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.710634 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d86df9d-86a7-4980-abd0-488d98f6b2fb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9\" (UID: \"1d86df9d-86a7-4980-abd0-488d98f6b2fb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9"
Dec 03 11:24:39 crc kubenswrapper[4702]: E1203 11:24:39.710875 4702 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 03 11:24:39 crc kubenswrapper[4702]: E1203 11:24:39.710931 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d86df9d-86a7-4980-abd0-488d98f6b2fb-cert podName:1d86df9d-86a7-4980-abd0-488d98f6b2fb nodeName:}" failed. No retries permitted until 2025-12-03 11:24:40.710913667 +0000 UTC m=+1264.546842131 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d86df9d-86a7-4980-abd0-488d98f6b2fb-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" (UID: "1d86df9d-86a7-4980-abd0-488d98f6b2fb") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.717467 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-qpw6z"
Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.726216 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf"
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf" Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.814568 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2qvk\" (UniqueName: \"kubernetes.io/projected/ae6dac10-29ba-4bb8-8a0c-68a2bad519af-kube-api-access-f2qvk\") pod \"watcher-operator-controller-manager-769dc69bc-xlpkq\" (UID: \"ae6dac10-29ba-4bb8-8a0c-68a2bad519af\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq" Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.814718 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgcth\" (UniqueName: \"kubernetes.io/projected/afc37ae6-c944-4cb1-81b6-c810ea1c3b31-kube-api-access-vgcth\") pod \"test-operator-controller-manager-5854674fcc-nj4tn\" (UID: \"afc37ae6-c944-4cb1-81b6-c810ea1c3b31\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn" Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.814786 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbmhw\" (UniqueName: \"kubernetes.io/projected/5edf270b-74cb-42d2-82dc-7953f243c6dc-kube-api-access-nbmhw\") pod \"telemetry-operator-controller-manager-f8bdcbf7f-4tp6n\" (UID: \"5edf270b-74cb-42d2-82dc-7953f243c6dc\") " pod="openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n" Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.819991 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq"] Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.915263 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbmhw\" (UniqueName: \"kubernetes.io/projected/5edf270b-74cb-42d2-82dc-7953f243c6dc-kube-api-access-nbmhw\") pod \"telemetry-operator-controller-manager-f8bdcbf7f-4tp6n\" (UID: \"5edf270b-74cb-42d2-82dc-7953f243c6dc\") " pod="openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n" Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.929093 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7faac4b-b558-4106-af27-4daf6a1db1af-cert\") pod \"infra-operator-controller-manager-57548d458d-98mxd\" (UID: \"a7faac4b-b558-4106-af27-4daf6a1db1af\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.929159 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgcth\" (UniqueName: \"kubernetes.io/projected/afc37ae6-c944-4cb1-81b6-c810ea1c3b31-kube-api-access-vgcth\") pod \"test-operator-controller-manager-5854674fcc-nj4tn\" (UID: \"afc37ae6-c944-4cb1-81b6-c810ea1c3b31\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn" Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.929299 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2qvk\" (UniqueName: \"kubernetes.io/projected/ae6dac10-29ba-4bb8-8a0c-68a2bad519af-kube-api-access-f2qvk\") pod \"watcher-operator-controller-manager-769dc69bc-xlpkq\" (UID: \"ae6dac10-29ba-4bb8-8a0c-68a2bad519af\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq" Dec 03 11:24:39 crc kubenswrapper[4702]: E1203 
11:24:39.929856 4702 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 11:24:39 crc kubenswrapper[4702]: E1203 11:24:39.929909 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7faac4b-b558-4106-af27-4daf6a1db1af-cert podName:a7faac4b-b558-4106-af27-4daf6a1db1af nodeName:}" failed. No retries permitted until 2025-12-03 11:24:41.929891427 +0000 UTC m=+1265.765819891 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7faac4b-b558-4106-af27-4daf6a1db1af-cert") pod "infra-operator-controller-manager-57548d458d-98mxd" (UID: "a7faac4b-b558-4106-af27-4daf6a1db1af") : secret "infra-operator-webhook-server-cert" not found Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.933094 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n" Dec 03 11:24:39 crc kubenswrapper[4702]: I1203 11:24:39.995880 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn"] Dec 03 11:24:40 crc kubenswrapper[4702]: I1203 11:24:40.018616 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgcth\" (UniqueName: \"kubernetes.io/projected/afc37ae6-c944-4cb1-81b6-c810ea1c3b31-kube-api-access-vgcth\") pod \"test-operator-controller-manager-5854674fcc-nj4tn\" (UID: \"afc37ae6-c944-4cb1-81b6-c810ea1c3b31\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn" Dec 03 11:24:40 crc kubenswrapper[4702]: I1203 11:24:40.029126 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2qvk\" (UniqueName: \"kubernetes.io/projected/ae6dac10-29ba-4bb8-8a0c-68a2bad519af-kube-api-access-f2qvk\") pod \"watcher-operator-controller-manager-769dc69bc-xlpkq\" (UID: \"ae6dac10-29ba-4bb8-8a0c-68a2bad519af\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq" Dec 03 11:24:40 crc kubenswrapper[4702]: I1203 11:24:40.082319 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn" Dec 03 11:24:40 crc kubenswrapper[4702]: I1203 11:24:40.127094 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq" Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.300711 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7faac4b-b558-4106-af27-4daf6a1db1af-cert\") pod \"infra-operator-controller-manager-57548d458d-98mxd\" (UID: \"a7faac4b-b558-4106-af27-4daf6a1db1af\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.301266 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d86df9d-86a7-4980-abd0-488d98f6b2fb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9\" (UID: \"1d86df9d-86a7-4980-abd0-488d98f6b2fb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" Dec 03 11:24:42 crc kubenswrapper[4702]: E1203 11:24:42.315749 4702 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 11:24:42 crc kubenswrapper[4702]: E1203 11:24:42.315865 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7faac4b-b558-4106-af27-4daf6a1db1af-cert podName:a7faac4b-b558-4106-af27-4daf6a1db1af nodeName:}" failed. No retries permitted until 2025-12-03 11:24:46.315834655 +0000 UTC m=+1270.151763129 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7faac4b-b558-4106-af27-4daf6a1db1af-cert") pod "infra-operator-controller-manager-57548d458d-98mxd" (UID: "a7faac4b-b558-4106-af27-4daf6a1db1af") : secret "infra-operator-webhook-server-cert" not found Dec 03 11:24:42 crc kubenswrapper[4702]: E1203 11:24:42.316484 4702 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 11:24:42 crc kubenswrapper[4702]: E1203 11:24:42.316531 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d86df9d-86a7-4980-abd0-488d98f6b2fb-cert podName:1d86df9d-86a7-4980-abd0-488d98f6b2fb nodeName:}" failed. No retries permitted until 2025-12-03 11:24:44.316520135 +0000 UTC m=+1268.152448599 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d86df9d-86a7-4980-abd0-488d98f6b2fb-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" (UID: "1d86df9d-86a7-4980-abd0-488d98f6b2fb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.328649 4702 patch_prober.go:28] interesting pod/logging-loki-querier-5895d59bb8-bhqrp container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.328791 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" podUID="042cc406-7960-493a-a19a-cb5590f8ff1f" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.54:3101/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.441401 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c"] Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.445228 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.448880 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.449840 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.450381 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-7jr2c" Dec 03 11:24:42 crc kubenswrapper[4702]: W1203 11:24:42.451505 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a7e4f08_8a48_44d5_944b_4eaf9d9518b5.slice/crio-27a31e306caac0f48475d5541e0630fa33b122ebec93dfb9fa7f928483891f81 WatchSource:0}: Error finding container 27a31e306caac0f48475d5541e0630fa33b122ebec93dfb9fa7f928483891f81: Status 404 returned error can't find the container with id 27a31e306caac0f48475d5541e0630fa33b122ebec93dfb9fa7f928483891f81 Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.455314 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-htxmz" event={"ID":"530ef793-9485-4c45-86ba-531906f2085a","Type":"ContainerStarted","Data":"9a714076feeea318216731722b0290fe7382669a48e3ff59c62da67480c39e06"} Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.480005 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c"] Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.482521 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t" event={"ID":"9b295e92-630f-4544-b741-50ece5e79f4c","Type":"ContainerStarted","Data":"a2fc0f386a4778969ee481c9cecfd22ed7f84cf6c7bfe93430d2223128e44829"} Dec 03 11:24:42 crc kubenswrapper[4702]: W1203 11:24:42.484133 4702 
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.492541 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w2vmt" event={"ID":"182ca1cb-9499-4cf7-aeae-c35c7038814c","Type":"ContainerStarted","Data":"dfbeecc5b1b0f1d1844421036e41def90185490ee4f42eda5483535998155d17"}
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.515173 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg" event={"ID":"8f6320ff-4661-46be-80e1-8d97f09fe789","Type":"ContainerStarted","Data":"10ae0dcbb7ff6f830490fb2363fda3d9ba475dfb1686810f481fe617d208c7b8"}
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.523332 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-hxvr6" event={"ID":"e3c1b694-60b8-4b5d-b8d5-40418e60aa4b","Type":"ContainerStarted","Data":"09428842f45af7630a4ad119ec70fe5afe03f32056aa16996b2a8e35142a2c2d"}
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.558076 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ckjgv"]
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.568169 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ckjgv"
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.571215 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-pdh69"
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.584074 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ckjgv"]
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.604919 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg"]
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.610478 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-htxmz"]
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.611834 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-metrics-certs\") pod \"openstack-operator-controller-manager-c98f8bd-8mv9c\" (UID: \"b877c7a7-0b88-4238-8a21-314ef1525996\") " pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c"
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.611931 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f29br\" (UniqueName: \"kubernetes.io/projected/b877c7a7-0b88-4238-8a21-314ef1525996-kube-api-access-f29br\") pod \"openstack-operator-controller-manager-c98f8bd-8mv9c\" (UID: \"b877c7a7-0b88-4238-8a21-314ef1525996\") " pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c"
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.611968 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-webhook-certs\") pod \"openstack-operator-controller-manager-c98f8bd-8mv9c\" (UID: \"b877c7a7-0b88-4238-8a21-314ef1525996\") " pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c"
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.618729 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-w2vmt"]
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.627933 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t"]
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.637097 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-hxvr6"]
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.647917 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7"]
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.657057 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr"]
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.667108 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw"]
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.678341 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-m2bfb"]
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.687851 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c"]
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.693160 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv"]
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.713879 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-webhook-certs\") pod \"openstack-operator-controller-manager-c98f8bd-8mv9c\" (UID: \"b877c7a7-0b88-4238-8a21-314ef1525996\") " pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c"
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.714033 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tfnb\" (UniqueName: \"kubernetes.io/projected/c43c86a0-692f-406f-871a-24a14f24ed77-kube-api-access-4tfnb\") pod \"rabbitmq-cluster-operator-manager-668c99d594-ckjgv\" (UID: \"c43c86a0-692f-406f-871a-24a14f24ed77\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ckjgv"
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.714101 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-metrics-certs\") pod \"openstack-operator-controller-manager-c98f8bd-8mv9c\" (UID: \"b877c7a7-0b88-4238-8a21-314ef1525996\") " pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c"
Dec 03 11:24:42 crc kubenswrapper[4702]: E1203 11:24:42.714147 4702 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.714170 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f29br\" (UniqueName: \"kubernetes.io/projected/b877c7a7-0b88-4238-8a21-314ef1525996-kube-api-access-f29br\") pod \"openstack-operator-controller-manager-c98f8bd-8mv9c\" (UID: \"b877c7a7-0b88-4238-8a21-314ef1525996\") " pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c"
Dec 03 11:24:42 crc kubenswrapper[4702]: E1203 11:24:42.714238 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-webhook-certs podName:b877c7a7-0b88-4238-8a21-314ef1525996 nodeName:}" failed. No retries permitted until 2025-12-03 11:24:43.214213755 +0000 UTC m=+1267.050142289 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-webhook-certs") pod "openstack-operator-controller-manager-c98f8bd-8mv9c" (UID: "b877c7a7-0b88-4238-8a21-314ef1525996") : secret "webhook-server-cert" not found
Dec 03 11:24:42 crc kubenswrapper[4702]: E1203 11:24:42.714495 4702 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 03 11:24:42 crc kubenswrapper[4702]: E1203 11:24:42.714549 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-metrics-certs podName:b877c7a7-0b88-4238-8a21-314ef1525996 nodeName:}" failed. No retries permitted until 2025-12-03 11:24:43.214530334 +0000 UTC m=+1267.050458888 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-metrics-certs") pod "openstack-operator-controller-manager-c98f8bd-8mv9c" (UID: "b877c7a7-0b88-4238-8a21-314ef1525996") : secret "metrics-server-cert" not found
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.738215 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f29br\" (UniqueName: \"kubernetes.io/projected/b877c7a7-0b88-4238-8a21-314ef1525996-kube-api-access-f29br\") pod \"openstack-operator-controller-manager-c98f8bd-8mv9c\" (UID: \"b877c7a7-0b88-4238-8a21-314ef1525996\") " pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c"
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.816502 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tfnb\" (UniqueName: \"kubernetes.io/projected/c43c86a0-692f-406f-871a-24a14f24ed77-kube-api-access-4tfnb\") pod \"rabbitmq-cluster-operator-manager-668c99d594-ckjgv\" (UID: \"c43c86a0-692f-406f-871a-24a14f24ed77\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ckjgv"
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.849721 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tfnb\" (UniqueName: \"kubernetes.io/projected/c43c86a0-692f-406f-871a-24a14f24ed77-kube-api-access-4tfnb\") pod \"rabbitmq-cluster-operator-manager-668c99d594-ckjgv\" (UID: \"c43c86a0-692f-406f-871a-24a14f24ed77\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ckjgv"
Dec 03 11:24:42 crc kubenswrapper[4702]: I1203 11:24:42.897528 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ckjgv"
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ckjgv" Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.013418 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t"] Dec 03 11:24:43 crc kubenswrapper[4702]: W1203 11:24:43.026225 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod523c06cc_9816_4252_ac00_dc7928dae009.slice/crio-11d737678f5b1b04f41a156726e92195a956a15dc7fd4a2bcc8bc6d951689904 WatchSource:0}: Error finding container 11d737678f5b1b04f41a156726e92195a956a15dc7fd4a2bcc8bc6d951689904: Status 404 returned error can't find the container with id 11d737678f5b1b04f41a156726e92195a956a15dc7fd4a2bcc8bc6d951689904 Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.034328 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp"] Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.097836 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n"] Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.123881 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf"] Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.137804 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq"] Dec 03 11:24:43 crc kubenswrapper[4702]: W1203 11:24:43.148592 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84fc908a_9418_4e6e_ac17_9e725524f9ce.slice/crio-6e08fe658f75d8ab2a7b2f15c905b80fb99f72640995121a415fbd32a3505851 WatchSource:0}: Error finding container 6e08fe658f75d8ab2a7b2f15c905b80fb99f72640995121a415fbd32a3505851: Status 404 returned error can't find the container with id 6e08fe658f75d8ab2a7b2f15c905b80fb99f72640995121a415fbd32a3505851 Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.181004 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4"] Dec 03 11:24:43 crc kubenswrapper[4702]: W1203 11:24:43.200893 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6faaca6_f017_42ac_95e4_d73ae3e8e519.slice/crio-15bdba6045f0cfe5390b74b7574f0819de3754748eb1e595b74c18fe74bb800f WatchSource:0}: Error finding container 15bdba6045f0cfe5390b74b7574f0819de3754748eb1e595b74c18fe74bb800f: Status 404 returned error can't find the container with id 15bdba6045f0cfe5390b74b7574f0819de3754748eb1e595b74c18fe74bb800f Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.217792 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn"] Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.227964 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-metrics-certs\") pod \"openstack-operator-controller-manager-c98f8bd-8mv9c\" (UID: \"b877c7a7-0b88-4238-8a21-314ef1525996\") " pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" Dec 03 11:24:43 crc 
kubenswrapper[4702]: I1203 11:24:43.228105 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-webhook-certs\") pod \"openstack-operator-controller-manager-c98f8bd-8mv9c\" (UID: \"b877c7a7-0b88-4238-8a21-314ef1525996\") " pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" Dec 03 11:24:43 crc kubenswrapper[4702]: E1203 11:24:43.228318 4702 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 11:24:43 crc kubenswrapper[4702]: E1203 11:24:43.228364 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-metrics-certs podName:b877c7a7-0b88-4238-8a21-314ef1525996 nodeName:}" failed. No retries permitted until 2025-12-03 11:24:44.228349686 +0000 UTC m=+1268.064278150 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-metrics-certs") pod "openstack-operator-controller-manager-c98f8bd-8mv9c" (UID: "b877c7a7-0b88-4238-8a21-314ef1525996") : secret "metrics-server-cert" not found Dec 03 11:24:43 crc kubenswrapper[4702]: E1203 11:24:43.228319 4702 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 11:24:43 crc kubenswrapper[4702]: E1203 11:24:43.228395 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-webhook-certs podName:b877c7a7-0b88-4238-8a21-314ef1525996 nodeName:}" failed. No retries permitted until 2025-12-03 11:24:44.228389357 +0000 UTC m=+1268.064317821 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-webhook-certs") pod "openstack-operator-controller-manager-c98f8bd-8mv9c" (UID: "b877c7a7-0b88-4238-8a21-314ef1525996") : secret "webhook-server-cert" not found Dec 03 11:24:43 crc kubenswrapper[4702]: E1203 11:24:43.228312 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f2qvk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-xlpkq_openstack-operators(ae6dac10-29ba-4bb8-8a0c-68a2bad519af): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 11:24:43 crc kubenswrapper[4702]: E1203 11:24:43.239566 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f2qvk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-xlpkq_openstack-operators(ae6dac10-29ba-4bb8-8a0c-68a2bad519af): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.240402 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-ntzds"] Dec 03 11:24:43 crc kubenswrapper[4702]: E1203 11:24:43.241234 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq" podUID="ae6dac10-29ba-4bb8-8a0c-68a2bad519af" Dec 03 11:24:43 crc kubenswrapper[4702]: E1203 11:24:43.244137 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lx6vj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-t27c4_openstack-operators(1d60d4ab-7bac-4fd1-9aad-c07ba1513d41): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 11:24:43 crc kubenswrapper[4702]: E1203 11:24:43.255994 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lx6vj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-t27c4_openstack-operators(1d60d4ab-7bac-4fd1-9aad-c07ba1513d41): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 11:24:43 crc kubenswrapper[4702]: E1203 11:24:43.257427 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" podUID="1d60d4ab-7bac-4fd1-9aad-c07ba1513d41" Dec 03 11:24:43 crc kubenswrapper[4702]: E1203 11:24:43.260920 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dd2zz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-ntzds_openstack-operators(8de75640-5551-4d04-830d-64f0fbb7847a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 11:24:43 crc kubenswrapper[4702]: E1203 11:24:43.269010 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vgcth,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-nj4tn_openstack-operators(afc37ae6-c944-4cb1-81b6-c810ea1c3b31): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 11:24:43 crc kubenswrapper[4702]: E1203 11:24:43.269114 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dd2zz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-ntzds_openstack-operators(8de75640-5551-4d04-830d-64f0fbb7847a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 11:24:43 crc kubenswrapper[4702]: E1203 11:24:43.273610 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds" podUID="8de75640-5551-4d04-830d-64f0fbb7847a" Dec 03 11:24:43 crc kubenswrapper[4702]: E1203 11:24:43.279364 4702 kuberuntime_manager.go:1274] "Unhandled 
Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vgcth,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-nj4tn_openstack-operators(afc37ae6-c944-4cb1-81b6-c810ea1c3b31): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 11:24:43 crc kubenswrapper[4702]: E1203 11:24:43.285123 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn" podUID="afc37ae6-c944-4cb1-81b6-c810ea1c3b31" Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.552102 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds" event={"ID":"8de75640-5551-4d04-830d-64f0fbb7847a","Type":"ContainerStarted","Data":"969f90800c1e902c05007fe4dd0461512b51ed7898f656f1b2273cfbf6220f0d"} Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.554678 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn" event={"ID":"afc37ae6-c944-4cb1-81b6-c810ea1c3b31","Type":"ContainerStarted","Data":"2abddb6affe45c05763ec57655b59fd25e7ece77b4b919d25774fd29513e4e4b"} Dec 03 11:24:43 crc kubenswrapper[4702]: E1203 11:24:43.559175 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds" podUID="8de75640-5551-4d04-830d-64f0fbb7847a" Dec 03 11:24:43 crc kubenswrapper[4702]: E1203 11:24:43.564315 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" 
Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.571299 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7" event={"ID":"4b90477f-d1b5-4f03-ab08-2476d44a9cff","Type":"ContainerStarted","Data":"89db01b2d73aadbbc2df8668c8b9f8667c279e4183d8870fcc371c296e5ecb7c"}
Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.600291 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n" event={"ID":"5edf270b-74cb-42d2-82dc-7953f243c6dc","Type":"ContainerStarted","Data":"c06f09792c83997ca08b3f5bb3fa279f110f50b9e244b4c98c72d6a0bdb03c5d"}
Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.606955 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" event={"ID":"1d60d4ab-7bac-4fd1-9aad-c07ba1513d41","Type":"ContainerStarted","Data":"ca583fc366a9366838832da566c9367396aed954c9c8266a0de07841de21471f"}
Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.614540 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv" event={"ID":"5cecb29f-7ef9-4177-8e01-a776b70bbb03","Type":"ContainerStarted","Data":"47052a030248580d852cbfdc5361f11719088cc49d3cc97e383cff4a0c668784"}
Dec 03 11:24:43 crc kubenswrapper[4702]: E1203 11:24:43.628734 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" podUID="1d60d4ab-7bac-4fd1-9aad-c07ba1513d41"
Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.628736 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c" event={"ID":"62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d","Type":"ContainerStarted","Data":"73b165e686635ad1bea622d106ce77f8e7a2be654bd828125a022e4272774a63"}
Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.641051 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t" event={"ID":"523c06cc-9816-4252-ac00-dc7928dae009","Type":"ContainerStarted","Data":"11d737678f5b1b04f41a156726e92195a956a15dc7fd4a2bcc8bc6d951689904"}
Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.648533 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-m2bfb" event={"ID":"5e7b4134-2b34-4b36-ad61-8e681df197df","Type":"ContainerStarted","Data":"89f5518ca92e3461adb97c51fe90a85ac7eca3a63c7a86a49d08bbbc6a26c766"}
Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.658824 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq" event={"ID":"ae6dac10-29ba-4bb8-8a0c-68a2bad519af","Type":"ContainerStarted","Data":"60a7265dafb2b9979f408b964df90a9dd3e71f9823e2ba1f4f529c18a17feb6a"}
Dec 03 11:24:43 crc kubenswrapper[4702]: E1203 11:24:43.665783 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq" podUID="ae6dac10-29ba-4bb8-8a0c-68a2bad519af"
Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.667612 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw" event={"ID":"1a7e4f08-8a48-44d5-944b-4eaf9d9518b5","Type":"ContainerStarted","Data":"27a31e306caac0f48475d5541e0630fa33b122ebec93dfb9fa7f928483891f81"}
Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.682473 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp" event={"ID":"b6faaca6-f017-42ac-95e4-d73ae3e8e519","Type":"ContainerStarted","Data":"15bdba6045f0cfe5390b74b7574f0819de3754748eb1e595b74c18fe74bb800f"}
Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.688982 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf" event={"ID":"84fc908a-9418-4e6e-ac17-9e725524f9ce","Type":"ContainerStarted","Data":"6e08fe658f75d8ab2a7b2f15c905b80fb99f72640995121a415fbd32a3505851"}
Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.701283 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr" event={"ID":"224e5de0-3f58-4243-80e5-212cf016ea46","Type":"ContainerStarted","Data":"6a68a755aaf8c94990d66326f5293f0ff47d477c0516556246511a1905bf36ec"}
Dec 03 11:24:43 crc kubenswrapper[4702]: I1203 11:24:43.756339 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ckjgv"]
Dec 03 11:24:43 crc kubenswrapper[4702]: W1203 11:24:43.773782 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc43c86a0_692f_406f_871a_24a14f24ed77.slice/crio-9aefe7188be3b31b235aec167dd6ef6476a56c6b6bda7c494133a07d2a5dcc76 WatchSource:0}: Error finding container 9aefe7188be3b31b235aec167dd6ef6476a56c6b6bda7c494133a07d2a5dcc76: Status 404 returned error can't find the container with id 9aefe7188be3b31b235aec167dd6ef6476a56c6b6bda7c494133a07d2a5dcc76
Dec 03 11:24:44 crc kubenswrapper[4702]: I1203 11:24:44.253914 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-webhook-certs\") pod \"openstack-operator-controller-manager-c98f8bd-8mv9c\" (UID: \"b877c7a7-0b88-4238-8a21-314ef1525996\") " pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c"
Dec 03 11:24:44 crc kubenswrapper[4702]: I1203 11:24:44.254113 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-metrics-certs\") pod \"openstack-operator-controller-manager-c98f8bd-8mv9c\" (UID: \"b877c7a7-0b88-4238-8a21-314ef1525996\") " pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c"
Dec 03 11:24:44 crc kubenswrapper[4702]: E1203 11:24:44.254168 4702 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 03 11:24:44 crc kubenswrapper[4702]: E1203 11:24:44.254261 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-webhook-certs podName:b877c7a7-0b88-4238-8a21-314ef1525996 nodeName:}" failed. No retries permitted until 2025-12-03 11:24:46.254243693 +0000 UTC m=+1270.090172147 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-webhook-certs") pod "openstack-operator-controller-manager-c98f8bd-8mv9c" (UID: "b877c7a7-0b88-4238-8a21-314ef1525996") : secret "webhook-server-cert" not found
Dec 03 11:24:44 crc kubenswrapper[4702]: E1203 11:24:44.254380 4702 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Dec 03 11:24:44 crc kubenswrapper[4702]: E1203 11:24:44.254453 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-metrics-certs podName:b877c7a7-0b88-4238-8a21-314ef1525996 nodeName:}" failed. No retries permitted until 2025-12-03 11:24:46.254432388 +0000 UTC m=+1270.090360852 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-metrics-certs") pod "openstack-operator-controller-manager-c98f8bd-8mv9c" (UID: "b877c7a7-0b88-4238-8a21-314ef1525996") : secret "metrics-server-cert" not found
Dec 03 11:24:44 crc kubenswrapper[4702]: I1203 11:24:44.357558 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d86df9d-86a7-4980-abd0-488d98f6b2fb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9\" (UID: \"1d86df9d-86a7-4980-abd0-488d98f6b2fb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9"
Dec 03 11:24:44 crc kubenswrapper[4702]: E1203 11:24:44.358229 4702 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 03 11:24:44 crc kubenswrapper[4702]: E1203 11:24:44.358409 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d86df9d-86a7-4980-abd0-488d98f6b2fb-cert podName:1d86df9d-86a7-4980-abd0-488d98f6b2fb nodeName:}" failed. No retries permitted until 2025-12-03 11:24:48.358387387 +0000 UTC m=+1272.194315851 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d86df9d-86a7-4980-abd0-488d98f6b2fb-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" (UID: "1d86df9d-86a7-4980-abd0-488d98f6b2fb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 11:24:44 crc kubenswrapper[4702]: I1203 11:24:44.749088 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ckjgv" event={"ID":"c43c86a0-692f-406f-871a-24a14f24ed77","Type":"ContainerStarted","Data":"9aefe7188be3b31b235aec167dd6ef6476a56c6b6bda7c494133a07d2a5dcc76"} Dec 03 11:24:44 crc kubenswrapper[4702]: E1203 11:24:44.753379 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq" podUID="ae6dac10-29ba-4bb8-8a0c-68a2bad519af" Dec 03 11:24:44 crc kubenswrapper[4702]: E1203 11:24:44.756515 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" podUID="1d60d4ab-7bac-4fd1-9aad-c07ba1513d41" Dec 03 11:24:44 crc kubenswrapper[4702]: E1203 11:24:44.761478 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds" podUID="8de75640-5551-4d04-830d-64f0fbb7847a" Dec 03 11:24:44 crc kubenswrapper[4702]: E1203 11:24:44.762343 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn" podUID="afc37ae6-c944-4cb1-81b6-c810ea1c3b31" Dec 03 11:24:46 crc kubenswrapper[4702]: I1203 11:24:46.342814 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7faac4b-b558-4106-af27-4daf6a1db1af-cert\") pod \"infra-operator-controller-manager-57548d458d-98mxd\" (UID: 
\"a7faac4b-b558-4106-af27-4daf6a1db1af\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" Dec 03 11:24:46 crc kubenswrapper[4702]: I1203 11:24:46.342934 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-metrics-certs\") pod \"openstack-operator-controller-manager-c98f8bd-8mv9c\" (UID: \"b877c7a7-0b88-4238-8a21-314ef1525996\") " pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" Dec 03 11:24:46 crc kubenswrapper[4702]: I1203 11:24:46.343023 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-webhook-certs\") pod \"openstack-operator-controller-manager-c98f8bd-8mv9c\" (UID: \"b877c7a7-0b88-4238-8a21-314ef1525996\") " pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" Dec 03 11:24:46 crc kubenswrapper[4702]: E1203 11:24:46.343018 4702 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 11:24:46 crc kubenswrapper[4702]: E1203 11:24:46.343120 4702 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 11:24:46 crc kubenswrapper[4702]: E1203 11:24:46.343125 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7faac4b-b558-4106-af27-4daf6a1db1af-cert podName:a7faac4b-b558-4106-af27-4daf6a1db1af nodeName:}" failed. No retries permitted until 2025-12-03 11:24:54.343107915 +0000 UTC m=+1278.179036379 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7faac4b-b558-4106-af27-4daf6a1db1af-cert") pod "infra-operator-controller-manager-57548d458d-98mxd" (UID: "a7faac4b-b558-4106-af27-4daf6a1db1af") : secret "infra-operator-webhook-server-cert" not found Dec 03 11:24:46 crc kubenswrapper[4702]: E1203 11:24:46.343183 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-metrics-certs podName:b877c7a7-0b88-4238-8a21-314ef1525996 nodeName:}" failed. No retries permitted until 2025-12-03 11:24:50.343168326 +0000 UTC m=+1274.179096790 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-metrics-certs") pod "openstack-operator-controller-manager-c98f8bd-8mv9c" (UID: "b877c7a7-0b88-4238-8a21-314ef1525996") : secret "metrics-server-cert" not found Dec 03 11:24:46 crc kubenswrapper[4702]: E1203 11:24:46.343120 4702 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 11:24:46 crc kubenswrapper[4702]: E1203 11:24:46.343213 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-webhook-certs podName:b877c7a7-0b88-4238-8a21-314ef1525996 nodeName:}" failed. No retries permitted until 2025-12-03 11:24:50.343205027 +0000 UTC m=+1274.179133501 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-webhook-certs") pod "openstack-operator-controller-manager-c98f8bd-8mv9c" (UID: "b877c7a7-0b88-4238-8a21-314ef1525996") : secret "webhook-server-cert" not found Dec 03 11:24:46 crc kubenswrapper[4702]: I1203 11:24:46.877956 4702 patch_prober.go:28] interesting pod/metrics-server-6975dd785d-5bvc2 container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.76:10250/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 11:24:46 crc kubenswrapper[4702]: I1203 11:24:46.878030 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" podUID="56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.76:10250/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 11:24:46 crc kubenswrapper[4702]: I1203 11:24:46.878038 4702 patch_prober.go:28] interesting pod/metrics-server-6975dd785d-5bvc2 container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.76:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 11:24:46 crc kubenswrapper[4702]: I1203 11:24:46.878141 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" podUID="56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.76:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 11:24:48 crc kubenswrapper[4702]: I1203 11:24:48.395170 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d86df9d-86a7-4980-abd0-488d98f6b2fb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9\" (UID: \"1d86df9d-86a7-4980-abd0-488d98f6b2fb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" Dec 03 11:24:48 crc kubenswrapper[4702]: E1203 11:24:48.395322 4702 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 11:24:48 crc kubenswrapper[4702]: E1203 11:24:48.396820 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d86df9d-86a7-4980-abd0-488d98f6b2fb-cert podName:1d86df9d-86a7-4980-abd0-488d98f6b2fb nodeName:}" failed. No retries permitted until 2025-12-03 11:24:56.396793701 +0000 UTC m=+1280.232722165 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1d86df9d-86a7-4980-abd0-488d98f6b2fb-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" (UID: "1d86df9d-86a7-4980-abd0-488d98f6b2fb") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 11:24:50 crc kubenswrapper[4702]: I1203 11:24:50.441492 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-metrics-certs\") pod \"openstack-operator-controller-manager-c98f8bd-8mv9c\" (UID: \"b877c7a7-0b88-4238-8a21-314ef1525996\") " pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" Dec 03 11:24:50 crc kubenswrapper[4702]: I1203 11:24:50.442378 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-webhook-certs\") pod \"openstack-operator-controller-manager-c98f8bd-8mv9c\" (UID: \"b877c7a7-0b88-4238-8a21-314ef1525996\") " pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" Dec 03 11:24:50 crc kubenswrapper[4702]: E1203 11:24:50.441929 4702 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 11:24:50 crc kubenswrapper[4702]: E1203 11:24:50.442646 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-metrics-certs podName:b877c7a7-0b88-4238-8a21-314ef1525996 nodeName:}" failed. No retries permitted until 2025-12-03 11:24:58.442623735 +0000 UTC m=+1282.278552199 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-metrics-certs") pod "openstack-operator-controller-manager-c98f8bd-8mv9c" (UID: "b877c7a7-0b88-4238-8a21-314ef1525996") : secret "metrics-server-cert" not found Dec 03 11:24:50 crc kubenswrapper[4702]: E1203 11:24:50.442573 4702 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 11:24:50 crc kubenswrapper[4702]: E1203 11:24:50.443125 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-webhook-certs podName:b877c7a7-0b88-4238-8a21-314ef1525996 nodeName:}" failed. No retries permitted until 2025-12-03 11:24:58.443111159 +0000 UTC m=+1282.279039623 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-webhook-certs") pod "openstack-operator-controller-manager-c98f8bd-8mv9c" (UID: "b877c7a7-0b88-4238-8a21-314ef1525996") : secret "webhook-server-cert" not found Dec 03 11:24:54 crc kubenswrapper[4702]: I1203 11:24:54.459497 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7faac4b-b558-4106-af27-4daf6a1db1af-cert\") pod \"infra-operator-controller-manager-57548d458d-98mxd\" (UID: \"a7faac4b-b558-4106-af27-4daf6a1db1af\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" Dec 03 11:24:54 crc kubenswrapper[4702]: I1203 11:24:54.468576 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7faac4b-b558-4106-af27-4daf6a1db1af-cert\") pod \"infra-operator-controller-manager-57548d458d-98mxd\" (UID: \"a7faac4b-b558-4106-af27-4daf6a1db1af\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" Dec 03 11:24:54 crc kubenswrapper[4702]: I1203 11:24:54.590057 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" Dec 03 11:24:55 crc kubenswrapper[4702]: I1203 11:24:55.908327 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:24:55 crc kubenswrapper[4702]: I1203 11:24:55.908393 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:24:55 crc kubenswrapper[4702]: I1203 11:24:55.908451 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:24:55 crc kubenswrapper[4702]: I1203 11:24:55.910850 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ceadc6ee9df8857f4d601383951a406af62e876eb34d70ea87811eb3741ae2d"} pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 11:24:55 crc kubenswrapper[4702]: I1203 11:24:55.910965 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" containerID="cri-o://3ceadc6ee9df8857f4d601383951a406af62e876eb34d70ea87811eb3741ae2d" gracePeriod=600 Dec 03 11:24:56 crc kubenswrapper[4702]: I1203 11:24:56.417530 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d86df9d-86a7-4980-abd0-488d98f6b2fb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9\" (UID: \"1d86df9d-86a7-4980-abd0-488d98f6b2fb\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" Dec 03 11:24:56 crc kubenswrapper[4702]: I1203 11:24:56.434522 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d86df9d-86a7-4980-abd0-488d98f6b2fb-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9\" (UID: \"1d86df9d-86a7-4980-abd0-488d98f6b2fb\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" Dec 03 11:24:56 crc kubenswrapper[4702]: I1203 11:24:56.585683 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" Dec 03 11:24:56 crc kubenswrapper[4702]: I1203 11:24:56.958098 4702 generic.go:334] "Generic (PLEG): container finished" podID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerID="3ceadc6ee9df8857f4d601383951a406af62e876eb34d70ea87811eb3741ae2d" exitCode=0 Dec 03 11:24:56 crc kubenswrapper[4702]: I1203 11:24:56.958160 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerDied","Data":"3ceadc6ee9df8857f4d601383951a406af62e876eb34d70ea87811eb3741ae2d"} Dec 03 11:24:56 crc kubenswrapper[4702]: I1203 11:24:56.958269 4702 scope.go:117] "RemoveContainer" containerID="1331b116dc6549b4ff553b30c2ad8688204f150ccd437e40dc657307e56d7aa3" Dec 03 11:24:58 crc kubenswrapper[4702]: I1203 11:24:58.453387 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-metrics-certs\") pod \"openstack-operator-controller-manager-c98f8bd-8mv9c\" (UID: \"b877c7a7-0b88-4238-8a21-314ef1525996\") " pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" Dec 03 11:24:58 crc kubenswrapper[4702]: I1203 11:24:58.453897 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-webhook-certs\") pod \"openstack-operator-controller-manager-c98f8bd-8mv9c\" (UID: \"b877c7a7-0b88-4238-8a21-314ef1525996\") " pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" Dec 03 11:24:58 crc kubenswrapper[4702]: I1203 11:24:58.459878 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-webhook-certs\") pod \"openstack-operator-controller-manager-c98f8bd-8mv9c\" (UID: \"b877c7a7-0b88-4238-8a21-314ef1525996\") " pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" Dec 03 11:24:58 crc kubenswrapper[4702]: I1203 11:24:58.459891 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b877c7a7-0b88-4238-8a21-314ef1525996-metrics-certs\") pod \"openstack-operator-controller-manager-c98f8bd-8mv9c\" (UID: \"b877c7a7-0b88-4238-8a21-314ef1525996\") " pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" Dec 03 11:24:58 crc kubenswrapper[4702]: I1203 11:24:58.734388 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" Dec 03 11:24:59 crc kubenswrapper[4702]: E1203 11:24:59.633322 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530" Dec 03 11:24:59 crc kubenswrapper[4702]: E1203 11:24:59.633640 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mpthf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-gqqgw_openstack-operators(1a7e4f08-8a48-44d5-944b-4eaf9d9518b5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:25:06 crc kubenswrapper[4702]: E1203 11:25:06.250644 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7" Dec 03 11:25:06 crc kubenswrapper[4702]: E1203 11:25:06.252179 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-759v8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-2pcqv_openstack-operators(5cecb29f-7ef9-4177-8e01-a776b70bbb03): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:25:06 crc kubenswrapper[4702]: E1203 11:25:06.434864 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d" Dec 03 11:25:06 crc kubenswrapper[4702]: E1203 11:25:06.435144 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 
-3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vds2r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-psnhp_openstack-operators(b6faaca6-f017-42ac-95e4-d73ae3e8e519): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:25:06 crc kubenswrapper[4702]: E1203 11:25:06.996002 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809" Dec 03 11:25:07 crc kubenswrapper[4702]: E1203 11:25:07.119925 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vltk8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987cd8cd-lp88c_openstack-operators(62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:25:07 crc kubenswrapper[4702]: E1203 11:25:07.796481 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85" Dec 03 11:25:07 crc kubenswrapper[4702]: E1203 11:25:07.796716 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-td7jl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-htxmz_openstack-operators(530ef793-9485-4c45-86ba-531906f2085a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:25:15 crc kubenswrapper[4702]: E1203 11:25:15.123410 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea" Dec 03 11:25:15 crc kubenswrapper[4702]: E1203 11:25:15.124207 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l497k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7d9dfd778-m5trg_openstack-operators(8f6320ff-4661-46be-80e1-8d97f09fe789): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:25:15 crc kubenswrapper[4702]: E1203 11:25:15.819896 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 03 11:25:15 crc kubenswrapper[4702]: E1203 11:25:15.820423 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cfghx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-kg6p7_openstack-operators(4b90477f-d1b5-4f03-ab08-2476d44a9cff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:25:16 crc kubenswrapper[4702]: E1203 11:25:16.627217 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 03 11:25:16 crc kubenswrapper[4702]: E1203 11:25:16.627489 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qmf62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-7xg4t_openstack-operators(523c06cc-9816-4252-ac00-dc7928dae009): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:25:17 crc kubenswrapper[4702]: E1203 11:25:17.574155 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 03 11:25:17 crc kubenswrapper[4702]: E1203 11:25:17.574438 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4mhk8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-4pkkr_openstack-operators(224e5de0-3f58-4243-80e5-212cf016ea46): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:25:18 crc kubenswrapper[4702]: E1203 11:25:18.164044 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801" Dec 03 11:25:18 crc kubenswrapper[4702]: E1203 11:25:18.164562 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r9s4t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-w2vmt_openstack-operators(182ca1cb-9499-4cf7-aeae-c35c7038814c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:25:22 crc kubenswrapper[4702]: E1203 11:25:22.246574 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 03 11:25:22 crc kubenswrapper[4702]: E1203 11:25:22.247464 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fj2tn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-m2bfb_openstack-operators(5e7b4134-2b34-4b36-ad61-8e681df197df): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:25:22 crc kubenswrapper[4702]: E1203 11:25:22.937665 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Dec 03 11:25:22 crc kubenswrapper[4702]: E1203 11:25:22.937906 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lx6vj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-t27c4_openstack-operators(1d60d4ab-7bac-4fd1-9aad-c07ba1513d41): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:25:23 crc kubenswrapper[4702]: E1203 11:25:23.989515 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94" Dec 03 11:25:23 crc kubenswrapper[4702]: E1203 11:25:23.990053 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vgcth,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-nj4tn_openstack-operators(afc37ae6-c944-4cb1-81b6-c810ea1c3b31): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:25:24 crc kubenswrapper[4702]: E1203 11:25:24.510705 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 03 11:25:24 crc kubenswrapper[4702]: E1203 11:25:24.510948 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dd2zz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-ntzds_openstack-operators(8de75640-5551-4d04-830d-64f0fbb7847a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:25:24 crc kubenswrapper[4702]: E1203 11:25:24.940457 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.44:5001/openstack-k8s-operators/telemetry-operator:5ac490dd0dd698a3c89e78f18e444fd28325bc7d" Dec 03 11:25:24 crc kubenswrapper[4702]: E1203 11:25:24.940532 4702 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.44:5001/openstack-k8s-operators/telemetry-operator:5ac490dd0dd698a3c89e78f18e444fd28325bc7d" Dec 03 11:25:24 crc kubenswrapper[4702]: E1203 11:25:24.940687 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.44:5001/openstack-k8s-operators/telemetry-operator:5ac490dd0dd698a3c89e78f18e444fd28325bc7d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nbmhw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-f8bdcbf7f-4tp6n_openstack-operators(5edf270b-74cb-42d2-82dc-7953f243c6dc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:25:31 crc kubenswrapper[4702]: I1203 11:25:31.499515 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-98mxd"] Dec 03 11:25:31 crc kubenswrapper[4702]: I1203 11:25:31.512347 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9"] Dec 03 11:25:31 crc kubenswrapper[4702]: I1203 11:25:31.612155 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c"] Dec 03 11:25:33 crc kubenswrapper[4702]: W1203 11:25:33.160172 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb877c7a7_0b88_4238_8a21_314ef1525996.slice/crio-98487fdb6c97612892078e828f816262293a225600a5bc36cdf93d3d6e5872ef WatchSource:0}: Error finding container 98487fdb6c97612892078e828f816262293a225600a5bc36cdf93d3d6e5872ef: Status 404 returned error can't find the container with id 98487fdb6c97612892078e828f816262293a225600a5bc36cdf93d3d6e5872ef Dec 03 11:25:33 crc kubenswrapper[4702]: I1203 11:25:33.362158 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" event={"ID":"1d86df9d-86a7-4980-abd0-488d98f6b2fb","Type":"ContainerStarted","Data":"896fb22ba10055008d379d32b1ad98e466be541b29e17a23a2479ed3d953e4c8"} Dec 03 11:25:33 crc kubenswrapper[4702]: I1203 11:25:33.365093 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"51258198d84bd78a94c0b0549a633061f0278b4c24ceaa27b8c81a77f3277a36"} Dec 03 11:25:33 crc kubenswrapper[4702]: I1203 11:25:33.366860 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" event={"ID":"a7faac4b-b558-4106-af27-4daf6a1db1af","Type":"ContainerStarted","Data":"3577a840cd303d4d3fd3656ef64e191c07c50c3002468e6e964ffe50822114aa"} Dec 03 11:25:33 crc kubenswrapper[4702]: I1203 11:25:33.368217 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" 
event={"ID":"b877c7a7-0b88-4238-8a21-314ef1525996","Type":"ContainerStarted","Data":"98487fdb6c97612892078e828f816262293a225600a5bc36cdf93d3d6e5872ef"} Dec 03 11:25:34 crc kubenswrapper[4702]: I1203 11:25:34.394790 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t" event={"ID":"9b295e92-630f-4544-b741-50ece5e79f4c","Type":"ContainerStarted","Data":"f87a5729b99dd275d68e1d993ff34e33b89e55c2796b91208a6d8802a0d3eae1"} Dec 03 11:25:34 crc kubenswrapper[4702]: I1203 11:25:34.404676 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-hxvr6" event={"ID":"e3c1b694-60b8-4b5d-b8d5-40418e60aa4b","Type":"ContainerStarted","Data":"86953783f54e9199d76396ee15956d161d1b8dda1740343579cae94ce2d5def9"} Dec 03 11:25:34 crc kubenswrapper[4702]: I1203 11:25:34.412630 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf" event={"ID":"84fc908a-9418-4e6e-ac17-9e725524f9ce","Type":"ContainerStarted","Data":"d8c4e0cc738e522968136e6f8f489ad82fb59f039b20f2c64f7e9b0f049cd07c"} Dec 03 11:25:35 crc kubenswrapper[4702]: I1203 11:25:35.426038 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq" event={"ID":"ae6dac10-29ba-4bb8-8a0c-68a2bad519af","Type":"ContainerStarted","Data":"17945cb0a41c87bbba4777c9e2febdbd11b845c886b7607fc60a0bbf1d204637"} Dec 03 11:25:35 crc kubenswrapper[4702]: E1203 11:25:35.446529 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 03 11:25:35 crc kubenswrapper[4702]: E1203 11:25:35.446806 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4tfnb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-ckjgv_openstack-operators(c43c86a0-692f-406f-871a-24a14f24ed77): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:25:35 crc kubenswrapper[4702]: E1203 11:25:35.447906 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ckjgv" podUID="c43c86a0-692f-406f-871a-24a14f24ed77" Dec 03 11:25:36 crc kubenswrapper[4702]: I1203 11:25:36.462278 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" event={"ID":"b877c7a7-0b88-4238-8a21-314ef1525996","Type":"ContainerStarted","Data":"0e840e9da35d051022fe6af1a772dec9473e404d679a1bd2ed6b9aad3fa41562"} Dec 03 11:25:36 crc kubenswrapper[4702]: I1203 11:25:36.464271 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" Dec 03 11:25:36 crc kubenswrapper[4702]: E1203 11:25:36.464860 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ckjgv" podUID="c43c86a0-692f-406f-871a-24a14f24ed77" Dec 03 11:25:36 crc kubenswrapper[4702]: I1203 11:25:36.509890 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" podStartSLOduration=57.509851581 podStartE2EDuration="57.509851581s" podCreationTimestamp="2025-12-03 11:24:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:25:36.494627051 +0000 UTC m=+1320.330555515" watchObservedRunningTime="2025-12-03 11:25:36.509851581 +0000 UTC m=+1320.345780045" Dec 03 11:25:43 crc kubenswrapper[4702]: E1203 11:25:43.329259 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 11:25:43 crc kubenswrapper[4702]: E1203 11:25:43.332174 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qmf62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-7xg4t_openstack-operators(523c06cc-9816-4252-ac00-dc7928dae009): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 03 11:25:43 crc kubenswrapper[4702]: E1203 11:25:43.333719 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t" podUID="523c06cc-9816-4252-ac00-dc7928dae009" Dec 03 11:25:46 crc kubenswrapper[4702]: E1203 11:25:46.359487 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 11:25:46 crc kubenswrapper[4702]: E1203 11:25:46.359992 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nbmhw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-f8bdcbf7f-4tp6n_openstack-operators(5edf270b-74cb-42d2-82dc-7953f243c6dc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:25:46 crc kubenswrapper[4702]: E1203 11:25:46.361306 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n" podUID="5edf270b-74cb-42d2-82dc-7953f243c6dc" Dec 03 11:25:46 crc kubenswrapper[4702]: E1203 11:25:46.362964 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 11:25:46 crc kubenswrapper[4702]: E1203 11:25:46.363091 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vgcth,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-nj4tn_openstack-operators(afc37ae6-c944-4cb1-81b6-c810ea1c3b31): 
ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 03 11:25:46 crc kubenswrapper[4702]: E1203 11:25:46.364642 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn" podUID="afc37ae6-c944-4cb1-81b6-c810ea1c3b31" Dec 03 11:25:46 crc kubenswrapper[4702]: E1203 11:25:46.383280 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:09a6d0613ee2d3c1c809fc36c22678458ac271e0da87c970aec0a5339f5423f7" Dec 03 11:25:46 crc kubenswrapper[4702]: E1203 11:25:46.383484 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:09a6d0613ee2d3c1c809fc36c22678458ac271e0da87c970aec0a5339f5423f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5hqhw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
infra-operator-controller-manager-57548d458d-98mxd_openstack-operators(a7faac4b-b558-4106-af27-4daf6a1db1af): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:25:46 crc kubenswrapper[4702]: E1203 11:25:46.384826 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:723607448b0abc536cd883abffcf6942c1c562a48117db73f6fe693d99395ee2: Get \"https://quay.io/v2/openstack-k8s-operators/kube-rbac-proxy/blobs/sha256:723607448b0abc536cd883abffcf6942c1c562a48117db73f6fe693d99395ee2\": context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 11:25:46 crc kubenswrapper[4702]: E1203 11:25:46.385076 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vds2r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-psnhp_openstack-operators(b6faaca6-f017-42ac-95e4-d73ae3e8e519): ErrImagePull: rpc error: code = Canceled desc = reading blob sha256:723607448b0abc536cd883abffcf6942c1c562a48117db73f6fe693d99395ee2: Get \"https://quay.io/v2/openstack-k8s-operators/kube-rbac-proxy/blobs/sha256:723607448b0abc536cd883abffcf6942c1c562a48117db73f6fe693d99395ee2\": context canceled" logger="UnhandledError" Dec 03 11:25:46 crc kubenswrapper[4702]: E1203 11:25:46.386392 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:723607448b0abc536cd883abffcf6942c1c562a48117db73f6fe693d99395ee2: Get \\\"https://quay.io/v2/openstack-k8s-operators/kube-rbac-proxy/blobs/sha256:723607448b0abc536cd883abffcf6942c1c562a48117db73f6fe693d99395ee2\\\": context canceled\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp" podUID="b6faaca6-f017-42ac-95e4-d73ae3e8e519" Dec 03 11:25:48 crc kubenswrapper[4702]: E1203 11:25:48.434712 4702 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 11:25:48 crc kubenswrapper[4702]: E1203 11:25:48.436405 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vltk8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987cd8cd-lp88c_openstack-operators(62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:25:48 crc kubenswrapper[4702]: E1203 11:25:48.437878 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c" podUID="62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d" Dec 03 11:25:48 crc kubenswrapper[4702]: I1203 11:25:48.743493 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" Dec 03 11:25:51 crc kubenswrapper[4702]: E1203 11:25:51.458937 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-htxmz" podUID="530ef793-9485-4c45-86ba-531906f2085a" Dec 03 11:25:51 crc kubenswrapper[4702]: E1203 11:25:51.460134 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv" podUID="5cecb29f-7ef9-4177-8e01-a776b70bbb03" Dec 03 11:25:51 crc kubenswrapper[4702]: E1203 11:25:51.470151 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:14cfad6ea2e7f7ecc4cb2aafceb9c61514b3d04b66668832d1e4ac3b19f1ab81" Dec 03 11:25:51 crc kubenswrapper[4702]: E1203 11:25:51.470617 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:14cfad6ea2e7f7ecc4cb2aafceb9c61514b3d04b66668832d1e4ac3b19f1ab81,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-
backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPOR
TER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFr
om:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift
-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vt5sx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9_openstack-operators(1d86df9d-86a7-4980-abd0-488d98f6b2fb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:25:51 crc kubenswrapper[4702]: E1203 11:25:51.487214 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 11:25:51 crc kubenswrapper[4702]: E1203 11:25:51.487408 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cfghx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-kg6p7_openstack-operators(4b90477f-d1b5-4f03-ab08-2476d44a9cff): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 03 11:25:51 crc kubenswrapper[4702]: E1203 11:25:51.488618 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7" podUID="4b90477f-d1b5-4f03-ab08-2476d44a9cff" Dec 03 11:25:51 crc kubenswrapper[4702]: E1203 11:25:51.495649 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 11:25:51 crc kubenswrapper[4702]: E1203 11:25:51.495885 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mpthf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-gqqgw_openstack-operators(1a7e4f08-8a48-44d5-944b-4eaf9d9518b5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:25:51 crc kubenswrapper[4702]: E1203 11:25:51.497286 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw" podUID="1a7e4f08-8a48-44d5-944b-4eaf9d9518b5" Dec 03 11:25:51 crc kubenswrapper[4702]: I1203 11:25:51.681800 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" event={"ID":"1d60d4ab-7bac-4fd1-9aad-c07ba1513d41","Type":"ContainerStarted","Data":"5f08388be65e3fb2d7be9372d0395d4ee00165aee13c00256d445314658b72f2"} Dec 03 11:25:51 crc kubenswrapper[4702]: I1203 11:25:51.687494 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv" event={"ID":"5cecb29f-7ef9-4177-8e01-a776b70bbb03","Type":"ContainerStarted","Data":"a29d22ccee6300d4535f645514fbed5554b11a35f12d92efa7f4af32f187fadd"} Dec 03 11:25:51 crc kubenswrapper[4702]: I1203 11:25:51.701370 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-htxmz" event={"ID":"530ef793-9485-4c45-86ba-531906f2085a","Type":"ContainerStarted","Data":"3b0f81fddd8f1387be344d90269cbb736296a37626ce6593a93089a57f08ba9b"} Dec 03 11:25:51 crc kubenswrapper[4702]: I1203 11:25:51.717845 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t" event={"ID":"9b295e92-630f-4544-b741-50ece5e79f4c","Type":"ContainerStarted","Data":"9e4968d4bb5a4d34f4dedf49c8715c8078f0a2dc2f52bde53610a8df24c0ca05"} Dec 03 11:25:51 crc kubenswrapper[4702]: I1203 11:25:51.718972 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t" Dec 03 11:25:51 crc kubenswrapper[4702]: I1203 11:25:51.721992 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-hxvr6" 
event={"ID":"e3c1b694-60b8-4b5d-b8d5-40418e60aa4b","Type":"ContainerStarted","Data":"a9effd041a1e0f660c8f0f22aa177bba424deb7d88b527de835a3e765d427044"} Dec 03 11:25:51 crc kubenswrapper[4702]: I1203 11:25:51.722539 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-hxvr6" Dec 03 11:25:51 crc kubenswrapper[4702]: I1203 11:25:51.725461 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t" Dec 03 11:25:51 crc kubenswrapper[4702]: I1203 11:25:51.731096 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-hxvr6" Dec 03 11:25:51 crc kubenswrapper[4702]: I1203 11:25:51.731735 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf" event={"ID":"84fc908a-9418-4e6e-ac17-9e725524f9ce","Type":"ContainerStarted","Data":"63e9ec66ec0430d60e130750b4f498ca81fc5b6cee8d0ef86783f44399ecc503"} Dec 03 11:25:51 crc kubenswrapper[4702]: I1203 11:25:51.733464 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf" Dec 03 11:25:51 crc kubenswrapper[4702]: I1203 11:25:51.736599 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq" event={"ID":"ae6dac10-29ba-4bb8-8a0c-68a2bad519af","Type":"ContainerStarted","Data":"75dff6b99694e4af3b8353817381490cb88035b47c40929707d6a806ca0f6152"} Dec 03 11:25:51 crc kubenswrapper[4702]: I1203 11:25:51.736644 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq" Dec 03 11:25:51 crc kubenswrapper[4702]: I1203 11:25:51.742085 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf" Dec 03 11:25:51 crc kubenswrapper[4702]: I1203 11:25:51.745279 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq" Dec 03 11:25:51 crc kubenswrapper[4702]: E1203 11:25:51.766837 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" podUID="1d60d4ab-7bac-4fd1-9aad-c07ba1513d41" Dec 03 11:25:51 crc kubenswrapper[4702]: I1203 11:25:51.814856 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-hxvr6" podStartSLOduration=5.273615626 podStartE2EDuration="1m13.81483375s" podCreationTimestamp="2025-12-03 11:24:38 +0000 UTC" firstStartedPulling="2025-12-03 11:24:42.371155649 +0000 UTC m=+1266.207084113" lastFinishedPulling="2025-12-03 11:25:50.912373773 +0000 UTC m=+1334.748302237" observedRunningTime="2025-12-03 11:25:51.795147994 +0000 UTC m=+1335.631076478" watchObservedRunningTime="2025-12-03 11:25:51.81483375 +0000 UTC m=+1335.650762214" Dec 03 11:25:51 crc kubenswrapper[4702]: I1203 11:25:51.855710 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq" podStartSLOduration=5.169773075 podStartE2EDuration="1m12.855688595s" podCreationTimestamp="2025-12-03 11:24:39 +0000 UTC" firstStartedPulling="2025-12-03 11:24:43.227627676 +0000 UTC m=+1267.063556140" lastFinishedPulling="2025-12-03 11:25:50.913543186 +0000 UTC m=+1334.749471660" observedRunningTime="2025-12-03 11:25:51.847615807 +0000 UTC m=+1335.683544301" watchObservedRunningTime="2025-12-03 11:25:51.855688595 +0000 UTC m=+1335.691617059" Dec 03 11:25:51 crc kubenswrapper[4702]: E1203 11:25:51.906321 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr" podUID="224e5de0-3f58-4243-80e5-212cf016ea46" Dec 03 11:25:51 crc kubenswrapper[4702]: E1203 11:25:51.906973 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" podUID="1d86df9d-86a7-4980-abd0-488d98f6b2fb" Dec 03 11:25:51 crc kubenswrapper[4702]: I1203 11:25:51.931997 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t" podStartSLOduration=6.49557912 podStartE2EDuration="1m14.931971521s" podCreationTimestamp="2025-12-03 11:24:37 +0000 UTC" firstStartedPulling="2025-12-03 11:24:42.422451479 +0000 UTC m=+1266.258379943" lastFinishedPulling="2025-12-03 11:25:50.85884388 +0000 UTC m=+1334.694772344" observedRunningTime="2025-12-03 11:25:51.906217853 +0000 UTC m=+1335.742146317" watchObservedRunningTime="2025-12-03 11:25:51.931971521 +0000 UTC m=+1335.767899985" Dec 03 11:25:51 crc kubenswrapper[4702]: I1203 11:25:51.969576 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf" podStartSLOduration=6.305918344 podStartE2EDuration="1m13.969547123s" podCreationTimestamp="2025-12-03 11:24:38 +0000 UTC" firstStartedPulling="2025-12-03 11:24:43.19198619 +0000 UTC m=+1267.027914654" lastFinishedPulling="2025-12-03 11:25:50.855614969 +0000 UTC m=+1334.691543433" observedRunningTime="2025-12-03 11:25:51.944262349 +0000 UTC m=+1335.780190813" watchObservedRunningTime="2025-12-03 11:25:51.969547123 +0000 UTC m=+1335.805475597" Dec 03 11:25:52 crc kubenswrapper[4702]: I1203 11:25:52.751863 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" event={"ID":"1d86df9d-86a7-4980-abd0-488d98f6b2fb","Type":"ContainerStarted","Data":"acf1d89840eeb4476163ea807f35c03c9f9792692a8673aeca6ce4397b1d9e27"} Dec 03 11:25:52 crc kubenswrapper[4702]: I1203 11:25:52.761363 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr" event={"ID":"224e5de0-3f58-4243-80e5-212cf016ea46","Type":"ContainerStarted","Data":"ed9297b665b4b37eb6dd8fc1340fedaad679ebbd06de2e68fc5230fef702730c"} Dec 03 11:25:52 crc kubenswrapper[4702]: E1203 11:25:52.762083 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:14cfad6ea2e7f7ecc4cb2aafceb9c61514b3d04b66668832d1e4ac3b19f1ab81\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" podUID="1d86df9d-86a7-4980-abd0-488d98f6b2fb" Dec 03 11:25:52 crc kubenswrapper[4702]: E1203 11:25:52.836717 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 11:25:52 crc kubenswrapper[4702]: E1203 11:25:52.836934 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dd2zz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-ntzds_openstack-operators(8de75640-5551-4d04-830d-64f0fbb7847a): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 03 11:25:52 crc kubenswrapper[4702]: E1203 11:25:52.838131 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds" podUID="8de75640-5551-4d04-830d-64f0fbb7847a" Dec 03 11:25:53 crc kubenswrapper[4702]: I1203 11:25:53.774637 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c" event={"ID":"62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d","Type":"ContainerStarted","Data":"1e38392069abb63a138f8151a298d7c9f511dc25106660eba9fd42ce806c269b"} Dec 03 11:25:53 crc kubenswrapper[4702]: I1203 11:25:53.777637 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv" event={"ID":"5cecb29f-7ef9-4177-8e01-a776b70bbb03","Type":"ContainerStarted","Data":"d4e57b0ce090fa77b9b12fade4f80ac494f3f9b623fb0b097b162890b3d80d10"} Dec 03 11:25:53 crc 
kubenswrapper[4702]: I1203 11:25:53.778883 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv" Dec 03 11:25:53 crc kubenswrapper[4702]: I1203 11:25:53.784318 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp" event={"ID":"b6faaca6-f017-42ac-95e4-d73ae3e8e519","Type":"ContainerStarted","Data":"9fc58cd58b7eec4e8d1c0d978efb544179b4db4a16029cba91aa2efa1b5d14dc"} Dec 03 11:25:53 crc kubenswrapper[4702]: E1203 11:25:53.786196 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:14cfad6ea2e7f7ecc4cb2aafceb9c61514b3d04b66668832d1e4ac3b19f1ab81\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" podUID="1d86df9d-86a7-4980-abd0-488d98f6b2fb" Dec 03 11:25:53 crc kubenswrapper[4702]: I1203 11:25:53.809612 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv" podStartSLOduration=5.46902295 podStartE2EDuration="1m15.809586112s" podCreationTimestamp="2025-12-03 11:24:38 +0000 UTC" firstStartedPulling="2025-12-03 11:24:42.509126308 +0000 UTC m=+1266.345054782" lastFinishedPulling="2025-12-03 11:25:52.84968948 +0000 UTC m=+1336.685617944" observedRunningTime="2025-12-03 11:25:53.801431511 +0000 UTC m=+1337.637359975" watchObservedRunningTime="2025-12-03 11:25:53.809586112 +0000 UTC m=+1337.645514576" Dec 03 11:25:54 crc kubenswrapper[4702]: E1203 11:25:54.060737 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w2vmt" podUID="182ca1cb-9499-4cf7-aeae-c35c7038814c" Dec 03 11:25:54 crc kubenswrapper[4702]: E1203 11:25:54.214813 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg" podUID="8f6320ff-4661-46be-80e1-8d97f09fe789" Dec 03 11:25:54 crc kubenswrapper[4702]: E1203 11:25:54.353242 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" podUID="a7faac4b-b558-4106-af27-4daf6a1db1af" Dec 03 11:25:54 crc kubenswrapper[4702]: E1203 11:25:54.492598 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-m2bfb" podUID="5e7b4134-2b34-4b36-ad61-8e681df197df" Dec 03 11:25:54 crc kubenswrapper[4702]: I1203 11:25:54.797553 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" 
event={"ID":"a7faac4b-b558-4106-af27-4daf6a1db1af","Type":"ContainerStarted","Data":"f7092736d66970a4c24dd561faa93241ff2bb2424dafc73b9f76cb8cd00147b8"} Dec 03 11:25:54 crc kubenswrapper[4702]: E1203 11:25:54.799834 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:09a6d0613ee2d3c1c809fc36c22678458ac271e0da87c970aec0a5339f5423f7\\\"\"" pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" podUID="a7faac4b-b558-4106-af27-4daf6a1db1af" Dec 03 11:25:54 crc kubenswrapper[4702]: I1203 11:25:54.803205 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp" event={"ID":"b6faaca6-f017-42ac-95e4-d73ae3e8e519","Type":"ContainerStarted","Data":"68576ccc0c7b3222a988d884d1f9e2002dae6136fbf43442a601438db772cc8e"} Dec 03 11:25:54 crc kubenswrapper[4702]: I1203 11:25:54.803345 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp" Dec 03 11:25:54 crc kubenswrapper[4702]: I1203 11:25:54.812449 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c" event={"ID":"62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d","Type":"ContainerStarted","Data":"4beb84db732b8ddf4a6a1511ee029380b27431dea5b647a119af78a7ab73b8f0"} Dec 03 11:25:54 crc kubenswrapper[4702]: I1203 11:25:54.812626 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c" Dec 03 11:25:54 crc kubenswrapper[4702]: I1203 11:25:54.815109 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ckjgv" event={"ID":"c43c86a0-692f-406f-871a-24a14f24ed77","Type":"ContainerStarted","Data":"3de49cac20fba2b5d89e36e8a1febef0771c4a2f6530b0b8118fd1204b4b3467"} Dec 03 11:25:54 crc kubenswrapper[4702]: I1203 11:25:54.824148 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n" event={"ID":"5edf270b-74cb-42d2-82dc-7953f243c6dc","Type":"ContainerStarted","Data":"7efca528e823b01b4fd40beeaaf722a407970beb3251c2dd80e2cd6d65ee167a"} Dec 03 11:25:54 crc kubenswrapper[4702]: I1203 11:25:54.827330 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw" event={"ID":"1a7e4f08-8a48-44d5-944b-4eaf9d9518b5","Type":"ContainerStarted","Data":"9b12470bed66eccb99c82655a1ddea59de204202edd291476d12f76d6b0aec2d"} Dec 03 11:25:54 crc kubenswrapper[4702]: I1203 11:25:54.844395 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7" event={"ID":"4b90477f-d1b5-4f03-ab08-2476d44a9cff","Type":"ContainerStarted","Data":"5be622def27ef22f65bbc45ceb45e75aff6e6caa3a727b8330e5e381c345f18f"} Dec 03 11:25:54 crc kubenswrapper[4702]: I1203 11:25:54.859297 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t" event={"ID":"523c06cc-9816-4252-ac00-dc7928dae009","Type":"ContainerStarted","Data":"e859796e4bcd364bb086f9abd97dfb46d6727b4aaa707a06551a83b1972eadee"} Dec 03 11:25:54 crc kubenswrapper[4702]: I1203 11:25:54.859363 4702 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t" event={"ID":"523c06cc-9816-4252-ac00-dc7928dae009","Type":"ContainerStarted","Data":"55c2f2583dccdbef6194c35e1440f482b93fc94f3162da2f718bb964d13463b9"} Dec 03 11:25:54 crc kubenswrapper[4702]: I1203 11:25:54.859607 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t" Dec 03 11:25:54 crc kubenswrapper[4702]: I1203 11:25:54.865181 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp" podStartSLOduration=7.374562468 podStartE2EDuration="1m16.865158016s" podCreationTimestamp="2025-12-03 11:24:38 +0000 UTC" firstStartedPulling="2025-12-03 11:24:43.220928527 +0000 UTC m=+1267.056856991" lastFinishedPulling="2025-12-03 11:25:52.711524075 +0000 UTC m=+1336.547452539" observedRunningTime="2025-12-03 11:25:54.859381813 +0000 UTC m=+1338.695310277" watchObservedRunningTime="2025-12-03 11:25:54.865158016 +0000 UTC m=+1338.701086490" Dec 03 11:25:54 crc kubenswrapper[4702]: I1203 11:25:54.895262 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w2vmt" event={"ID":"182ca1cb-9499-4cf7-aeae-c35c7038814c","Type":"ContainerStarted","Data":"186b1d061dcf92e5aec09d085b1b9b272c92a06df609499723ddf846601034e0"} Dec 03 11:25:54 crc kubenswrapper[4702]: I1203 11:25:54.899286 4702 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 11:25:54 crc kubenswrapper[4702]: I1203 11:25:54.907448 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg" event={"ID":"8f6320ff-4661-46be-80e1-8d97f09fe789","Type":"ContainerStarted","Data":"3297dc22e07f23cba6b541b63545a6b0a1dfa34c88beec3dbc796959d44c7ea5"} Dec 03 11:25:54 crc kubenswrapper[4702]: I1203 11:25:54.985638 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c" podStartSLOduration=7.371446716 podStartE2EDuration="1m17.985608561s" podCreationTimestamp="2025-12-03 11:24:37 +0000 UTC" firstStartedPulling="2025-12-03 11:24:42.452177479 +0000 UTC m=+1266.288105943" lastFinishedPulling="2025-12-03 11:25:53.066339324 +0000 UTC m=+1336.902267788" observedRunningTime="2025-12-03 11:25:54.907168704 +0000 UTC m=+1338.743097168" watchObservedRunningTime="2025-12-03 11:25:54.985608561 +0000 UTC m=+1338.821537025" Dec 03 11:25:54 crc kubenswrapper[4702]: I1203 11:25:54.999166 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ckjgv" podStartSLOduration=5.929868743 podStartE2EDuration="1m14.999139313s" podCreationTimestamp="2025-12-03 11:24:40 +0000 UTC" firstStartedPulling="2025-12-03 11:24:43.790094504 +0000 UTC m=+1267.626022968" lastFinishedPulling="2025-12-03 11:25:52.859365074 +0000 UTC m=+1336.695293538" observedRunningTime="2025-12-03 11:25:54.971278056 +0000 UTC m=+1338.807206530" watchObservedRunningTime="2025-12-03 11:25:54.999139313 +0000 UTC m=+1338.835067777" Dec 03 11:25:55 crc kubenswrapper[4702]: I1203 11:25:55.057454 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr" Dec 03 
11:25:55 crc kubenswrapper[4702]: I1203 11:25:55.057524 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-m2bfb" event={"ID":"5e7b4134-2b34-4b36-ad61-8e681df197df","Type":"ContainerStarted","Data":"bb24d8a7ee9981ade06dcf4103c37ac6de374a62c871b5e463e49c6042db5257"} Dec 03 11:25:55 crc kubenswrapper[4702]: I1203 11:25:55.057551 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr" event={"ID":"224e5de0-3f58-4243-80e5-212cf016ea46","Type":"ContainerStarted","Data":"179dbf9944d12c5daada8458b3a3d69ede690ca68d5ca87448c664ec2c4efe99"} Dec 03 11:25:55 crc kubenswrapper[4702]: I1203 11:25:55.073230 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-htxmz" event={"ID":"530ef793-9485-4c45-86ba-531906f2085a","Type":"ContainerStarted","Data":"473957d51e70605a3e125b6884cd25b6b52ad82ad8d2552f0a8232dd8adc2563"} Dec 03 11:25:55 crc kubenswrapper[4702]: I1203 11:25:55.073280 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-htxmz" Dec 03 11:25:55 crc kubenswrapper[4702]: I1203 11:25:55.110045 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t" podStartSLOduration=7.3828703430000004 podStartE2EDuration="1m17.110020927s" podCreationTimestamp="2025-12-03 11:24:38 +0000 UTC" firstStartedPulling="2025-12-03 11:24:43.078936764 +0000 UTC m=+1266.914865228" lastFinishedPulling="2025-12-03 11:25:52.806087348 +0000 UTC m=+1336.642015812" observedRunningTime="2025-12-03 11:25:55.093201442 +0000 UTC m=+1338.929129906" watchObservedRunningTime="2025-12-03 11:25:55.110020927 +0000 UTC m=+1338.945949391" Dec 03 11:25:55 crc kubenswrapper[4702]: I1203 11:25:55.200388 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-htxmz" podStartSLOduration=7.551109664 podStartE2EDuration="1m18.200357141s" podCreationTimestamp="2025-12-03 11:24:37 +0000 UTC" firstStartedPulling="2025-12-03 11:24:42.421328737 +0000 UTC m=+1266.257257201" lastFinishedPulling="2025-12-03 11:25:53.070576214 +0000 UTC m=+1336.906504678" observedRunningTime="2025-12-03 11:25:55.166849054 +0000 UTC m=+1339.002777518" watchObservedRunningTime="2025-12-03 11:25:55.200357141 +0000 UTC m=+1339.036285605" Dec 03 11:25:55 crc kubenswrapper[4702]: I1203 11:25:55.209922 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr" podStartSLOduration=6.972222021 podStartE2EDuration="1m18.20990027s" podCreationTimestamp="2025-12-03 11:24:37 +0000 UTC" firstStartedPulling="2025-12-03 11:24:42.449858513 +0000 UTC m=+1266.285786977" lastFinishedPulling="2025-12-03 11:25:53.687536762 +0000 UTC m=+1337.523465226" observedRunningTime="2025-12-03 11:25:55.200004701 +0000 UTC m=+1339.035933205" watchObservedRunningTime="2025-12-03 11:25:55.20990027 +0000 UTC m=+1339.045828734" Dec 03 11:25:56 crc kubenswrapper[4702]: I1203 11:25:56.127592 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg" Dec 03 11:25:56 crc kubenswrapper[4702]: I1203 11:25:56.146974 4702 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n" event={"ID":"5edf270b-74cb-42d2-82dc-7953f243c6dc","Type":"ContainerStarted","Data":"15f669d9c72c494ca6f04b584b6704a3887438a4e982ce4538c95deb9d4dc828"} Dec 03 11:25:56 crc kubenswrapper[4702]: I1203 11:25:56.148100 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n" Dec 03 11:25:56 crc kubenswrapper[4702]: I1203 11:25:56.165845 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" event={"ID":"1d60d4ab-7bac-4fd1-9aad-c07ba1513d41","Type":"ContainerStarted","Data":"570ee7d47eaeeec5374d71e46a9426a01ca42ce6c943930d213b357636ee1310"} Dec 03 11:25:56 crc kubenswrapper[4702]: I1203 11:25:56.166867 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" Dec 03 11:25:56 crc kubenswrapper[4702]: I1203 11:25:56.170370 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg" podStartSLOduration=3.9560678620000003 podStartE2EDuration="1m19.170356638s" podCreationTimestamp="2025-12-03 11:24:37 +0000 UTC" firstStartedPulling="2025-12-03 11:24:40.399992734 +0000 UTC m=+1264.235921198" lastFinishedPulling="2025-12-03 11:25:55.61428151 +0000 UTC m=+1339.450209974" observedRunningTime="2025-12-03 11:25:56.166013715 +0000 UTC m=+1340.001942179" watchObservedRunningTime="2025-12-03 11:25:56.170356638 +0000 UTC m=+1340.006285102" Dec 03 11:25:56 crc kubenswrapper[4702]: I1203 11:25:56.189378 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw" event={"ID":"1a7e4f08-8a48-44d5-944b-4eaf9d9518b5","Type":"ContainerStarted","Data":"632cfcd7281201ca63e78e092818bf7bd8d81f9c09ad50a7cfa8a65b608e2421"} Dec 03 11:25:56 crc kubenswrapper[4702]: I1203 11:25:56.189571 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw" Dec 03 11:25:56 crc kubenswrapper[4702]: I1203 11:25:56.202548 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7" event={"ID":"4b90477f-d1b5-4f03-ab08-2476d44a9cff","Type":"ContainerStarted","Data":"5d41c8ba582b50c3e6406fea0a9f6ec0f9634fd22d5f6c8ced2297c764dc71fa"} Dec 03 11:25:56 crc kubenswrapper[4702]: I1203 11:25:56.203448 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7" Dec 03 11:25:56 crc kubenswrapper[4702]: I1203 11:25:56.206095 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n" podStartSLOduration=7.59239004 podStartE2EDuration="1m17.206066057s" podCreationTimestamp="2025-12-03 11:24:39 +0000 UTC" firstStartedPulling="2025-12-03 11:24:43.192407571 +0000 UTC m=+1267.028336035" lastFinishedPulling="2025-12-03 11:25:52.806083578 +0000 UTC m=+1336.642012052" observedRunningTime="2025-12-03 11:25:56.190130746 +0000 UTC m=+1340.026059210" watchObservedRunningTime="2025-12-03 11:25:56.206066057 +0000 UTC m=+1340.041994531" Dec 03 11:25:56 crc kubenswrapper[4702]: I1203 11:25:56.208141 4702 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w2vmt" Dec 03 11:25:56 crc kubenswrapper[4702]: E1203 11:25:56.215201 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:09a6d0613ee2d3c1c809fc36c22678458ac271e0da87c970aec0a5339f5423f7\\\"\"" pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" podUID="a7faac4b-b558-4106-af27-4daf6a1db1af" Dec 03 11:25:56 crc kubenswrapper[4702]: I1203 11:25:56.225744 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" podStartSLOduration=7.47831365 podStartE2EDuration="1m18.225705142s" podCreationTimestamp="2025-12-03 11:24:38 +0000 UTC" firstStartedPulling="2025-12-03 11:24:43.243648169 +0000 UTC m=+1267.079576633" lastFinishedPulling="2025-12-03 11:25:53.991039661 +0000 UTC m=+1337.826968125" observedRunningTime="2025-12-03 11:25:56.220371611 +0000 UTC m=+1340.056300095" watchObservedRunningTime="2025-12-03 11:25:56.225705142 +0000 UTC m=+1340.061633606" Dec 03 11:25:56 crc kubenswrapper[4702]: I1203 11:25:56.268306 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w2vmt" podStartSLOduration=6.065646529 podStartE2EDuration="1m19.268180023s" podCreationTimestamp="2025-12-03 11:24:37 +0000 UTC" firstStartedPulling="2025-12-03 11:24:42.412613031 +0000 UTC m=+1266.248541495" lastFinishedPulling="2025-12-03 11:25:55.615146525 +0000 UTC m=+1339.451074989" observedRunningTime="2025-12-03 11:25:56.257595264 +0000 UTC m=+1340.093523728" watchObservedRunningTime="2025-12-03 11:25:56.268180023 +0000 UTC m=+1340.104108497" Dec 03 11:25:56 crc kubenswrapper[4702]: I1203 11:25:56.320918 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7" podStartSLOduration=8.726100656 podStartE2EDuration="1m19.320891293s" podCreationTimestamp="2025-12-03 11:24:37 +0000 UTC" firstStartedPulling="2025-12-03 11:24:42.421739639 +0000 UTC m=+1266.257668103" lastFinishedPulling="2025-12-03 11:25:53.016530286 +0000 UTC m=+1336.852458740" observedRunningTime="2025-12-03 11:25:56.319466442 +0000 UTC m=+1340.155394906" watchObservedRunningTime="2025-12-03 11:25:56.320891293 +0000 UTC m=+1340.156819757" Dec 03 11:25:56 crc kubenswrapper[4702]: I1203 11:25:56.351917 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw" podStartSLOduration=8.758260334 podStartE2EDuration="1m19.351893509s" podCreationTimestamp="2025-12-03 11:24:37 +0000 UTC" firstStartedPulling="2025-12-03 11:24:42.470943769 +0000 UTC m=+1266.306872233" lastFinishedPulling="2025-12-03 11:25:53.064576954 +0000 UTC m=+1336.900505408" observedRunningTime="2025-12-03 11:25:56.346718713 +0000 UTC m=+1340.182647177" watchObservedRunningTime="2025-12-03 11:25:56.351893509 +0000 UTC m=+1340.187821973" Dec 03 11:25:57 crc kubenswrapper[4702]: I1203 11:25:57.223816 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w2vmt" 
event={"ID":"182ca1cb-9499-4cf7-aeae-c35c7038814c","Type":"ContainerStarted","Data":"6ef1184c4c68f7dc5e1d367ac85a42328ff44e1699b9a00747f3de549e96858b"} Dec 03 11:25:57 crc kubenswrapper[4702]: I1203 11:25:57.226822 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg" event={"ID":"8f6320ff-4661-46be-80e1-8d97f09fe789","Type":"ContainerStarted","Data":"04dfa9ccbc805c8c141db2c3940506e37b3216be0ec5cac7fc187c84a3ff08a6"} Dec 03 11:25:57 crc kubenswrapper[4702]: I1203 11:25:57.230375 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-m2bfb" event={"ID":"5e7b4134-2b34-4b36-ad61-8e681df197df","Type":"ContainerStarted","Data":"e5d93f7c2aeae05d5b5a0857770119288cc89f383174e584fbe8093a8a53fb6f"} Dec 03 11:25:57 crc kubenswrapper[4702]: I1203 11:25:57.237967 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-m2bfb" Dec 03 11:25:57 crc kubenswrapper[4702]: I1203 11:25:57.287424 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-m2bfb" podStartSLOduration=5.8688015 podStartE2EDuration="1m19.287402771s" podCreationTimestamp="2025-12-03 11:24:38 +0000 UTC" firstStartedPulling="2025-12-03 11:24:42.521017945 +0000 UTC m=+1266.356946409" lastFinishedPulling="2025-12-03 11:25:55.939619216 +0000 UTC m=+1339.775547680" observedRunningTime="2025-12-03 11:25:57.276277676 +0000 UTC m=+1341.112206150" watchObservedRunningTime="2025-12-03 11:25:57.287402771 +0000 UTC m=+1341.123331235" Dec 03 11:25:58 crc kubenswrapper[4702]: I1203 11:25:58.241236 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7" Dec 03 11:25:58 crc kubenswrapper[4702]: I1203 11:25:58.242198 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n" Dec 03 11:25:58 crc kubenswrapper[4702]: I1203 11:25:58.559132 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c" Dec 03 11:25:58 crc kubenswrapper[4702]: I1203 11:25:58.578720 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw" Dec 03 11:25:59 crc kubenswrapper[4702]: I1203 11:25:59.510668 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t" Dec 03 11:25:59 crc kubenswrapper[4702]: I1203 11:25:59.510727 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" Dec 03 11:25:59 crc kubenswrapper[4702]: I1203 11:25:59.576823 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp" Dec 03 11:25:59 crc kubenswrapper[4702]: I1203 11:25:59.702740 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv" Dec 03 11:26:02 crc kubenswrapper[4702]: I1203 11:26:02.274859 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn" event={"ID":"afc37ae6-c944-4cb1-81b6-c810ea1c3b31","Type":"ContainerStarted","Data":"70a76afb206f6c53eff7405e07515a63e8c1dfbc49a5e965c1f1b8cdfa8d3ce2"} Dec 03 11:26:02 crc kubenswrapper[4702]: I1203 11:26:02.274902 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn" event={"ID":"afc37ae6-c944-4cb1-81b6-c810ea1c3b31","Type":"ContainerStarted","Data":"a5f052b45bce1deebc15a1f99f27c9cd83479bc87073cdc3ea74ce69db473dbc"} Dec 03 11:26:02 crc kubenswrapper[4702]: I1203 11:26:02.276249 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn" Dec 03 11:26:02 crc kubenswrapper[4702]: I1203 11:26:02.297771 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn" podStartSLOduration=5.14023971 podStartE2EDuration="1m23.297716395s" podCreationTimestamp="2025-12-03 11:24:39 +0000 UTC" firstStartedPulling="2025-12-03 11:24:43.268862871 +0000 UTC m=+1267.104791335" lastFinishedPulling="2025-12-03 11:26:01.426339546 +0000 UTC m=+1345.262268020" observedRunningTime="2025-12-03 11:26:02.29650416 +0000 UTC m=+1346.132432624" watchObservedRunningTime="2025-12-03 11:26:02.297716395 +0000 UTC m=+1346.133644859" Dec 03 11:26:06 crc kubenswrapper[4702]: I1203 11:26:06.312775 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" event={"ID":"1d86df9d-86a7-4980-abd0-488d98f6b2fb","Type":"ContainerStarted","Data":"ff2e288d038ac5bd4e6dbbd98fd8278b38b12e1e35384bae5a54dcddc68654c1"} Dec 03 11:26:06 crc kubenswrapper[4702]: I1203 11:26:06.313928 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" Dec 03 11:26:06 crc kubenswrapper[4702]: I1203 11:26:06.347999 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" podStartSLOduration=55.804786315 podStartE2EDuration="1m28.347970814s" podCreationTimestamp="2025-12-03 11:24:38 +0000 UTC" firstStartedPulling="2025-12-03 11:25:33.160554535 +0000 UTC m=+1316.996482999" lastFinishedPulling="2025-12-03 11:26:05.703739034 +0000 UTC m=+1349.539667498" observedRunningTime="2025-12-03 11:26:06.345253917 +0000 UTC m=+1350.181182381" watchObservedRunningTime="2025-12-03 11:26:06.347970814 +0000 UTC m=+1350.183899288" Dec 03 11:26:08 crc kubenswrapper[4702]: I1203 11:26:08.172213 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg" Dec 03 11:26:08 crc kubenswrapper[4702]: I1203 11:26:08.186467 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-htxmz" Dec 03 11:26:08 crc kubenswrapper[4702]: I1203 11:26:08.249674 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w2vmt" Dec 03 11:26:08 crc kubenswrapper[4702]: I1203 11:26:08.341470 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" 
event={"ID":"a7faac4b-b558-4106-af27-4daf6a1db1af","Type":"ContainerStarted","Data":"2592b010473ade5473c405f0122b4c1439f84cc64cba400976159766ced15210"} Dec 03 11:26:08 crc kubenswrapper[4702]: I1203 11:26:08.342159 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" Dec 03 11:26:08 crc kubenswrapper[4702]: I1203 11:26:08.372534 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" podStartSLOduration=56.922170931 podStartE2EDuration="1m31.372486206s" podCreationTimestamp="2025-12-03 11:24:37 +0000 UTC" firstStartedPulling="2025-12-03 11:25:33.156830359 +0000 UTC m=+1316.992758823" lastFinishedPulling="2025-12-03 11:26:07.607145634 +0000 UTC m=+1351.443074098" observedRunningTime="2025-12-03 11:26:08.368156054 +0000 UTC m=+1352.204084528" watchObservedRunningTime="2025-12-03 11:26:08.372486206 +0000 UTC m=+1352.208414670" Dec 03 11:26:08 crc kubenswrapper[4702]: I1203 11:26:08.576084 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr" Dec 03 11:26:09 crc kubenswrapper[4702]: I1203 11:26:09.354885 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds" event={"ID":"8de75640-5551-4d04-830d-64f0fbb7847a","Type":"ContainerStarted","Data":"30bb875fa0327e4b7cde1ffd453837e3ab22ac7099b54ad18683bd8c675c08ab"} Dec 03 11:26:09 crc kubenswrapper[4702]: I1203 11:26:09.355260 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds" event={"ID":"8de75640-5551-4d04-830d-64f0fbb7847a","Type":"ContainerStarted","Data":"90043828fd23987e4d8ab1daec86921e7428498330c4d66bbe0c66f1386100c2"} Dec 03 11:26:09 crc kubenswrapper[4702]: I1203 11:26:09.356212 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds" Dec 03 11:26:09 crc kubenswrapper[4702]: I1203 11:26:09.389382 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds" podStartSLOduration=5.949325053 podStartE2EDuration="1m31.389360377s" podCreationTimestamp="2025-12-03 11:24:38 +0000 UTC" firstStartedPulling="2025-12-03 11:24:43.260789543 +0000 UTC m=+1267.096718017" lastFinishedPulling="2025-12-03 11:26:08.700824877 +0000 UTC m=+1352.536753341" observedRunningTime="2025-12-03 11:26:09.388845052 +0000 UTC m=+1353.224773526" watchObservedRunningTime="2025-12-03 11:26:09.389360377 +0000 UTC m=+1353.225288841" Dec 03 11:26:09 crc kubenswrapper[4702]: I1203 11:26:09.510457 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-m2bfb" Dec 03 11:26:10 crc kubenswrapper[4702]: I1203 11:26:10.088600 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn" Dec 03 11:26:14 crc kubenswrapper[4702]: I1203 11:26:14.597806 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" Dec 03 11:26:16 crc kubenswrapper[4702]: I1203 11:26:16.592185 4702 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" Dec 03 11:26:19 crc kubenswrapper[4702]: I1203 11:26:19.560387 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds" Dec 03 11:26:22 crc kubenswrapper[4702]: I1203 11:26:22.756501 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8j722"] Dec 03 11:26:22 crc kubenswrapper[4702]: I1203 11:26:22.758626 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8j722" Dec 03 11:26:22 crc kubenswrapper[4702]: I1203 11:26:22.774421 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv77g\" (UniqueName: \"kubernetes.io/projected/51a412e7-4275-4445-a558-0db52234278e-kube-api-access-qv77g\") pod \"redhat-operators-8j722\" (UID: \"51a412e7-4275-4445-a558-0db52234278e\") " pod="openshift-marketplace/redhat-operators-8j722" Dec 03 11:26:22 crc kubenswrapper[4702]: I1203 11:26:22.774827 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a412e7-4275-4445-a558-0db52234278e-utilities\") pod \"redhat-operators-8j722\" (UID: \"51a412e7-4275-4445-a558-0db52234278e\") " pod="openshift-marketplace/redhat-operators-8j722" Dec 03 11:26:22 crc kubenswrapper[4702]: I1203 11:26:22.774870 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a412e7-4275-4445-a558-0db52234278e-catalog-content\") pod \"redhat-operators-8j722\" (UID: \"51a412e7-4275-4445-a558-0db52234278e\") " pod="openshift-marketplace/redhat-operators-8j722" Dec 03 11:26:22 crc kubenswrapper[4702]: I1203 11:26:22.779624 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8j722"] Dec 03 11:26:22 crc kubenswrapper[4702]: I1203 11:26:22.876342 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv77g\" (UniqueName: \"kubernetes.io/projected/51a412e7-4275-4445-a558-0db52234278e-kube-api-access-qv77g\") pod \"redhat-operators-8j722\" (UID: \"51a412e7-4275-4445-a558-0db52234278e\") " pod="openshift-marketplace/redhat-operators-8j722" Dec 03 11:26:22 crc kubenswrapper[4702]: I1203 11:26:22.876400 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a412e7-4275-4445-a558-0db52234278e-utilities\") pod \"redhat-operators-8j722\" (UID: \"51a412e7-4275-4445-a558-0db52234278e\") " pod="openshift-marketplace/redhat-operators-8j722" Dec 03 11:26:22 crc kubenswrapper[4702]: I1203 11:26:22.876440 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a412e7-4275-4445-a558-0db52234278e-catalog-content\") pod \"redhat-operators-8j722\" (UID: \"51a412e7-4275-4445-a558-0db52234278e\") " pod="openshift-marketplace/redhat-operators-8j722" Dec 03 11:26:22 crc kubenswrapper[4702]: I1203 11:26:22.877132 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a412e7-4275-4445-a558-0db52234278e-catalog-content\") pod 
\"redhat-operators-8j722\" (UID: \"51a412e7-4275-4445-a558-0db52234278e\") " pod="openshift-marketplace/redhat-operators-8j722" Dec 03 11:26:22 crc kubenswrapper[4702]: I1203 11:26:22.877181 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a412e7-4275-4445-a558-0db52234278e-utilities\") pod \"redhat-operators-8j722\" (UID: \"51a412e7-4275-4445-a558-0db52234278e\") " pod="openshift-marketplace/redhat-operators-8j722" Dec 03 11:26:22 crc kubenswrapper[4702]: I1203 11:26:22.907991 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv77g\" (UniqueName: \"kubernetes.io/projected/51a412e7-4275-4445-a558-0db52234278e-kube-api-access-qv77g\") pod \"redhat-operators-8j722\" (UID: \"51a412e7-4275-4445-a558-0db52234278e\") " pod="openshift-marketplace/redhat-operators-8j722" Dec 03 11:26:23 crc kubenswrapper[4702]: I1203 11:26:23.087263 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8j722" Dec 03 11:26:23 crc kubenswrapper[4702]: I1203 11:26:23.677149 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8j722"] Dec 03 11:26:23 crc kubenswrapper[4702]: W1203 11:26:23.684469 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51a412e7_4275_4445_a558_0db52234278e.slice/crio-6b7f8c7ca8d806f0c01d886eac462815e4e5deb8a42b60a15792aa4225041d7c WatchSource:0}: Error finding container 6b7f8c7ca8d806f0c01d886eac462815e4e5deb8a42b60a15792aa4225041d7c: Status 404 returned error can't find the container with id 6b7f8c7ca8d806f0c01d886eac462815e4e5deb8a42b60a15792aa4225041d7c Dec 03 11:26:24 crc kubenswrapper[4702]: I1203 11:26:24.492137 4702 generic.go:334] "Generic (PLEG): container finished" podID="51a412e7-4275-4445-a558-0db52234278e" containerID="5d665785b3bbd192b66640e6a1a08b3bb251e693022769edc4b78f7b209dca89" exitCode=0 Dec 03 11:26:24 crc kubenswrapper[4702]: I1203 11:26:24.492182 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8j722" event={"ID":"51a412e7-4275-4445-a558-0db52234278e","Type":"ContainerDied","Data":"5d665785b3bbd192b66640e6a1a08b3bb251e693022769edc4b78f7b209dca89"} Dec 03 11:26:24 crc kubenswrapper[4702]: I1203 11:26:24.492562 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8j722" event={"ID":"51a412e7-4275-4445-a558-0db52234278e","Type":"ContainerStarted","Data":"6b7f8c7ca8d806f0c01d886eac462815e4e5deb8a42b60a15792aa4225041d7c"} Dec 03 11:26:26 crc kubenswrapper[4702]: I1203 11:26:26.528251 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8j722" event={"ID":"51a412e7-4275-4445-a558-0db52234278e","Type":"ContainerStarted","Data":"0c8b9c373e7d8e582eea14528ba03533bfeff2d3f2f2967ced0a29933de5332d"} Dec 03 11:26:29 crc kubenswrapper[4702]: I1203 11:26:29.559602 4702 generic.go:334] "Generic (PLEG): container finished" podID="51a412e7-4275-4445-a558-0db52234278e" containerID="0c8b9c373e7d8e582eea14528ba03533bfeff2d3f2f2967ced0a29933de5332d" exitCode=0 Dec 03 11:26:29 crc kubenswrapper[4702]: I1203 11:26:29.559672 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8j722" 
event={"ID":"51a412e7-4275-4445-a558-0db52234278e","Type":"ContainerDied","Data":"0c8b9c373e7d8e582eea14528ba03533bfeff2d3f2f2967ced0a29933de5332d"} Dec 03 11:26:30 crc kubenswrapper[4702]: I1203 11:26:30.575997 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8j722" event={"ID":"51a412e7-4275-4445-a558-0db52234278e","Type":"ContainerStarted","Data":"783a44c081b159c829b5a00a3a1612e6eabeac3641bc7cc5b2bbe2d816d47350"} Dec 03 11:26:30 crc kubenswrapper[4702]: I1203 11:26:30.605877 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8j722" podStartSLOduration=3.083638488 podStartE2EDuration="8.60585768s" podCreationTimestamp="2025-12-03 11:26:22 +0000 UTC" firstStartedPulling="2025-12-03 11:26:24.494015048 +0000 UTC m=+1368.329943512" lastFinishedPulling="2025-12-03 11:26:30.01623423 +0000 UTC m=+1373.852162704" observedRunningTime="2025-12-03 11:26:30.600142768 +0000 UTC m=+1374.436071232" watchObservedRunningTime="2025-12-03 11:26:30.60585768 +0000 UTC m=+1374.441786144" Dec 03 11:26:33 crc kubenswrapper[4702]: I1203 11:26:33.087491 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8j722" Dec 03 11:26:33 crc kubenswrapper[4702]: I1203 11:26:33.087872 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8j722" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.136833 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8j722" podUID="51a412e7-4275-4445-a558-0db52234278e" containerName="registry-server" probeResult="failure" output=< Dec 03 11:26:34 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 11:26:34 crc kubenswrapper[4702]: > Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.444973 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6p7s4"] Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.448525 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6p7s4" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.455888 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.455884 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.456479 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.456656 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-4r7xd" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.463919 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6p7s4"] Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.581351 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fzfg\" (UniqueName: \"kubernetes.io/projected/41674587-555a-4676-bd26-6732bdbb594b-kube-api-access-9fzfg\") pod \"dnsmasq-dns-675f4bcbfc-6p7s4\" (UID: \"41674587-555a-4676-bd26-6732bdbb594b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6p7s4" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.581918 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41674587-555a-4676-bd26-6732bdbb594b-config\") pod \"dnsmasq-dns-675f4bcbfc-6p7s4\" (UID: \"41674587-555a-4676-bd26-6732bdbb594b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6p7s4" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.595083 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-f6zjj"] Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.596720 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-f6zjj" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.694421 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.697935 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41674587-555a-4676-bd26-6732bdbb594b-config\") pod \"dnsmasq-dns-675f4bcbfc-6p7s4\" (UID: \"41674587-555a-4676-bd26-6732bdbb594b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6p7s4" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.698032 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-f6zjj\" (UID: \"f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84\") " pod="openstack/dnsmasq-dns-78dd6ddcc-f6zjj" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.698144 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fzfg\" (UniqueName: \"kubernetes.io/projected/41674587-555a-4676-bd26-6732bdbb594b-kube-api-access-9fzfg\") pod \"dnsmasq-dns-675f4bcbfc-6p7s4\" (UID: \"41674587-555a-4676-bd26-6732bdbb594b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6p7s4" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.698206 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9djd8\" (UniqueName: \"kubernetes.io/projected/f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84-kube-api-access-9djd8\") pod \"dnsmasq-dns-78dd6ddcc-f6zjj\" (UID: \"f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84\") " pod="openstack/dnsmasq-dns-78dd6ddcc-f6zjj" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.698259 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84-config\") pod \"dnsmasq-dns-78dd6ddcc-f6zjj\" (UID: \"f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84\") " pod="openstack/dnsmasq-dns-78dd6ddcc-f6zjj" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.699826 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-f6zjj"] Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.700420 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41674587-555a-4676-bd26-6732bdbb594b-config\") pod \"dnsmasq-dns-675f4bcbfc-6p7s4\" (UID: \"41674587-555a-4676-bd26-6732bdbb594b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6p7s4" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.749800 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fzfg\" (UniqueName: \"kubernetes.io/projected/41674587-555a-4676-bd26-6732bdbb594b-kube-api-access-9fzfg\") pod \"dnsmasq-dns-675f4bcbfc-6p7s4\" (UID: \"41674587-555a-4676-bd26-6732bdbb594b\") " pod="openstack/dnsmasq-dns-675f4bcbfc-6p7s4" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.770301 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6p7s4" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.800899 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9djd8\" (UniqueName: \"kubernetes.io/projected/f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84-kube-api-access-9djd8\") pod \"dnsmasq-dns-78dd6ddcc-f6zjj\" (UID: \"f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84\") " pod="openstack/dnsmasq-dns-78dd6ddcc-f6zjj" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.800964 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84-config\") pod \"dnsmasq-dns-78dd6ddcc-f6zjj\" (UID: \"f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84\") " pod="openstack/dnsmasq-dns-78dd6ddcc-f6zjj" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.801030 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-f6zjj\" (UID: \"f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84\") " pod="openstack/dnsmasq-dns-78dd6ddcc-f6zjj" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.802048 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84-config\") pod \"dnsmasq-dns-78dd6ddcc-f6zjj\" (UID: \"f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84\") " pod="openstack/dnsmasq-dns-78dd6ddcc-f6zjj" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.802744 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-f6zjj\" (UID: \"f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84\") " pod="openstack/dnsmasq-dns-78dd6ddcc-f6zjj" Dec 03 11:26:34 crc kubenswrapper[4702]: I1203 11:26:34.817946 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9djd8\" (UniqueName: \"kubernetes.io/projected/f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84-kube-api-access-9djd8\") pod \"dnsmasq-dns-78dd6ddcc-f6zjj\" (UID: \"f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84\") " pod="openstack/dnsmasq-dns-78dd6ddcc-f6zjj" Dec 03 11:26:35 crc kubenswrapper[4702]: I1203 11:26:35.025178 4702 util.go:30] "No sandbox for pod can be found. 
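The run above shows the kubelet's volume reconciler walking each dnsmasq pod's volumes through the same three observable phases: operationExecutor.VerifyControllerAttachedVolume (reconciler_common.go:245), operationExecutor.MountVolume started (reconciler_common.go:218), and MountVolume.SetUp succeeded (operation_generator.go:637). A minimal sketch for checking that no volume stalls between phases, assuming kubelet journal lines are piped in on stdin; the program and its regex are mine, not kubelet code:

// volphases.go — a minimal sketch (not kubelet code) that groups the three
// reconciler log phases seen above by volume name, assuming journal lines
// are piped in on stdin (e.g. from journalctl -u kubelet).
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var phase = regexp.MustCompile(
	`(VerifyControllerAttachedVolume started|MountVolume started|MountVolume\.SetUp succeeded) for volume \\?"([^"\\]+)\\?"`)

func main() {
	seen := map[string][]string{} // volume name -> phases in order of appearance
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		if m := phase.FindStringSubmatch(sc.Text()); m != nil {
			seen[m[2]] = append(seen[m[2]], m[1])
		}
	}
	for vol, phases := range seen {
		// A healthy volume shows all three phases; anything left at
		// "MountVolume started" with no "SetUp succeeded" is worth a look.
		fmt.Printf("%-40s %v\n", vol, phases)
	}
}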
Dec 03 11:26:35 crc kubenswrapper[4702]: I1203 11:26:35.910923 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6p7s4"]
Dec 03 11:26:36 crc kubenswrapper[4702]: I1203 11:26:36.033467 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-f6zjj"]
Dec 03 11:26:36 crc kubenswrapper[4702]: I1203 11:26:36.748362 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-6p7s4" event={"ID":"41674587-555a-4676-bd26-6732bdbb594b","Type":"ContainerStarted","Data":"ae691aa75a50ca76e3a2f18463e5581eac952b0f12320a755a338b7d751b2ef9"}
Dec 03 11:26:36 crc kubenswrapper[4702]: I1203 11:26:36.751687 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-f6zjj" event={"ID":"f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84","Type":"ContainerStarted","Data":"bc69edc23de4e207e0a2102c082b674d105e7b86e73cde533e54c7a589e80221"}
Dec 03 11:26:37 crc kubenswrapper[4702]: I1203 11:26:37.340884 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6p7s4"]
Dec 03 11:26:37 crc kubenswrapper[4702]: I1203 11:26:37.369237 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9z8nk"]
Dec 03 11:26:37 crc kubenswrapper[4702]: I1203 11:26:37.373206 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9z8nk"
Dec 03 11:26:37 crc kubenswrapper[4702]: I1203 11:26:37.387112 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9z8nk"]
Dec 03 11:26:37 crc kubenswrapper[4702]: I1203 11:26:37.404441 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68d6c68f-856a-48b5-8ca3-e1165b430d65-config\") pod \"dnsmasq-dns-666b6646f7-9z8nk\" (UID: \"68d6c68f-856a-48b5-8ca3-e1165b430d65\") " pod="openstack/dnsmasq-dns-666b6646f7-9z8nk"
Dec 03 11:26:37 crc kubenswrapper[4702]: I1203 11:26:37.404534 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s2fp\" (UniqueName: \"kubernetes.io/projected/68d6c68f-856a-48b5-8ca3-e1165b430d65-kube-api-access-7s2fp\") pod \"dnsmasq-dns-666b6646f7-9z8nk\" (UID: \"68d6c68f-856a-48b5-8ca3-e1165b430d65\") " pod="openstack/dnsmasq-dns-666b6646f7-9z8nk"
Dec 03 11:26:37 crc kubenswrapper[4702]: I1203 11:26:37.404816 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68d6c68f-856a-48b5-8ca3-e1165b430d65-dns-svc\") pod \"dnsmasq-dns-666b6646f7-9z8nk\" (UID: \"68d6c68f-856a-48b5-8ca3-e1165b430d65\") " pod="openstack/dnsmasq-dns-666b6646f7-9z8nk"
Dec 03 11:26:37 crc kubenswrapper[4702]: I1203 11:26:37.509105 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68d6c68f-856a-48b5-8ca3-e1165b430d65-dns-svc\") pod \"dnsmasq-dns-666b6646f7-9z8nk\" (UID: \"68d6c68f-856a-48b5-8ca3-e1165b430d65\") " pod="openstack/dnsmasq-dns-666b6646f7-9z8nk"
Dec 03 11:26:37 crc kubenswrapper[4702]: I1203 11:26:37.509302 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68d6c68f-856a-48b5-8ca3-e1165b430d65-config\") pod \"dnsmasq-dns-666b6646f7-9z8nk\" (UID: \"68d6c68f-856a-48b5-8ca3-e1165b430d65\") " pod="openstack/dnsmasq-dns-666b6646f7-9z8nk"
Dec 03 11:26:37 crc kubenswrapper[4702]: I1203 11:26:37.509338 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s2fp\" (UniqueName: \"kubernetes.io/projected/68d6c68f-856a-48b5-8ca3-e1165b430d65-kube-api-access-7s2fp\") pod \"dnsmasq-dns-666b6646f7-9z8nk\" (UID: \"68d6c68f-856a-48b5-8ca3-e1165b430d65\") " pod="openstack/dnsmasq-dns-666b6646f7-9z8nk"
Dec 03 11:26:37 crc kubenswrapper[4702]: I1203 11:26:37.510278 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68d6c68f-856a-48b5-8ca3-e1165b430d65-dns-svc\") pod \"dnsmasq-dns-666b6646f7-9z8nk\" (UID: \"68d6c68f-856a-48b5-8ca3-e1165b430d65\") " pod="openstack/dnsmasq-dns-666b6646f7-9z8nk"
Dec 03 11:26:37 crc kubenswrapper[4702]: I1203 11:26:37.510737 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68d6c68f-856a-48b5-8ca3-e1165b430d65-config\") pod \"dnsmasq-dns-666b6646f7-9z8nk\" (UID: \"68d6c68f-856a-48b5-8ca3-e1165b430d65\") " pod="openstack/dnsmasq-dns-666b6646f7-9z8nk"
Dec 03 11:26:37 crc kubenswrapper[4702]: I1203 11:26:37.532740 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s2fp\" (UniqueName: \"kubernetes.io/projected/68d6c68f-856a-48b5-8ca3-e1165b430d65-kube-api-access-7s2fp\") pod \"dnsmasq-dns-666b6646f7-9z8nk\" (UID: \"68d6c68f-856a-48b5-8ca3-e1165b430d65\") " pod="openstack/dnsmasq-dns-666b6646f7-9z8nk"
Dec 03 11:26:37 crc kubenswrapper[4702]: I1203 11:26:37.703834 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9z8nk"
Dec 03 11:26:37 crc kubenswrapper[4702]: I1203 11:26:37.791231 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-f6zjj"]
Dec 03 11:26:37 crc kubenswrapper[4702]: I1203 11:26:37.848141 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t2plx"]
Dec 03 11:26:37 crc kubenswrapper[4702]: I1203 11:26:37.850284 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-t2plx"
Dec 03 11:26:37 crc kubenswrapper[4702]: I1203 11:26:37.899914 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t2plx"]
Dec 03 11:26:37 crc kubenswrapper[4702]: I1203 11:26:37.923879 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da1dab47-7e05-48f7-84dc-7747cfa50aa8-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-t2plx\" (UID: \"da1dab47-7e05-48f7-84dc-7747cfa50aa8\") " pod="openstack/dnsmasq-dns-57d769cc4f-t2plx"
Dec 03 11:26:37 crc kubenswrapper[4702]: I1203 11:26:37.923965 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqxm5\" (UniqueName: \"kubernetes.io/projected/da1dab47-7e05-48f7-84dc-7747cfa50aa8-kube-api-access-hqxm5\") pod \"dnsmasq-dns-57d769cc4f-t2plx\" (UID: \"da1dab47-7e05-48f7-84dc-7747cfa50aa8\") " pod="openstack/dnsmasq-dns-57d769cc4f-t2plx"
Dec 03 11:26:37 crc kubenswrapper[4702]: I1203 11:26:37.924020 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da1dab47-7e05-48f7-84dc-7747cfa50aa8-config\") pod \"dnsmasq-dns-57d769cc4f-t2plx\" (UID: \"da1dab47-7e05-48f7-84dc-7747cfa50aa8\") " pod="openstack/dnsmasq-dns-57d769cc4f-t2plx"
Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.027426 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da1dab47-7e05-48f7-84dc-7747cfa50aa8-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-t2plx\" (UID: \"da1dab47-7e05-48f7-84dc-7747cfa50aa8\") " pod="openstack/dnsmasq-dns-57d769cc4f-t2plx"
Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.028173 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqxm5\" (UniqueName: \"kubernetes.io/projected/da1dab47-7e05-48f7-84dc-7747cfa50aa8-kube-api-access-hqxm5\") pod \"dnsmasq-dns-57d769cc4f-t2plx\" (UID: \"da1dab47-7e05-48f7-84dc-7747cfa50aa8\") " pod="openstack/dnsmasq-dns-57d769cc4f-t2plx"
Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.028327 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da1dab47-7e05-48f7-84dc-7747cfa50aa8-config\") pod \"dnsmasq-dns-57d769cc4f-t2plx\" (UID: \"da1dab47-7e05-48f7-84dc-7747cfa50aa8\") " pod="openstack/dnsmasq-dns-57d769cc4f-t2plx"
Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.030002 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da1dab47-7e05-48f7-84dc-7747cfa50aa8-config\") pod \"dnsmasq-dns-57d769cc4f-t2plx\" (UID: \"da1dab47-7e05-48f7-84dc-7747cfa50aa8\") " pod="openstack/dnsmasq-dns-57d769cc4f-t2plx"
Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.032855 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da1dab47-7e05-48f7-84dc-7747cfa50aa8-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-t2plx\" (UID: \"da1dab47-7e05-48f7-84dc-7747cfa50aa8\") " pod="openstack/dnsmasq-dns-57d769cc4f-t2plx"
Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.078905 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqxm5\" (UniqueName: \"kubernetes.io/projected/da1dab47-7e05-48f7-84dc-7747cfa50aa8-kube-api-access-hqxm5\") pod \"dnsmasq-dns-57d769cc4f-t2plx\" (UID: \"da1dab47-7e05-48f7-84dc-7747cfa50aa8\") " pod="openstack/dnsmasq-dns-57d769cc4f-t2plx"
Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.343347 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-t2plx"
Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.537368 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.540797 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.543186 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.547206 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.547216 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.547505 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.547802 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-wmbwc"
Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.547948 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.548517 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.619149 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.669803 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0"
Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.669881 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0"
Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.669905 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/85f53e1b-50d1-4249-ba44-5b2e5982ae36-server-conf\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0"
Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.669955 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/85f53e1b-50d1-4249-ba44-5b2e5982ae36-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0"
pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.669985 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.670032 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.670106 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85f53e1b-50d1-4249-ba44-5b2e5982ae36-config-data\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.670142 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/85f53e1b-50d1-4249-ba44-5b2e5982ae36-pod-info\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.670239 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.670272 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/85f53e1b-50d1-4249-ba44-5b2e5982ae36-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.670319 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jfm9\" (UniqueName: \"kubernetes.io/projected/85f53e1b-50d1-4249-ba44-5b2e5982ae36-kube-api-access-2jfm9\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.771495 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jfm9\" (UniqueName: \"kubernetes.io/projected/85f53e1b-50d1-4249-ba44-5b2e5982ae36-kube-api-access-2jfm9\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.771595 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 
11:26:38.771619 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.771642 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/85f53e1b-50d1-4249-ba44-5b2e5982ae36-server-conf\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.771663 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/85f53e1b-50d1-4249-ba44-5b2e5982ae36-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.771685 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.771703 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.771748 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85f53e1b-50d1-4249-ba44-5b2e5982ae36-config-data\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.771785 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/85f53e1b-50d1-4249-ba44-5b2e5982ae36-pod-info\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.771851 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.771870 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/85f53e1b-50d1-4249-ba44-5b2e5982ae36-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.772105 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") device mount path 
\"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.773081 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9z8nk"] Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.774003 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.774484 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/85f53e1b-50d1-4249-ba44-5b2e5982ae36-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.774729 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/85f53e1b-50d1-4249-ba44-5b2e5982ae36-server-conf\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.774848 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85f53e1b-50d1-4249-ba44-5b2e5982ae36-config-data\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.775478 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.780355 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/85f53e1b-50d1-4249-ba44-5b2e5982ae36-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.781886 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.781885 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.782852 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/85f53e1b-50d1-4249-ba44-5b2e5982ae36-pod-info\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.830178 4702 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2jfm9\" (UniqueName: \"kubernetes.io/projected/85f53e1b-50d1-4249-ba44-5b2e5982ae36-kube-api-access-2jfm9\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.835033 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " pod="openstack/rabbitmq-server-0" Dec 03 11:26:38 crc kubenswrapper[4702]: I1203 11:26:38.926785 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.105398 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.107799 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.112770 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.112834 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.112834 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.112978 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.112984 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.113073 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mq44d" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.113153 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.125800 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.151165 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t2plx"] Dec 03 11:26:39 crc kubenswrapper[4702]: W1203 11:26:39.182111 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda1dab47_7e05_48f7_84dc_7747cfa50aa8.slice/crio-89462ce7db5dd805303fab58c2356a286025883a87b1ca94f4f32e186bdc42d0 WatchSource:0}: Error finding container 89462ce7db5dd805303fab58c2356a286025883a87b1ca94f4f32e186bdc42d0: Status 404 returned error can't find the container with id 89462ce7db5dd805303fab58c2356a286025883a87b1ca94f4f32e186bdc42d0 Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.187512 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.187620 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.187662 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.187724 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.187858 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.188689 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.188898 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.189288 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m265r\" (UniqueName: \"kubernetes.io/projected/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-kube-api-access-m265r\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.189331 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.189563 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.189846 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.291513 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.292105 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.292189 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.292232 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.292318 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m265r\" (UniqueName: \"kubernetes.io/projected/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-kube-api-access-m265r\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.292346 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.292395 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.292436 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.292502 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.292553 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.292585 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.293378 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.293424 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.294327 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.295386 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.295818 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.299902 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.301149 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.302371 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.302892 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.311842 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.340111 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m265r\" (UniqueName: \"kubernetes.io/projected/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-kube-api-access-m265r\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.368639 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.524240 4702 util.go:30] "No sandbox for pod can be found. 
Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.802875 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9z8nk" event={"ID":"68d6c68f-856a-48b5-8ca3-e1165b430d65","Type":"ContainerStarted","Data":"ecfb4b6a7b0852b29e7110061d2061f5b53711f6bab565a025802becbf6ec20d"}
Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.822647 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-t2plx" event={"ID":"da1dab47-7e05-48f7-84dc-7747cfa50aa8","Type":"ContainerStarted","Data":"89462ce7db5dd805303fab58c2356a286025883a87b1ca94f4f32e186bdc42d0"}
Dec 03 11:26:39 crc kubenswrapper[4702]: I1203 11:26:39.885729 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 03 11:26:39 crc kubenswrapper[4702]: W1203 11:26:39.995596 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85f53e1b_50d1_4249_ba44_5b2e5982ae36.slice/crio-6b236d9838f722c8db01c3541bac607ddcf4d6120c77c7a90e3c770958beabef WatchSource:0}: Error finding container 6b236d9838f722c8db01c3541bac607ddcf4d6120c77c7a90e3c770958beabef: Status 404 returned error can't find the container with id 6b236d9838f722c8db01c3541bac607ddcf4d6120c77c7a90e3c770958beabef
Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.258550 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.353215 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.441896 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.450354 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.450656 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.452200 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-m78ps"
Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.452347 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.467384 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.554543 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.654992 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92266ac3-f0a6-4e68-9e88-9aa2900e1fe3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0"
Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.655125 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/92266ac3-f0a6-4e68-9e88-9aa2900e1fe3-kolla-config\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0"
Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.655198 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0"
Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.655280 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/92266ac3-f0a6-4e68-9e88-9aa2900e1fe3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0"
Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.655334 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92266ac3-f0a6-4e68-9e88-9aa2900e1fe3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0"
Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.655435 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/92266ac3-f0a6-4e68-9e88-9aa2900e1fe3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0"
Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.655488 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/92266ac3-f0a6-4e68-9e88-9aa2900e1fe3-config-data-default\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0"
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/92266ac3-f0a6-4e68-9e88-9aa2900e1fe3-config-data-default\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0" Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.655531 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6cls\" (UniqueName: \"kubernetes.io/projected/92266ac3-f0a6-4e68-9e88-9aa2900e1fe3-kube-api-access-t6cls\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0" Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.758772 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92266ac3-f0a6-4e68-9e88-9aa2900e1fe3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0" Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.758856 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/92266ac3-f0a6-4e68-9e88-9aa2900e1fe3-kolla-config\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0" Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.758924 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0" Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.758972 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/92266ac3-f0a6-4e68-9e88-9aa2900e1fe3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0" Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.759006 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92266ac3-f0a6-4e68-9e88-9aa2900e1fe3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0" Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.759059 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/92266ac3-f0a6-4e68-9e88-9aa2900e1fe3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0" Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.759099 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/92266ac3-f0a6-4e68-9e88-9aa2900e1fe3-config-data-default\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0" Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.759135 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6cls\" (UniqueName: \"kubernetes.io/projected/92266ac3-f0a6-4e68-9e88-9aa2900e1fe3-kube-api-access-t6cls\") pod \"openstack-galera-0\" (UID: 
\"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0" Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.763544 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/92266ac3-f0a6-4e68-9e88-9aa2900e1fe3-config-data-default\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0" Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.763961 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-galera-0" Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.766270 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/92266ac3-f0a6-4e68-9e88-9aa2900e1fe3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0" Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.768077 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/92266ac3-f0a6-4e68-9e88-9aa2900e1fe3-kolla-config\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0" Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.768707 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/92266ac3-f0a6-4e68-9e88-9aa2900e1fe3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0" Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.772276 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92266ac3-f0a6-4e68-9e88-9aa2900e1fe3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0" Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.779938 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92266ac3-f0a6-4e68-9e88-9aa2900e1fe3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0" Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.801312 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0" Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.802522 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6cls\" (UniqueName: \"kubernetes.io/projected/92266ac3-f0a6-4e68-9e88-9aa2900e1fe3-kube-api-access-t6cls\") pod \"openstack-galera-0\" (UID: \"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3\") " pod="openstack/openstack-galera-0" Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.891025 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c","Type":"ContainerStarted","Data":"ed0628de87c8aa18d782d459a2c06c2ae5cce05314fbcb1a857c3cacea0b55dd"} Dec 03 11:26:40 crc kubenswrapper[4702]: I1203 11:26:40.894162 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"85f53e1b-50d1-4249-ba44-5b2e5982ae36","Type":"ContainerStarted","Data":"6b236d9838f722c8db01c3541bac607ddcf4d6120c77c7a90e3c770958beabef"} Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.223316 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.377745 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.383850 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.394636 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.395111 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.396181 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.409568 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-nvzj9" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.453940 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.520874 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rddh\" (UniqueName: \"kubernetes.io/projected/c91e1dc8-ef80-407f-ac34-4c9ab29026f7-kube-api-access-4rddh\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.520920 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c91e1dc8-ef80-407f-ac34-4c9ab29026f7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.521886 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c91e1dc8-ef80-407f-ac34-4c9ab29026f7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.521961 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c91e1dc8-ef80-407f-ac34-4c9ab29026f7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.522075 4702 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.522123 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c91e1dc8-ef80-407f-ac34-4c9ab29026f7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.522524 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c91e1dc8-ef80-407f-ac34-4c9ab29026f7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.522577 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c91e1dc8-ef80-407f-ac34-4c9ab29026f7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.626121 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rddh\" (UniqueName: \"kubernetes.io/projected/c91e1dc8-ef80-407f-ac34-4c9ab29026f7-kube-api-access-4rddh\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.626223 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c91e1dc8-ef80-407f-ac34-4c9ab29026f7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.626305 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c91e1dc8-ef80-407f-ac34-4c9ab29026f7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.626336 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c91e1dc8-ef80-407f-ac34-4c9ab29026f7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.626379 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.626410 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c91e1dc8-ef80-407f-ac34-4c9ab29026f7-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.626468 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c91e1dc8-ef80-407f-ac34-4c9ab29026f7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.626500 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c91e1dc8-ef80-407f-ac34-4c9ab29026f7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.628177 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c91e1dc8-ef80-407f-ac34-4c9ab29026f7-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.630312 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/c91e1dc8-ef80-407f-ac34-4c9ab29026f7-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.630954 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.633870 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c91e1dc8-ef80-407f-ac34-4c9ab29026f7-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.635406 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/c91e1dc8-ef80-407f-ac34-4c9ab29026f7-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.636353 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c91e1dc8-ef80-407f-ac34-4c9ab29026f7-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.657046 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/c91e1dc8-ef80-407f-ac34-4c9ab29026f7-galera-tls-certs\") 
pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.668542 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rddh\" (UniqueName: \"kubernetes.io/projected/c91e1dc8-ef80-407f-ac34-4c9ab29026f7-kube-api-access-4rddh\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.689516 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"c91e1dc8-ef80-407f-ac34-4c9ab29026f7\") " pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.707861 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.709421 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.716475 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.716702 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-mfz4g" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.718935 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.728694 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.780275 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.833643 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ea851b4-124d-4472-9fd0-7b584da44ecc-config-data\") pod \"memcached-0\" (UID: \"8ea851b4-124d-4472-9fd0-7b584da44ecc\") " pod="openstack/memcached-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.834354 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ea851b4-124d-4472-9fd0-7b584da44ecc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8ea851b4-124d-4472-9fd0-7b584da44ecc\") " pod="openstack/memcached-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.834422 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea851b4-124d-4472-9fd0-7b584da44ecc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8ea851b4-124d-4472-9fd0-7b584da44ecc\") " pod="openstack/memcached-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.834465 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmc75\" (UniqueName: \"kubernetes.io/projected/8ea851b4-124d-4472-9fd0-7b584da44ecc-kube-api-access-lmc75\") pod \"memcached-0\" (UID: \"8ea851b4-124d-4472-9fd0-7b584da44ecc\") " pod="openstack/memcached-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.834564 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8ea851b4-124d-4472-9fd0-7b584da44ecc-kolla-config\") pod \"memcached-0\" (UID: \"8ea851b4-124d-4472-9fd0-7b584da44ecc\") " pod="openstack/memcached-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.991649 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ea851b4-124d-4472-9fd0-7b584da44ecc-config-data\") pod \"memcached-0\" (UID: \"8ea851b4-124d-4472-9fd0-7b584da44ecc\") " pod="openstack/memcached-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.991932 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ea851b4-124d-4472-9fd0-7b584da44ecc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8ea851b4-124d-4472-9fd0-7b584da44ecc\") " pod="openstack/memcached-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.991997 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea851b4-124d-4472-9fd0-7b584da44ecc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8ea851b4-124d-4472-9fd0-7b584da44ecc\") " pod="openstack/memcached-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.992057 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmc75\" (UniqueName: \"kubernetes.io/projected/8ea851b4-124d-4472-9fd0-7b584da44ecc-kube-api-access-lmc75\") pod \"memcached-0\" (UID: \"8ea851b4-124d-4472-9fd0-7b584da44ecc\") " pod="openstack/memcached-0" Dec 03 11:26:41 crc kubenswrapper[4702]: I1203 11:26:41.992213 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/8ea851b4-124d-4472-9fd0-7b584da44ecc-kolla-config\") pod \"memcached-0\" (UID: \"8ea851b4-124d-4472-9fd0-7b584da44ecc\") " pod="openstack/memcached-0" Dec 03 11:26:42 crc kubenswrapper[4702]: I1203 11:26:41.994272 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/8ea851b4-124d-4472-9fd0-7b584da44ecc-kolla-config\") pod \"memcached-0\" (UID: \"8ea851b4-124d-4472-9fd0-7b584da44ecc\") " pod="openstack/memcached-0" Dec 03 11:26:42 crc kubenswrapper[4702]: I1203 11:26:41.996674 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ea851b4-124d-4472-9fd0-7b584da44ecc-config-data\") pod \"memcached-0\" (UID: \"8ea851b4-124d-4472-9fd0-7b584da44ecc\") " pod="openstack/memcached-0" Dec 03 11:26:42 crc kubenswrapper[4702]: I1203 11:26:42.007903 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ea851b4-124d-4472-9fd0-7b584da44ecc-memcached-tls-certs\") pod \"memcached-0\" (UID: \"8ea851b4-124d-4472-9fd0-7b584da44ecc\") " pod="openstack/memcached-0" Dec 03 11:26:42 crc kubenswrapper[4702]: I1203 11:26:42.030568 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ea851b4-124d-4472-9fd0-7b584da44ecc-combined-ca-bundle\") pod \"memcached-0\" (UID: \"8ea851b4-124d-4472-9fd0-7b584da44ecc\") " pod="openstack/memcached-0" Dec 03 11:26:42 crc kubenswrapper[4702]: I1203 11:26:42.056286 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmc75\" (UniqueName: \"kubernetes.io/projected/8ea851b4-124d-4472-9fd0-7b584da44ecc-kube-api-access-lmc75\") pod \"memcached-0\" (UID: \"8ea851b4-124d-4472-9fd0-7b584da44ecc\") " pod="openstack/memcached-0" Dec 03 11:26:42 crc kubenswrapper[4702]: I1203 11:26:42.070790 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 03 11:26:42 crc kubenswrapper[4702]: I1203 11:26:42.366912 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 11:26:42 crc kubenswrapper[4702]: W1203 11:26:42.481252 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92266ac3_f0a6_4e68_9e88_9aa2900e1fe3.slice/crio-a1967941e067f1bcb64cb278f8c36e68b29c9e5e8ba8d8280fa6a6d6c7226cc8 WatchSource:0}: Error finding container a1967941e067f1bcb64cb278f8c36e68b29c9e5e8ba8d8280fa6a6d6c7226cc8: Status 404 returned error can't find the container with id a1967941e067f1bcb64cb278f8c36e68b29c9e5e8ba8d8280fa6a6d6c7226cc8 Dec 03 11:26:42 crc kubenswrapper[4702]: I1203 11:26:42.762090 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 11:26:42 crc kubenswrapper[4702]: I1203 11:26:42.981482 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 11:26:43 crc kubenswrapper[4702]: I1203 11:26:43.032700 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c91e1dc8-ef80-407f-ac34-4c9ab29026f7","Type":"ContainerStarted","Data":"746fb8b7ffb3b2129ec7ef16b145d96ac6a2cb86aac4cc168fbab2e28727f98c"} Dec 03 11:26:43 crc kubenswrapper[4702]: I1203 11:26:43.036939 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3","Type":"ContainerStarted","Data":"a1967941e067f1bcb64cb278f8c36e68b29c9e5e8ba8d8280fa6a6d6c7226cc8"} Dec 03 11:26:43 crc kubenswrapper[4702]: W1203 11:26:43.122676 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ea851b4_124d_4472_9fd0_7b584da44ecc.slice/crio-1642e180b14bcafc2cbf8ab25f5d0552ea43901505350d9168df7863044ecbec WatchSource:0}: Error finding container 1642e180b14bcafc2cbf8ab25f5d0552ea43901505350d9168df7863044ecbec: Status 404 returned error can't find the container with id 1642e180b14bcafc2cbf8ab25f5d0552ea43901505350d9168df7863044ecbec Dec 03 11:26:43 crc kubenswrapper[4702]: I1203 11:26:43.213136 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8j722" Dec 03 11:26:43 crc kubenswrapper[4702]: I1203 11:26:43.550290 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8j722" Dec 03 11:26:43 crc kubenswrapper[4702]: I1203 11:26:43.657800 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8j722"] Dec 03 11:26:44 crc kubenswrapper[4702]: I1203 11:26:44.168364 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 11:26:44 crc kubenswrapper[4702]: I1203 11:26:44.170093 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 11:26:44 crc kubenswrapper[4702]: I1203 11:26:44.182643 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-s8w4m" Dec 03 11:26:44 crc kubenswrapper[4702]: I1203 11:26:44.218236 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 11:26:44 crc kubenswrapper[4702]: I1203 11:26:44.241983 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"8ea851b4-124d-4472-9fd0-7b584da44ecc","Type":"ContainerStarted","Data":"1642e180b14bcafc2cbf8ab25f5d0552ea43901505350d9168df7863044ecbec"} Dec 03 11:26:44 crc kubenswrapper[4702]: I1203 11:26:44.243309 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h9k2\" (UniqueName: \"kubernetes.io/projected/d7e1497f-e194-429b-add6-ee8e886fed8b-kube-api-access-4h9k2\") pod \"kube-state-metrics-0\" (UID: \"d7e1497f-e194-429b-add6-ee8e886fed8b\") " pod="openstack/kube-state-metrics-0" Dec 03 11:26:44 crc kubenswrapper[4702]: I1203 11:26:44.345671 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h9k2\" (UniqueName: \"kubernetes.io/projected/d7e1497f-e194-429b-add6-ee8e886fed8b-kube-api-access-4h9k2\") pod \"kube-state-metrics-0\" (UID: \"d7e1497f-e194-429b-add6-ee8e886fed8b\") " pod="openstack/kube-state-metrics-0" Dec 03 11:26:44 crc kubenswrapper[4702]: I1203 11:26:44.508392 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h9k2\" (UniqueName: \"kubernetes.io/projected/d7e1497f-e194-429b-add6-ee8e886fed8b-kube-api-access-4h9k2\") pod \"kube-state-metrics-0\" (UID: \"d7e1497f-e194-429b-add6-ee8e886fed8b\") " pod="openstack/kube-state-metrics-0" Dec 03 11:26:44 crc kubenswrapper[4702]: I1203 11:26:44.515467 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.184602 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-7x6j5"] Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.208703 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-7x6j5" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.227948 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-86zg7" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.228166 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.242162 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-7x6j5"] Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.269794 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4201f396-8ff3-4b7b-82d2-f26cc129b3f9-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-7x6j5\" (UID: \"4201f396-8ff3-4b7b-82d2-f26cc129b3f9\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-7x6j5" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.291340 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qprlh\" (UniqueName: \"kubernetes.io/projected/4201f396-8ff3-4b7b-82d2-f26cc129b3f9-kube-api-access-qprlh\") pod \"observability-ui-dashboards-7d5fb4cbfb-7x6j5\" (UID: \"4201f396-8ff3-4b7b-82d2-f26cc129b3f9\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-7x6j5" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.330023 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8j722" podUID="51a412e7-4275-4445-a558-0db52234278e" containerName="registry-server" containerID="cri-o://783a44c081b159c829b5a00a3a1612e6eabeac3641bc7cc5b2bbe2d816d47350" gracePeriod=2 Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.393452 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qprlh\" (UniqueName: \"kubernetes.io/projected/4201f396-8ff3-4b7b-82d2-f26cc129b3f9-kube-api-access-qprlh\") pod \"observability-ui-dashboards-7d5fb4cbfb-7x6j5\" (UID: \"4201f396-8ff3-4b7b-82d2-f26cc129b3f9\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-7x6j5" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.393644 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4201f396-8ff3-4b7b-82d2-f26cc129b3f9-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-7x6j5\" (UID: \"4201f396-8ff3-4b7b-82d2-f26cc129b3f9\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-7x6j5" Dec 03 11:26:45 crc kubenswrapper[4702]: E1203 11:26:45.393840 4702 secret.go:188] Couldn't get secret openshift-operators/observability-ui-dashboards: secret "observability-ui-dashboards" not found Dec 03 11:26:45 crc kubenswrapper[4702]: E1203 11:26:45.393928 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4201f396-8ff3-4b7b-82d2-f26cc129b3f9-serving-cert podName:4201f396-8ff3-4b7b-82d2-f26cc129b3f9 nodeName:}" failed. No retries permitted until 2025-12-03 11:26:45.893899449 +0000 UTC m=+1389.729827913 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/4201f396-8ff3-4b7b-82d2-f26cc129b3f9-serving-cert") pod "observability-ui-dashboards-7d5fb4cbfb-7x6j5" (UID: "4201f396-8ff3-4b7b-82d2-f26cc129b3f9") : secret "observability-ui-dashboards" not found Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.466705 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qprlh\" (UniqueName: \"kubernetes.io/projected/4201f396-8ff3-4b7b-82d2-f26cc129b3f9-kube-api-access-qprlh\") pod \"observability-ui-dashboards-7d5fb4cbfb-7x6j5\" (UID: \"4201f396-8ff3-4b7b-82d2-f26cc129b3f9\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-7x6j5" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.585919 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.614246 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.619290 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.619458 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.623490 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.627908 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.628143 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.628311 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-hlcn5" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.653137 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.709006 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.709107 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prkgf\" (UniqueName: \"kubernetes.io/projected/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-kube-api-access-prkgf\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.709187 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: 
\"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.709213 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.709242 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.709264 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.709296 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.709331 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.740201 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.753493 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-74bd8cbfc6-krjzs"] Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.755296 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.791713 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74bd8cbfc6-krjzs"] Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.830392 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.830445 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.830486 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0248a046-0fe5-47b8-a644-50dfc9a20a75-service-ca\") pod \"console-74bd8cbfc6-krjzs\" (UID: \"0248a046-0fe5-47b8-a644-50dfc9a20a75\") " pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.830535 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.830576 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.830624 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.830645 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0248a046-0fe5-47b8-a644-50dfc9a20a75-console-oauth-config\") pod \"console-74bd8cbfc6-krjzs\" (UID: \"0248a046-0fe5-47b8-a644-50dfc9a20a75\") " pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.830682 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0248a046-0fe5-47b8-a644-50dfc9a20a75-trusted-ca-bundle\") pod \"console-74bd8cbfc6-krjzs\" (UID: \"0248a046-0fe5-47b8-a644-50dfc9a20a75\") " pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.830706 4702 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.830743 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0248a046-0fe5-47b8-a644-50dfc9a20a75-console-serving-cert\") pod \"console-74bd8cbfc6-krjzs\" (UID: \"0248a046-0fe5-47b8-a644-50dfc9a20a75\") " pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.830808 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0248a046-0fe5-47b8-a644-50dfc9a20a75-oauth-serving-cert\") pod \"console-74bd8cbfc6-krjzs\" (UID: \"0248a046-0fe5-47b8-a644-50dfc9a20a75\") " pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.830882 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0248a046-0fe5-47b8-a644-50dfc9a20a75-console-config\") pod \"console-74bd8cbfc6-krjzs\" (UID: \"0248a046-0fe5-47b8-a644-50dfc9a20a75\") " pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.830941 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.831006 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4sgp\" (UniqueName: \"kubernetes.io/projected/0248a046-0fe5-47b8-a644-50dfc9a20a75-kube-api-access-c4sgp\") pod \"console-74bd8cbfc6-krjzs\" (UID: \"0248a046-0fe5-47b8-a644-50dfc9a20a75\") " pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.831046 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prkgf\" (UniqueName: \"kubernetes.io/projected/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-kube-api-access-prkgf\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.831844 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: E1203 11:26:45.864219 4702 configmap.go:193] Couldn't get configMap openstack/prometheus-metric-storage-rulefiles-0: configmap "prometheus-metric-storage-rulefiles-0" not found Dec 03 11:26:45 crc kubenswrapper[4702]: E1203 11:26:45.864554 4702 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-prometheus-metric-storage-rulefiles-0 podName:cf7bd44e-b3b4-4812-b7ee-512fb948d8f9 nodeName:}" failed. No retries permitted until 2025-12-03 11:26:46.36453199 +0000 UTC m=+1390.200460454 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-metric-storage-rulefiles-0" (UniqueName: "kubernetes.io/configmap/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-prometheus-metric-storage-rulefiles-0") pod "prometheus-metric-storage-0" (UID: "cf7bd44e-b3b4-4812-b7ee-512fb948d8f9") : configmap "prometheus-metric-storage-rulefiles-0" not found Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.864887 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.874378 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.875259 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.886985 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prkgf\" (UniqueName: \"kubernetes.io/projected/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-kube-api-access-prkgf\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.892732 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.895459 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: W1203 11:26:45.908642 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7e1497f_e194_429b_add6_ee8e886fed8b.slice/crio-f93ae10486e3658f980820dffa535cc38d2156232b852825a26939702f9ce0dc WatchSource:0}: Error finding container f93ae10486e3658f980820dffa535cc38d2156232b852825a26939702f9ce0dc: Status 404 returned error can't find the container with id f93ae10486e3658f980820dffa535cc38d2156232b852825a26939702f9ce0dc Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.916427 4702 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.944465 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0248a046-0fe5-47b8-a644-50dfc9a20a75-console-config\") pod \"console-74bd8cbfc6-krjzs\" (UID: \"0248a046-0fe5-47b8-a644-50dfc9a20a75\") " pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.944565 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4sgp\" (UniqueName: \"kubernetes.io/projected/0248a046-0fe5-47b8-a644-50dfc9a20a75-kube-api-access-c4sgp\") pod \"console-74bd8cbfc6-krjzs\" (UID: \"0248a046-0fe5-47b8-a644-50dfc9a20a75\") " pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.944613 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4201f396-8ff3-4b7b-82d2-f26cc129b3f9-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-7x6j5\" (UID: \"4201f396-8ff3-4b7b-82d2-f26cc129b3f9\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-7x6j5" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.944648 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0248a046-0fe5-47b8-a644-50dfc9a20a75-service-ca\") pod \"console-74bd8cbfc6-krjzs\" (UID: \"0248a046-0fe5-47b8-a644-50dfc9a20a75\") " pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.944697 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0248a046-0fe5-47b8-a644-50dfc9a20a75-console-oauth-config\") pod \"console-74bd8cbfc6-krjzs\" (UID: \"0248a046-0fe5-47b8-a644-50dfc9a20a75\") " pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.944731 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0248a046-0fe5-47b8-a644-50dfc9a20a75-trusted-ca-bundle\") pod \"console-74bd8cbfc6-krjzs\" (UID: \"0248a046-0fe5-47b8-a644-50dfc9a20a75\") " pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.944768 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0248a046-0fe5-47b8-a644-50dfc9a20a75-console-serving-cert\") pod \"console-74bd8cbfc6-krjzs\" (UID: \"0248a046-0fe5-47b8-a644-50dfc9a20a75\") " pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.944803 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0248a046-0fe5-47b8-a644-50dfc9a20a75-oauth-serving-cert\") pod \"console-74bd8cbfc6-krjzs\" (UID: \"0248a046-0fe5-47b8-a644-50dfc9a20a75\") " pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.946368 4702 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0248a046-0fe5-47b8-a644-50dfc9a20a75-service-ca\") pod \"console-74bd8cbfc6-krjzs\" (UID: \"0248a046-0fe5-47b8-a644-50dfc9a20a75\") " pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.947600 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0248a046-0fe5-47b8-a644-50dfc9a20a75-oauth-serving-cert\") pod \"console-74bd8cbfc6-krjzs\" (UID: \"0248a046-0fe5-47b8-a644-50dfc9a20a75\") " pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.948564 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0248a046-0fe5-47b8-a644-50dfc9a20a75-console-config\") pod \"console-74bd8cbfc6-krjzs\" (UID: \"0248a046-0fe5-47b8-a644-50dfc9a20a75\") " pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.951580 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0248a046-0fe5-47b8-a644-50dfc9a20a75-trusted-ca-bundle\") pod \"console-74bd8cbfc6-krjzs\" (UID: \"0248a046-0fe5-47b8-a644-50dfc9a20a75\") " pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.960100 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4201f396-8ff3-4b7b-82d2-f26cc129b3f9-serving-cert\") pod \"observability-ui-dashboards-7d5fb4cbfb-7x6j5\" (UID: \"4201f396-8ff3-4b7b-82d2-f26cc129b3f9\") " pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-7x6j5" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.961015 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0248a046-0fe5-47b8-a644-50dfc9a20a75-console-oauth-config\") pod \"console-74bd8cbfc6-krjzs\" (UID: \"0248a046-0fe5-47b8-a644-50dfc9a20a75\") " pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.969060 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0248a046-0fe5-47b8-a644-50dfc9a20a75-console-serving-cert\") pod \"console-74bd8cbfc6-krjzs\" (UID: \"0248a046-0fe5-47b8-a644-50dfc9a20a75\") " pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:45 crc kubenswrapper[4702]: I1203 11:26:45.980949 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4sgp\" (UniqueName: \"kubernetes.io/projected/0248a046-0fe5-47b8-a644-50dfc9a20a75-kube-api-access-c4sgp\") pod \"console-74bd8cbfc6-krjzs\" (UID: \"0248a046-0fe5-47b8-a644-50dfc9a20a75\") " pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:46 crc kubenswrapper[4702]: I1203 11:26:46.191031 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-7x6j5" Dec 03 11:26:46 crc kubenswrapper[4702]: I1203 11:26:46.268351 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:46 crc kubenswrapper[4702]: I1203 11:26:46.393065 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:46 crc kubenswrapper[4702]: I1203 11:26:46.394454 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:46 crc kubenswrapper[4702]: I1203 11:26:46.431039 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d7e1497f-e194-429b-add6-ee8e886fed8b","Type":"ContainerStarted","Data":"f93ae10486e3658f980820dffa535cc38d2156232b852825a26939702f9ce0dc"} Dec 03 11:26:46 crc kubenswrapper[4702]: I1203 11:26:46.566201 4702 generic.go:334] "Generic (PLEG): container finished" podID="51a412e7-4275-4445-a558-0db52234278e" containerID="783a44c081b159c829b5a00a3a1612e6eabeac3641bc7cc5b2bbe2d816d47350" exitCode=0 Dec 03 11:26:46 crc kubenswrapper[4702]: I1203 11:26:46.566274 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8j722" event={"ID":"51a412e7-4275-4445-a558-0db52234278e","Type":"ContainerDied","Data":"783a44c081b159c829b5a00a3a1612e6eabeac3641bc7cc5b2bbe2d816d47350"} Dec 03 11:26:46 crc kubenswrapper[4702]: I1203 11:26:46.620463 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.369244 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.372790 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.386472 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.387090 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.387337 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-5rw2k" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.388128 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.388286 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.393673 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.468370 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/406550ad-e61e-4af5-a42e-4e1437958f90-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.468430 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/406550ad-e61e-4af5-a42e-4e1437958f90-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.468465 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/406550ad-e61e-4af5-a42e-4e1437958f90-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.468540 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/406550ad-e61e-4af5-a42e-4e1437958f90-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.468571 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/406550ad-e61e-4af5-a42e-4e1437958f90-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.468605 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.468646 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/406550ad-e61e-4af5-a42e-4e1437958f90-config\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.468720 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68pdd\" (UniqueName: \"kubernetes.io/projected/406550ad-e61e-4af5-a42e-4e1437958f90-kube-api-access-68pdd\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.571900 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/406550ad-e61e-4af5-a42e-4e1437958f90-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.571978 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/406550ad-e61e-4af5-a42e-4e1437958f90-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.572015 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/406550ad-e61e-4af5-a42e-4e1437958f90-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.572066 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/406550ad-e61e-4af5-a42e-4e1437958f90-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.572101 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/406550ad-e61e-4af5-a42e-4e1437958f90-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.572139 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.572179 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/406550ad-e61e-4af5-a42e-4e1437958f90-config\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.572250 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68pdd\" (UniqueName: \"kubernetes.io/projected/406550ad-e61e-4af5-a42e-4e1437958f90-kube-api-access-68pdd\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 
11:26:47.573310 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/406550ad-e61e-4af5-a42e-4e1437958f90-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.574449 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/406550ad-e61e-4af5-a42e-4e1437958f90-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.670047 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.674011 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/406550ad-e61e-4af5-a42e-4e1437958f90-config\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.679096 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/406550ad-e61e-4af5-a42e-4e1437958f90-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.681661 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/406550ad-e61e-4af5-a42e-4e1437958f90-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.705501 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/406550ad-e61e-4af5-a42e-4e1437958f90-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.736671 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.750163 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74bd8cbfc6-krjzs"] Dec 03 11:26:47 crc kubenswrapper[4702]: I1203 11:26:47.762551 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68pdd\" (UniqueName: \"kubernetes.io/projected/406550ad-e61e-4af5-a42e-4e1437958f90-kube-api-access-68pdd\") pod \"ovsdbserver-nb-0\" (UID: \"406550ad-e61e-4af5-a42e-4e1437958f90\") " pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:48 crc kubenswrapper[4702]: I1203 11:26:48.005270 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7d5fb4cbfb-7x6j5"] Dec 03 
11:26:48 crc kubenswrapper[4702]: I1203 11:26:48.029902 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 11:26:48 crc kubenswrapper[4702]: I1203 11:26:48.278939 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 11:26:48 crc kubenswrapper[4702]: I1203 11:26:48.417886 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8j722" Dec 03 11:26:48 crc kubenswrapper[4702]: I1203 11:26:48.423328 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a412e7-4275-4445-a558-0db52234278e-catalog-content\") pod \"51a412e7-4275-4445-a558-0db52234278e\" (UID: \"51a412e7-4275-4445-a558-0db52234278e\") " Dec 03 11:26:48 crc kubenswrapper[4702]: I1203 11:26:48.423463 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv77g\" (UniqueName: \"kubernetes.io/projected/51a412e7-4275-4445-a558-0db52234278e-kube-api-access-qv77g\") pod \"51a412e7-4275-4445-a558-0db52234278e\" (UID: \"51a412e7-4275-4445-a558-0db52234278e\") " Dec 03 11:26:48 crc kubenswrapper[4702]: I1203 11:26:48.423586 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a412e7-4275-4445-a558-0db52234278e-utilities\") pod \"51a412e7-4275-4445-a558-0db52234278e\" (UID: \"51a412e7-4275-4445-a558-0db52234278e\") " Dec 03 11:26:48 crc kubenswrapper[4702]: I1203 11:26:48.425037 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51a412e7-4275-4445-a558-0db52234278e-utilities" (OuterVolumeSpecName: "utilities") pod "51a412e7-4275-4445-a558-0db52234278e" (UID: "51a412e7-4275-4445-a558-0db52234278e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:26:48 crc kubenswrapper[4702]: I1203 11:26:48.431730 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51a412e7-4275-4445-a558-0db52234278e-kube-api-access-qv77g" (OuterVolumeSpecName: "kube-api-access-qv77g") pod "51a412e7-4275-4445-a558-0db52234278e" (UID: "51a412e7-4275-4445-a558-0db52234278e"). InnerVolumeSpecName "kube-api-access-qv77g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:26:48 crc kubenswrapper[4702]: I1203 11:26:48.525486 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv77g\" (UniqueName: \"kubernetes.io/projected/51a412e7-4275-4445-a558-0db52234278e-kube-api-access-qv77g\") on node \"crc\" DevicePath \"\"" Dec 03 11:26:48 crc kubenswrapper[4702]: I1203 11:26:48.525995 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a412e7-4275-4445-a558-0db52234278e-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:26:48 crc kubenswrapper[4702]: I1203 11:26:48.579997 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51a412e7-4275-4445-a558-0db52234278e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51a412e7-4275-4445-a558-0db52234278e" (UID: "51a412e7-4275-4445-a558-0db52234278e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:26:48 crc kubenswrapper[4702]: I1203 11:26:48.628634 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a412e7-4275-4445-a558-0db52234278e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:26:49 crc kubenswrapper[4702]: I1203 11:26:49.029796 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8j722" Dec 03 11:26:49 crc kubenswrapper[4702]: I1203 11:26:49.051804 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74bd8cbfc6-krjzs" event={"ID":"0248a046-0fe5-47b8-a644-50dfc9a20a75","Type":"ContainerStarted","Data":"a5e78e1ffc66717a2a2b04d5fd70bbd1895b81f70241a87acf4cb7299707996a"} Dec 03 11:26:49 crc kubenswrapper[4702]: I1203 11:26:49.051872 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9","Type":"ContainerStarted","Data":"851d0ed2dccf3a3c4e53f4003c6ba4bbfc6d4a81b381ad8923629bff8358ed09"} Dec 03 11:26:49 crc kubenswrapper[4702]: I1203 11:26:49.051891 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8j722" event={"ID":"51a412e7-4275-4445-a558-0db52234278e","Type":"ContainerDied","Data":"6b7f8c7ca8d806f0c01d886eac462815e4e5deb8a42b60a15792aa4225041d7c"} Dec 03 11:26:49 crc kubenswrapper[4702]: I1203 11:26:49.051913 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-7x6j5" event={"ID":"4201f396-8ff3-4b7b-82d2-f26cc129b3f9","Type":"ContainerStarted","Data":"c02664d6e9ed9cb4eb7736356fcc17df22e60588c7265dbb6394b1997afed43c"} Dec 03 11:26:49 crc kubenswrapper[4702]: I1203 11:26:49.051972 4702 scope.go:117] "RemoveContainer" containerID="783a44c081b159c829b5a00a3a1612e6eabeac3641bc7cc5b2bbe2d816d47350" Dec 03 11:26:49 crc kubenswrapper[4702]: I1203 11:26:49.260539 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8j722"] Dec 03 11:26:49 crc kubenswrapper[4702]: I1203 11:26:49.284727 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8j722"] Dec 03 11:26:49 crc kubenswrapper[4702]: I1203 11:26:49.470709 4702 scope.go:117] "RemoveContainer" containerID="0c8b9c373e7d8e582eea14528ba03533bfeff2d3f2f2967ced0a29933de5332d" Dec 03 11:26:49 crc kubenswrapper[4702]: I1203 11:26:49.730150 4702 scope.go:117] "RemoveContainer" containerID="5d665785b3bbd192b66640e6a1a08b3bb251e693022769edc4b78f7b209dca89" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.426218 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-5v7lw"] Dec 03 11:26:50 crc kubenswrapper[4702]: E1203 11:26:50.427627 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a412e7-4275-4445-a558-0db52234278e" containerName="extract-utilities" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.427646 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a412e7-4275-4445-a558-0db52234278e" containerName="extract-utilities" Dec 03 11:26:50 crc kubenswrapper[4702]: E1203 11:26:50.427675 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a412e7-4275-4445-a558-0db52234278e" containerName="extract-content" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.427681 4702 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="51a412e7-4275-4445-a558-0db52234278e" containerName="extract-content" Dec 03 11:26:50 crc kubenswrapper[4702]: E1203 11:26:50.427695 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a412e7-4275-4445-a558-0db52234278e" containerName="registry-server" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.427701 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a412e7-4275-4445-a558-0db52234278e" containerName="registry-server" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.427917 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="51a412e7-4275-4445-a558-0db52234278e" containerName="registry-server" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.428782 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.437429 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.437443 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-ppjxl" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.437443 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.446293 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5v7lw"] Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.474448 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-zqdp5"] Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.477639 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-zqdp5" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.530076 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-zqdp5"] Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.575577 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e77b1727-1835-42aa-a4f6-d902ff001d20-combined-ca-bundle\") pod \"ovn-controller-5v7lw\" (UID: \"e77b1727-1835-42aa-a4f6-d902ff001d20\") " pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.575658 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e77b1727-1835-42aa-a4f6-d902ff001d20-var-run\") pod \"ovn-controller-5v7lw\" (UID: \"e77b1727-1835-42aa-a4f6-d902ff001d20\") " pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.575709 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e77b1727-1835-42aa-a4f6-d902ff001d20-ovn-controller-tls-certs\") pod \"ovn-controller-5v7lw\" (UID: \"e77b1727-1835-42aa-a4f6-d902ff001d20\") " pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.575779 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/163cc47f-d241-4b3e-bf62-07f49047de5d-var-lib\") pod \"ovn-controller-ovs-zqdp5\" (UID: \"163cc47f-d241-4b3e-bf62-07f49047de5d\") " pod="openstack/ovn-controller-ovs-zqdp5" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.575865 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/163cc47f-d241-4b3e-bf62-07f49047de5d-var-log\") pod \"ovn-controller-ovs-zqdp5\" (UID: \"163cc47f-d241-4b3e-bf62-07f49047de5d\") " pod="openstack/ovn-controller-ovs-zqdp5" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.575890 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e77b1727-1835-42aa-a4f6-d902ff001d20-var-run-ovn\") pod \"ovn-controller-5v7lw\" (UID: \"e77b1727-1835-42aa-a4f6-d902ff001d20\") " pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.575923 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/163cc47f-d241-4b3e-bf62-07f49047de5d-etc-ovs\") pod \"ovn-controller-ovs-zqdp5\" (UID: \"163cc47f-d241-4b3e-bf62-07f49047de5d\") " pod="openstack/ovn-controller-ovs-zqdp5" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.575978 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh2g6\" (UniqueName: \"kubernetes.io/projected/163cc47f-d241-4b3e-bf62-07f49047de5d-kube-api-access-fh2g6\") pod \"ovn-controller-ovs-zqdp5\" (UID: \"163cc47f-d241-4b3e-bf62-07f49047de5d\") " pod="openstack/ovn-controller-ovs-zqdp5" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.576018 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/163cc47f-d241-4b3e-bf62-07f49047de5d-scripts\") pod \"ovn-controller-ovs-zqdp5\" (UID: \"163cc47f-d241-4b3e-bf62-07f49047de5d\") " pod="openstack/ovn-controller-ovs-zqdp5" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.576050 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2csw\" (UniqueName: \"kubernetes.io/projected/e77b1727-1835-42aa-a4f6-d902ff001d20-kube-api-access-r2csw\") pod \"ovn-controller-5v7lw\" (UID: \"e77b1727-1835-42aa-a4f6-d902ff001d20\") " pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.576214 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e77b1727-1835-42aa-a4f6-d902ff001d20-var-log-ovn\") pod \"ovn-controller-5v7lw\" (UID: \"e77b1727-1835-42aa-a4f6-d902ff001d20\") " pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.576322 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/163cc47f-d241-4b3e-bf62-07f49047de5d-var-run\") pod \"ovn-controller-ovs-zqdp5\" (UID: \"163cc47f-d241-4b3e-bf62-07f49047de5d\") " pod="openstack/ovn-controller-ovs-zqdp5" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.576410 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e77b1727-1835-42aa-a4f6-d902ff001d20-scripts\") pod \"ovn-controller-5v7lw\" (UID: \"e77b1727-1835-42aa-a4f6-d902ff001d20\") " pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.679824 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e77b1727-1835-42aa-a4f6-d902ff001d20-ovn-controller-tls-certs\") pod \"ovn-controller-5v7lw\" (UID: \"e77b1727-1835-42aa-a4f6-d902ff001d20\") " pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.680293 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/163cc47f-d241-4b3e-bf62-07f49047de5d-var-lib\") pod \"ovn-controller-ovs-zqdp5\" (UID: \"163cc47f-d241-4b3e-bf62-07f49047de5d\") " pod="openstack/ovn-controller-ovs-zqdp5" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.680392 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e77b1727-1835-42aa-a4f6-d902ff001d20-var-run-ovn\") pod \"ovn-controller-5v7lw\" (UID: \"e77b1727-1835-42aa-a4f6-d902ff001d20\") " pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.680417 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/163cc47f-d241-4b3e-bf62-07f49047de5d-var-log\") pod \"ovn-controller-ovs-zqdp5\" (UID: \"163cc47f-d241-4b3e-bf62-07f49047de5d\") " pod="openstack/ovn-controller-ovs-zqdp5" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.680450 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/163cc47f-d241-4b3e-bf62-07f49047de5d-etc-ovs\") pod \"ovn-controller-ovs-zqdp5\" (UID: 
\"163cc47f-d241-4b3e-bf62-07f49047de5d\") " pod="openstack/ovn-controller-ovs-zqdp5" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.680517 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh2g6\" (UniqueName: \"kubernetes.io/projected/163cc47f-d241-4b3e-bf62-07f49047de5d-kube-api-access-fh2g6\") pod \"ovn-controller-ovs-zqdp5\" (UID: \"163cc47f-d241-4b3e-bf62-07f49047de5d\") " pod="openstack/ovn-controller-ovs-zqdp5" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.680770 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/163cc47f-d241-4b3e-bf62-07f49047de5d-scripts\") pod \"ovn-controller-ovs-zqdp5\" (UID: \"163cc47f-d241-4b3e-bf62-07f49047de5d\") " pod="openstack/ovn-controller-ovs-zqdp5" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.680827 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2csw\" (UniqueName: \"kubernetes.io/projected/e77b1727-1835-42aa-a4f6-d902ff001d20-kube-api-access-r2csw\") pod \"ovn-controller-5v7lw\" (UID: \"e77b1727-1835-42aa-a4f6-d902ff001d20\") " pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.680856 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e77b1727-1835-42aa-a4f6-d902ff001d20-var-log-ovn\") pod \"ovn-controller-5v7lw\" (UID: \"e77b1727-1835-42aa-a4f6-d902ff001d20\") " pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.680891 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/163cc47f-d241-4b3e-bf62-07f49047de5d-var-run\") pod \"ovn-controller-ovs-zqdp5\" (UID: \"163cc47f-d241-4b3e-bf62-07f49047de5d\") " pod="openstack/ovn-controller-ovs-zqdp5" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.680948 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e77b1727-1835-42aa-a4f6-d902ff001d20-scripts\") pod \"ovn-controller-5v7lw\" (UID: \"e77b1727-1835-42aa-a4f6-d902ff001d20\") " pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.681006 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e77b1727-1835-42aa-a4f6-d902ff001d20-combined-ca-bundle\") pod \"ovn-controller-5v7lw\" (UID: \"e77b1727-1835-42aa-a4f6-d902ff001d20\") " pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.681047 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e77b1727-1835-42aa-a4f6-d902ff001d20-var-run\") pod \"ovn-controller-5v7lw\" (UID: \"e77b1727-1835-42aa-a4f6-d902ff001d20\") " pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.681650 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/163cc47f-d241-4b3e-bf62-07f49047de5d-etc-ovs\") pod \"ovn-controller-ovs-zqdp5\" (UID: \"163cc47f-d241-4b3e-bf62-07f49047de5d\") " pod="openstack/ovn-controller-ovs-zqdp5" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.681798 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e77b1727-1835-42aa-a4f6-d902ff001d20-var-run\") pod \"ovn-controller-5v7lw\" (UID: \"e77b1727-1835-42aa-a4f6-d902ff001d20\") " pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.681876 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/163cc47f-d241-4b3e-bf62-07f49047de5d-var-run\") pod \"ovn-controller-ovs-zqdp5\" (UID: \"163cc47f-d241-4b3e-bf62-07f49047de5d\") " pod="openstack/ovn-controller-ovs-zqdp5" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.681910 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/163cc47f-d241-4b3e-bf62-07f49047de5d-var-log\") pod \"ovn-controller-ovs-zqdp5\" (UID: \"163cc47f-d241-4b3e-bf62-07f49047de5d\") " pod="openstack/ovn-controller-ovs-zqdp5" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.681956 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e77b1727-1835-42aa-a4f6-d902ff001d20-var-run-ovn\") pod \"ovn-controller-5v7lw\" (UID: \"e77b1727-1835-42aa-a4f6-d902ff001d20\") " pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.682253 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/163cc47f-d241-4b3e-bf62-07f49047de5d-var-lib\") pod \"ovn-controller-ovs-zqdp5\" (UID: \"163cc47f-d241-4b3e-bf62-07f49047de5d\") " pod="openstack/ovn-controller-ovs-zqdp5" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.682885 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e77b1727-1835-42aa-a4f6-d902ff001d20-var-log-ovn\") pod \"ovn-controller-5v7lw\" (UID: \"e77b1727-1835-42aa-a4f6-d902ff001d20\") " pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.686287 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/163cc47f-d241-4b3e-bf62-07f49047de5d-scripts\") pod \"ovn-controller-ovs-zqdp5\" (UID: \"163cc47f-d241-4b3e-bf62-07f49047de5d\") " pod="openstack/ovn-controller-ovs-zqdp5" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.687702 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e77b1727-1835-42aa-a4f6-d902ff001d20-combined-ca-bundle\") pod \"ovn-controller-5v7lw\" (UID: \"e77b1727-1835-42aa-a4f6-d902ff001d20\") " pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.688429 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/e77b1727-1835-42aa-a4f6-d902ff001d20-ovn-controller-tls-certs\") pod \"ovn-controller-5v7lw\" (UID: \"e77b1727-1835-42aa-a4f6-d902ff001d20\") " pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.692276 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e77b1727-1835-42aa-a4f6-d902ff001d20-scripts\") pod \"ovn-controller-5v7lw\" (UID: \"e77b1727-1835-42aa-a4f6-d902ff001d20\") " pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.721382 4702 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh2g6\" (UniqueName: \"kubernetes.io/projected/163cc47f-d241-4b3e-bf62-07f49047de5d-kube-api-access-fh2g6\") pod \"ovn-controller-ovs-zqdp5\" (UID: \"163cc47f-d241-4b3e-bf62-07f49047de5d\") " pod="openstack/ovn-controller-ovs-zqdp5" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.746055 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2csw\" (UniqueName: \"kubernetes.io/projected/e77b1727-1835-42aa-a4f6-d902ff001d20-kube-api-access-r2csw\") pod \"ovn-controller-5v7lw\" (UID: \"e77b1727-1835-42aa-a4f6-d902ff001d20\") " pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.763302 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5v7lw" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.805586 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-zqdp5" Dec 03 11:26:50 crc kubenswrapper[4702]: I1203 11:26:50.982818 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51a412e7-4275-4445-a558-0db52234278e" path="/var/lib/kubelet/pods/51a412e7-4275-4445-a558-0db52234278e/volumes" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.053927 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.195041 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74bd8cbfc6-krjzs" event={"ID":"0248a046-0fe5-47b8-a644-50dfc9a20a75","Type":"ContainerStarted","Data":"8caba6d76558cecf33ee31cf372c564e4c3e540b0ff448676f793f51cad046c9"} Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.230865 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74bd8cbfc6-krjzs" podStartSLOduration=6.230816656 podStartE2EDuration="6.230816656s" podCreationTimestamp="2025-12-03 11:26:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:26:51.216578943 +0000 UTC m=+1395.052507427" watchObservedRunningTime="2025-12-03 11:26:51.230816656 +0000 UTC m=+1395.066745140" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.325182 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.332557 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.335742 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.335789 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-d9qt2" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.336379 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.336717 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.374944 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.404820 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.404877 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd806fd6-9deb-4a6d-8e73-e486e1b2cba7-config\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.404921 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgnw6\" (UniqueName: \"kubernetes.io/projected/bd806fd6-9deb-4a6d-8e73-e486e1b2cba7-kube-api-access-cgnw6\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.404956 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd806fd6-9deb-4a6d-8e73-e486e1b2cba7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.404992 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd806fd6-9deb-4a6d-8e73-e486e1b2cba7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.405057 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd806fd6-9deb-4a6d-8e73-e486e1b2cba7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.405178 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd806fd6-9deb-4a6d-8e73-e486e1b2cba7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.405227 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd806fd6-9deb-4a6d-8e73-e486e1b2cba7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.506903 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd806fd6-9deb-4a6d-8e73-e486e1b2cba7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.506986 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bd806fd6-9deb-4a6d-8e73-e486e1b2cba7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.507046 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.507071 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd806fd6-9deb-4a6d-8e73-e486e1b2cba7-config\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.507087 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgnw6\" (UniqueName: \"kubernetes.io/projected/bd806fd6-9deb-4a6d-8e73-e486e1b2cba7-kube-api-access-cgnw6\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.507111 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd806fd6-9deb-4a6d-8e73-e486e1b2cba7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.507136 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd806fd6-9deb-4a6d-8e73-e486e1b2cba7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.507187 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd806fd6-9deb-4a6d-8e73-e486e1b2cba7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.507622 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/bd806fd6-9deb-4a6d-8e73-e486e1b2cba7-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.507972 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.508181 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd806fd6-9deb-4a6d-8e73-e486e1b2cba7-config\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.508662 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd806fd6-9deb-4a6d-8e73-e486e1b2cba7-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.528911 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd806fd6-9deb-4a6d-8e73-e486e1b2cba7-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.532097 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgnw6\" (UniqueName: \"kubernetes.io/projected/bd806fd6-9deb-4a6d-8e73-e486e1b2cba7-kube-api-access-cgnw6\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.533945 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd806fd6-9deb-4a6d-8e73-e486e1b2cba7-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.541651 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd806fd6-9deb-4a6d-8e73-e486e1b2cba7-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.546451 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7\") " pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.625906 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5v7lw"] Dec 03 11:26:51 crc kubenswrapper[4702]: I1203 11:26:51.702742 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.001836 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-zqdp5"] Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.100941 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-rwfzx"] Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.102710 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-rwfzx" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.105973 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.121076 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-rwfzx"] Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.125909 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/59e4fe73-960d-4021-bbda-ce3ba11e72be-ovs-rundir\") pod \"ovn-controller-metrics-rwfzx\" (UID: \"59e4fe73-960d-4021-bbda-ce3ba11e72be\") " pod="openstack/ovn-controller-metrics-rwfzx" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.125957 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/59e4fe73-960d-4021-bbda-ce3ba11e72be-ovn-rundir\") pod \"ovn-controller-metrics-rwfzx\" (UID: \"59e4fe73-960d-4021-bbda-ce3ba11e72be\") " pod="openstack/ovn-controller-metrics-rwfzx" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.125978 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e4fe73-960d-4021-bbda-ce3ba11e72be-combined-ca-bundle\") pod \"ovn-controller-metrics-rwfzx\" (UID: \"59e4fe73-960d-4021-bbda-ce3ba11e72be\") " pod="openstack/ovn-controller-metrics-rwfzx" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.126058 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/59e4fe73-960d-4021-bbda-ce3ba11e72be-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rwfzx\" (UID: \"59e4fe73-960d-4021-bbda-ce3ba11e72be\") " pod="openstack/ovn-controller-metrics-rwfzx" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.126081 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq7v6\" (UniqueName: \"kubernetes.io/projected/59e4fe73-960d-4021-bbda-ce3ba11e72be-kube-api-access-bq7v6\") pod \"ovn-controller-metrics-rwfzx\" (UID: \"59e4fe73-960d-4021-bbda-ce3ba11e72be\") " pod="openstack/ovn-controller-metrics-rwfzx" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.126102 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59e4fe73-960d-4021-bbda-ce3ba11e72be-config\") pod \"ovn-controller-metrics-rwfzx\" (UID: \"59e4fe73-960d-4021-bbda-ce3ba11e72be\") " pod="openstack/ovn-controller-metrics-rwfzx" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.230213 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/59e4fe73-960d-4021-bbda-ce3ba11e72be-ovs-rundir\") pod \"ovn-controller-metrics-rwfzx\" (UID: \"59e4fe73-960d-4021-bbda-ce3ba11e72be\") " pod="openstack/ovn-controller-metrics-rwfzx" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.230283 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/59e4fe73-960d-4021-bbda-ce3ba11e72be-ovn-rundir\") pod \"ovn-controller-metrics-rwfzx\" (UID: \"59e4fe73-960d-4021-bbda-ce3ba11e72be\") " pod="openstack/ovn-controller-metrics-rwfzx" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.230319 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e4fe73-960d-4021-bbda-ce3ba11e72be-combined-ca-bundle\") pod \"ovn-controller-metrics-rwfzx\" (UID: \"59e4fe73-960d-4021-bbda-ce3ba11e72be\") " pod="openstack/ovn-controller-metrics-rwfzx" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.230456 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/59e4fe73-960d-4021-bbda-ce3ba11e72be-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rwfzx\" (UID: \"59e4fe73-960d-4021-bbda-ce3ba11e72be\") " pod="openstack/ovn-controller-metrics-rwfzx" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.230492 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq7v6\" (UniqueName: \"kubernetes.io/projected/59e4fe73-960d-4021-bbda-ce3ba11e72be-kube-api-access-bq7v6\") pod \"ovn-controller-metrics-rwfzx\" (UID: \"59e4fe73-960d-4021-bbda-ce3ba11e72be\") " pod="openstack/ovn-controller-metrics-rwfzx" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.230532 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59e4fe73-960d-4021-bbda-ce3ba11e72be-config\") pod \"ovn-controller-metrics-rwfzx\" (UID: \"59e4fe73-960d-4021-bbda-ce3ba11e72be\") " pod="openstack/ovn-controller-metrics-rwfzx" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.233318 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59e4fe73-960d-4021-bbda-ce3ba11e72be-config\") pod \"ovn-controller-metrics-rwfzx\" (UID: \"59e4fe73-960d-4021-bbda-ce3ba11e72be\") " pod="openstack/ovn-controller-metrics-rwfzx" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.233639 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/59e4fe73-960d-4021-bbda-ce3ba11e72be-ovs-rundir\") pod \"ovn-controller-metrics-rwfzx\" (UID: \"59e4fe73-960d-4021-bbda-ce3ba11e72be\") " pod="openstack/ovn-controller-metrics-rwfzx" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.233699 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/59e4fe73-960d-4021-bbda-ce3ba11e72be-ovn-rundir\") pod \"ovn-controller-metrics-rwfzx\" (UID: \"59e4fe73-960d-4021-bbda-ce3ba11e72be\") " pod="openstack/ovn-controller-metrics-rwfzx" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.241874 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59e4fe73-960d-4021-bbda-ce3ba11e72be-combined-ca-bundle\") pod 
\"ovn-controller-metrics-rwfzx\" (UID: \"59e4fe73-960d-4021-bbda-ce3ba11e72be\") " pod="openstack/ovn-controller-metrics-rwfzx" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.243628 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/59e4fe73-960d-4021-bbda-ce3ba11e72be-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rwfzx\" (UID: \"59e4fe73-960d-4021-bbda-ce3ba11e72be\") " pod="openstack/ovn-controller-metrics-rwfzx" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.255020 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"406550ad-e61e-4af5-a42e-4e1437958f90","Type":"ContainerStarted","Data":"c57a956f454084d49fe16fbaf85b4f2878988b3dd46059b239c5c3fbb7a241c3"} Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.265194 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq7v6\" (UniqueName: \"kubernetes.io/projected/59e4fe73-960d-4021-bbda-ce3ba11e72be-kube-api-access-bq7v6\") pod \"ovn-controller-metrics-rwfzx\" (UID: \"59e4fe73-960d-4021-bbda-ce3ba11e72be\") " pod="openstack/ovn-controller-metrics-rwfzx" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.302509 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t2plx"] Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.365132 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-4jh4g"] Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.368513 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.373741 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.388683 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-4jh4g"] Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.436047 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83e06f10-c712-4280-8d28-e8b44ac0810c-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-4jh4g\" (UID: \"83e06f10-c712-4280-8d28-e8b44ac0810c\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.436139 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83e06f10-c712-4280-8d28-e8b44ac0810c-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-4jh4g\" (UID: \"83e06f10-c712-4280-8d28-e8b44ac0810c\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.436183 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m86xq\" (UniqueName: \"kubernetes.io/projected/83e06f10-c712-4280-8d28-e8b44ac0810c-kube-api-access-m86xq\") pod \"dnsmasq-dns-7fd796d7df-4jh4g\" (UID: \"83e06f10-c712-4280-8d28-e8b44ac0810c\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.436269 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e06f10-c712-4280-8d28-e8b44ac0810c-config\") pod 
\"dnsmasq-dns-7fd796d7df-4jh4g\" (UID: \"83e06f10-c712-4280-8d28-e8b44ac0810c\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.439025 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-rwfzx" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.538322 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e06f10-c712-4280-8d28-e8b44ac0810c-config\") pod \"dnsmasq-dns-7fd796d7df-4jh4g\" (UID: \"83e06f10-c712-4280-8d28-e8b44ac0810c\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.539902 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83e06f10-c712-4280-8d28-e8b44ac0810c-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-4jh4g\" (UID: \"83e06f10-c712-4280-8d28-e8b44ac0810c\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.539986 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83e06f10-c712-4280-8d28-e8b44ac0810c-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-4jh4g\" (UID: \"83e06f10-c712-4280-8d28-e8b44ac0810c\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.540010 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m86xq\" (UniqueName: \"kubernetes.io/projected/83e06f10-c712-4280-8d28-e8b44ac0810c-kube-api-access-m86xq\") pod \"dnsmasq-dns-7fd796d7df-4jh4g\" (UID: \"83e06f10-c712-4280-8d28-e8b44ac0810c\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.542534 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e06f10-c712-4280-8d28-e8b44ac0810c-config\") pod \"dnsmasq-dns-7fd796d7df-4jh4g\" (UID: \"83e06f10-c712-4280-8d28-e8b44ac0810c\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.543160 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83e06f10-c712-4280-8d28-e8b44ac0810c-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-4jh4g\" (UID: \"83e06f10-c712-4280-8d28-e8b44ac0810c\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.545549 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83e06f10-c712-4280-8d28-e8b44ac0810c-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-4jh4g\" (UID: \"83e06f10-c712-4280-8d28-e8b44ac0810c\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.588503 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m86xq\" (UniqueName: \"kubernetes.io/projected/83e06f10-c712-4280-8d28-e8b44ac0810c-kube-api-access-m86xq\") pod \"dnsmasq-dns-7fd796d7df-4jh4g\" (UID: \"83e06f10-c712-4280-8d28-e8b44ac0810c\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" Dec 03 11:26:52 crc kubenswrapper[4702]: I1203 11:26:52.699979 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" Dec 03 11:26:53 crc kubenswrapper[4702]: I1203 11:26:53.311046 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5v7lw" event={"ID":"e77b1727-1835-42aa-a4f6-d902ff001d20","Type":"ContainerStarted","Data":"ee7d177582a6b061babd95b0851748af0544a4bad909ddeac6f244f7fba9abb2"} Dec 03 11:26:53 crc kubenswrapper[4702]: I1203 11:26:53.335992 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zqdp5" event={"ID":"163cc47f-d241-4b3e-bf62-07f49047de5d","Type":"ContainerStarted","Data":"a4c8292cd9a77f4e4c18ce6dbb25ab2ef3bd2de677329d6456eaa7cd8c9d6628"} Dec 03 11:26:56 crc kubenswrapper[4702]: I1203 11:26:56.269937 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:56 crc kubenswrapper[4702]: I1203 11:26:56.270303 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:56 crc kubenswrapper[4702]: I1203 11:26:56.274967 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:56 crc kubenswrapper[4702]: I1203 11:26:56.590896 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-74bd8cbfc6-krjzs" Dec 03 11:26:56 crc kubenswrapper[4702]: I1203 11:26:56.668792 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-769cf4984d-hvjnw"] Dec 03 11:26:59 crc kubenswrapper[4702]: I1203 11:26:59.177741 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 11:27:02 crc kubenswrapper[4702]: W1203 11:27:02.681417 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd806fd6_9deb_4a6d_8e73_e486e1b2cba7.slice/crio-3150f68899558ef76b54cee721738962c985125e410460091d647b19c16d8bed WatchSource:0}: Error finding container 3150f68899558ef76b54cee721738962c985125e410460091d647b19c16d8bed: Status 404 returned error can't find the container with id 3150f68899558ef76b54cee721738962c985125e410460091d647b19c16d8bed Dec 03 11:27:03 crc kubenswrapper[4702]: I1203 11:27:03.210530 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-rwfzx"] Dec 03 11:27:03 crc kubenswrapper[4702]: I1203 11:27:03.669009 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7","Type":"ContainerStarted","Data":"3150f68899558ef76b54cee721738962c985125e410460091d647b19c16d8bed"} Dec 03 11:27:05 crc kubenswrapper[4702]: E1203 11:27:05.317689 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/dashboards-console-plugin-rhel9@sha256:a69da8bbca8a28dd2925f864d51cc31cf761b10532c553095ba40b242ef701cb" Dec 03 11:27:05 crc kubenswrapper[4702]: E1203 11:27:05.318384 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:observability-ui-dashboards,Image:registry.redhat.io/cluster-observability-operator/dashboards-console-plugin-rhel9@sha256:a69da8bbca8a28dd2925f864d51cc31cf761b10532c553095ba40b242ef701cb,Command:[],Args:[-port=9443 -cert=/var/serving-cert/tls.crt 
-key=/var/serving-cert/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:web,HostPort:0,ContainerPort:9443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serving-cert,ReadOnly:true,MountPath:/var/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qprlh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod observability-ui-dashboards-7d5fb4cbfb-7x6j5_openshift-operators(4201f396-8ff3-4b7b-82d2-f26cc129b3f9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 11:27:05 crc kubenswrapper[4702]: E1203 11:27:05.319702 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"observability-ui-dashboards\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-7x6j5" podUID="4201f396-8ff3-4b7b-82d2-f26cc129b3f9" Dec 03 11:27:05 crc kubenswrapper[4702]: I1203 11:27:05.690736 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rwfzx" event={"ID":"59e4fe73-960d-4021-bbda-ce3ba11e72be","Type":"ContainerStarted","Data":"403b8b674d210f65912029295f103f9647dfb8e4e1e2364ac18dfd6b3f4f59d8"} Dec 03 11:27:05 crc kubenswrapper[4702]: E1203 11:27:05.692537 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"observability-ui-dashboards\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/dashboards-console-plugin-rhel9@sha256:a69da8bbca8a28dd2925f864d51cc31cf761b10532c553095ba40b242ef701cb\\\"\"" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-7x6j5" podUID="4201f396-8ff3-4b7b-82d2-f26cc129b3f9" Dec 03 11:27:21 crc kubenswrapper[4702]: E1203 11:27:21.141937 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 03 11:27:21 crc kubenswrapper[4702]: E1203 11:27:21.142645 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-t6cls,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(92266ac3-f0a6-4e68-9e88-9aa2900e1fe3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:27:21 crc kubenswrapper[4702]: E1203 11:27:21.143842 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="92266ac3-f0a6-4e68-9e88-9aa2900e1fe3" Dec 03 11:27:21 crc kubenswrapper[4702]: I1203 11:27:21.782672 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-769cf4984d-hvjnw" podUID="96a2e67f-2c70-4ac9-992f-b19afb498a1f" containerName="console" containerID="cri-o://8f133bb4e9ce8c7b68661c304c99af01d55c46b4502326c9667f89fc48aa1724" gracePeriod=15 Dec 03 11:27:21 crc kubenswrapper[4702]: E1203 11:27:21.882151 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="92266ac3-f0a6-4e68-9e88-9aa2900e1fe3" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.122386 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659: Get 
\"https://quay.io/v2/podified-antelope-centos9/openstack-ovn-controller/blobs/sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659\": context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.123200 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovn-controller,Image:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,Command:[ovn-controller --pidfile unix:/run/openvswitch/db.sock --certificate=/etc/pki/tls/certs/ovndb.crt --private-key=/etc/pki/tls/private/ovndb.key --ca-cert=/etc/pki/tls/certs/ovndbca.crt],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8fh5d6h5f6hc9hc6h585h5bfhd4h694h95h5d6hdh589h67hf9h56bh695hcfh576hbh666h55fh58h5d4h58bh85h5b8h64dhfbh698h596h9bq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run-ovn,ReadOnly:false,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log-ovn,ReadOnly:false,MountPath:/var/log/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-controller-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r2csw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_liveness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/ovn_controller_readiness.sh],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/share/ovn/scripts/ovn-ctl stop_controller],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN 
SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-5v7lw_openstack(e77b1727-1835-42aa-a4f6-d902ff001d20): ErrImagePull: rpc error: code = Canceled desc = reading blob sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659: Get \"https://quay.io/v2/podified-antelope-centos9/openstack-ovn-controller/blobs/sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659\": context canceled" logger="UnhandledError" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.124486 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659: Get \\\"https://quay.io/v2/podified-antelope-centos9/openstack-ovn-controller/blobs/sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659\\\": context canceled\"" pod="openstack/ovn-controller-5v7lw" podUID="e77b1727-1835-42aa-a4f6-d902ff001d20" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.154399 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:1e02d32990adc4dad7c8927f91cca33a1baba746105504093311eb3b0b691fa0: Get \"https://quay.io/v2/openstack-k8s-operators/openstack-network-exporter/blobs/sha256:1e02d32990adc4dad7c8927f91cca33a1baba746105504093311eb3b0b691fa0\": context canceled" image="quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.154664 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:openstack-network-exporter,Image:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,Command:[/app/openstack-network-exporter],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPENSTACK_NETWORK_EXPORTER_YAML,Value:/etc/config/openstack-network-exporter.yaml,ValueFrom:nil,},EnvVar{Name:CONFIG_HASH,Value:nf9h54h644h545h5c6h86hbh65fh54bhfbh5fch577h65ch664h578h98h5b6h569h58ch554h5d9h565hch5d8h4hb5h557h74h5c9hb7h556h5f6q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovs-rundir,ReadOnly:true,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-rundir,ReadOnly:true,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovnmetrics.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovnmetrics.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bq7v6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-metrics-rwfzx_openstack(59e4fe73-960d-4021-bbda-ce3ba11e72be): ErrImagePull: rpc error: code = Canceled desc = reading blob sha256:1e02d32990adc4dad7c8927f91cca33a1baba746105504093311eb3b0b691fa0: Get \"https://quay.io/v2/openstack-k8s-operators/openstack-network-exporter/blobs/sha256:1e02d32990adc4dad7c8927f91cca33a1baba746105504093311eb3b0b691fa0\": context canceled" logger="UnhandledError" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.156047 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstack-network-exporter\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:1e02d32990adc4dad7c8927f91cca33a1baba746105504093311eb3b0b691fa0: Get \\\"https://quay.io/v2/openstack-k8s-operators/openstack-network-exporter/blobs/sha256:1e02d32990adc4dad7c8927f91cca33a1baba746105504093311eb3b0b691fa0\\\": context canceled\"" 
pod="openstack/ovn-controller-metrics-rwfzx" podUID="59e4fe73-960d-4021-bbda-ce3ba11e72be" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.157435 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.157641 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m265r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.158773 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context 
canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.175373 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659: Get \"https://quay.io/v2/podified-antelope-centos9/openstack-ovn-nb-db-server/blobs/sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659\": context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.175611 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ovsdbserver-nb,Image:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n587h5ffh74h6bh594h5c8h685h58bh589h578h5d9h585h55chc8hd6h5dch57ch598h5ddh67bh66fhcch687h649h5d4h5b6hbch5c7h58h68ch84h4q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-nb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-68pdd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof 
ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-nb-0_openstack(406550ad-e61e-4af5-a42e-4e1437958f90): ErrImagePull: rpc error: code = Canceled desc = reading blob sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659: Get \"https://quay.io/v2/podified-antelope-centos9/openstack-ovn-nb-db-server/blobs/sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659\": context canceled" logger="UnhandledError" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.826977 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = reading blob sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659: Get \"https://quay.io/v2/podified-antelope-centos9/openstack-ovn-sb-db-server/blobs/sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659\": context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.827200 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ovsdbserver-sb,Image:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5c7h65bh5dch674h698h9dh576h54h5cch566h64ch54ch576h5cfh65h669hc5h64bhc8h5f7h5bfhbbh585h545h8dh54fh8h5fbh655hd5h68bhf9q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-sb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cgnw6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
ovsdbserver-sb-0_openstack(bd806fd6-9deb-4a6d-8e73-e486e1b2cba7): ErrImagePull: rpc error: code = Canceled desc = reading blob sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659: Get \"https://quay.io/v2/podified-antelope-centos9/openstack-ovn-sb-db-server/blobs/sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659\": context canceled" logger="UnhandledError" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.833524 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.833780 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- /usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:nb6h586h668h58ch55fh555h64dh69h577hf6h5c7h74hbhd5h5d4h657h64h9dh565h567h5f5h5dbh5c8h58bh5ddh58ch58ch597hc5h57chdfh8bq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lmc75,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(8ea851b4-124d-4472-9fd0-7b584da44ecc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.835094 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="8ea851b4-124d-4472-9fd0-7b584da44ecc" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.869529 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.869747 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4rddh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(c91e1dc8-ef80-407f-ac34-4c9ab29026f7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.871107 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="c91e1dc8-ef80-407f-ac34-4c9ab29026f7" Dec 03 11:27:22 crc kubenswrapper[4702]: I1203 11:27:22.897535 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-769cf4984d-hvjnw_96a2e67f-2c70-4ac9-992f-b19afb498a1f/console/0.log" Dec 03 11:27:22 crc kubenswrapper[4702]: I1203 11:27:22.897603 4702 generic.go:334] "Generic (PLEG): container finished" podID="96a2e67f-2c70-4ac9-992f-b19afb498a1f" containerID="8f133bb4e9ce8c7b68661c304c99af01d55c46b4502326c9667f89fc48aa1724" exitCode=2 Dec 03 11:27:22 crc kubenswrapper[4702]: I1203 11:27:22.898815 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-769cf4984d-hvjnw" event={"ID":"96a2e67f-2c70-4ac9-992f-b19afb498a1f","Type":"ContainerDied","Data":"8f133bb4e9ce8c7b68661c304c99af01d55c46b4502326c9667f89fc48aa1724"} Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.907855 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="c91e1dc8-ef80-407f-ac34-4c9ab29026f7" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.908006 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="8ea851b4-124d-4472-9fd0-7b584da44ecc" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.908045 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstack-network-exporter\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified\\\"\"" pod="openstack/ovn-controller-metrics-rwfzx" podUID="59e4fe73-960d-4021-bbda-ce3ba11e72be" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.907968 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.908143 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovn-controller\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified\\\"\"" pod="openstack/ovn-controller-5v7lw" podUID="e77b1727-1835-42aa-a4f6-d902ff001d20" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.925793 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.925977 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2jfm9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(85f53e1b-50d1-4249-ba44-5b2e5982ae36): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:27:22 crc kubenswrapper[4702]: E1203 11:27:22.927219 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="85f53e1b-50d1-4249-ba44-5b2e5982ae36" Dec 03 11:27:23 crc kubenswrapper[4702]: I1203 11:27:23.107671 4702 patch_prober.go:28] interesting pod/console-769cf4984d-hvjnw container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.89:8443/health\": dial tcp 10.217.0.89:8443: connect: connection refused" start-of-body= Dec 03 11:27:23 crc kubenswrapper[4702]: I1203 11:27:23.108396 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-769cf4984d-hvjnw" podUID="96a2e67f-2c70-4ac9-992f-b19afb498a1f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.89:8443/health\": dial tcp 10.217.0.89:8443: connect: connection refused" Dec 03 11:27:23 crc kubenswrapper[4702]: E1203 11:27:23.689665 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 11:27:23 crc kubenswrapper[4702]: E1203 11:27:23.689941 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq 
--interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9djd8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-f6zjj_openstack(f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:27:23 crc kubenswrapper[4702]: E1203 11:27:23.691144 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-f6zjj" podUID="f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84" Dec 03 11:27:23 crc kubenswrapper[4702]: E1203 11:27:23.778410 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 11:27:23 crc kubenswrapper[4702]: E1203 11:27:23.778602 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7s2fp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-9z8nk_openstack(68d6c68f-856a-48b5-8ca3-e1165b430d65): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:27:23 crc kubenswrapper[4702]: E1203 11:27:23.779813 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-9z8nk" podUID="68d6c68f-856a-48b5-8ca3-e1165b430d65" Dec 03 11:27:23 crc kubenswrapper[4702]: E1203 11:27:23.793407 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 11:27:23 crc kubenswrapper[4702]: E1203 11:27:23.793599 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9fzfg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-6p7s4_openstack(41674587-555a-4676-bd26-6732bdbb594b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:27:23 crc kubenswrapper[4702]: E1203 11:27:23.795001 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-6p7s4" podUID="41674587-555a-4676-bd26-6732bdbb594b" Dec 03 11:27:23 crc kubenswrapper[4702]: E1203 11:27:23.816957 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 11:27:23 crc kubenswrapper[4702]: E1203 11:27:23.817183 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hqxm5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-t2plx_openstack(da1dab47-7e05-48f7-84dc-7747cfa50aa8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:27:23 crc kubenswrapper[4702]: E1203 11:27:23.818390 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-t2plx" podUID="da1dab47-7e05-48f7-84dc-7747cfa50aa8" Dec 03 11:27:23 crc kubenswrapper[4702]: E1203 11:27:23.911228 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="85f53e1b-50d1-4249-ba44-5b2e5982ae36" Dec 03 11:27:23 crc kubenswrapper[4702]: E1203 11:27:23.911228 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-9z8nk" podUID="68d6c68f-856a-48b5-8ca3-e1165b430d65" Dec 03 11:27:31 crc kubenswrapper[4702]: E1203 11:27:31.502842 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified" Dec 03 11:27:31 crc kubenswrapper[4702]: E1203 11:27:31.504450 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:ovsdb-server-init,Image:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,Command:[/usr/local/bin/container-scripts/init-ovsdb-server.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8fh5d6h5f6hc9hc6h585h5bfhd4h694h95h5d6hdh589h67hf9h56bh695hcfh576hbh666h55fh58h5d4h58bh85h5b8h64dhfbh698h596h9bq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-ovs,ReadOnly:false,MountPath:/etc/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log,ReadOnly:false,MountPath:/var/log/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-lib,ReadOnly:false,MountPath:/var/lib/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fh2g6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-ovs-zqdp5_openstack(163cc47f-d241-4b3e-bf62-07f49047de5d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:27:31 crc kubenswrapper[4702]: E1203 11:27:31.505793 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-ovs-zqdp5" podUID="163cc47f-d241-4b3e-bf62-07f49047de5d" Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.685145 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6p7s4" Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.693166 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-t2plx" Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.712152 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41674587-555a-4676-bd26-6732bdbb594b-config\") pod \"41674587-555a-4676-bd26-6732bdbb594b\" (UID: \"41674587-555a-4676-bd26-6732bdbb594b\") " Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.712452 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fzfg\" (UniqueName: \"kubernetes.io/projected/41674587-555a-4676-bd26-6732bdbb594b-kube-api-access-9fzfg\") pod \"41674587-555a-4676-bd26-6732bdbb594b\" (UID: \"41674587-555a-4676-bd26-6732bdbb594b\") " Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.713939 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41674587-555a-4676-bd26-6732bdbb594b-config" (OuterVolumeSpecName: "config") pod "41674587-555a-4676-bd26-6732bdbb594b" (UID: "41674587-555a-4676-bd26-6732bdbb594b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.723409 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41674587-555a-4676-bd26-6732bdbb594b-kube-api-access-9fzfg" (OuterVolumeSpecName: "kube-api-access-9fzfg") pod "41674587-555a-4676-bd26-6732bdbb594b" (UID: "41674587-555a-4676-bd26-6732bdbb594b"). InnerVolumeSpecName "kube-api-access-9fzfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.723861 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-f6zjj" Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.815168 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da1dab47-7e05-48f7-84dc-7747cfa50aa8-dns-svc\") pod \"da1dab47-7e05-48f7-84dc-7747cfa50aa8\" (UID: \"da1dab47-7e05-48f7-84dc-7747cfa50aa8\") " Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.815257 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da1dab47-7e05-48f7-84dc-7747cfa50aa8-config\") pod \"da1dab47-7e05-48f7-84dc-7747cfa50aa8\" (UID: \"da1dab47-7e05-48f7-84dc-7747cfa50aa8\") " Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.815314 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqxm5\" (UniqueName: \"kubernetes.io/projected/da1dab47-7e05-48f7-84dc-7747cfa50aa8-kube-api-access-hqxm5\") pod \"da1dab47-7e05-48f7-84dc-7747cfa50aa8\" (UID: \"da1dab47-7e05-48f7-84dc-7747cfa50aa8\") " Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.815351 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9djd8\" (UniqueName: \"kubernetes.io/projected/f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84-kube-api-access-9djd8\") pod \"f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84\" (UID: \"f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84\") " Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.815398 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84-dns-svc\") pod \"f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84\" 
(UID: \"f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84\") " Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.815499 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84-config\") pod \"f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84\" (UID: \"f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84\") " Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.815688 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da1dab47-7e05-48f7-84dc-7747cfa50aa8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "da1dab47-7e05-48f7-84dc-7747cfa50aa8" (UID: "da1dab47-7e05-48f7-84dc-7747cfa50aa8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.816172 4702 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da1dab47-7e05-48f7-84dc-7747cfa50aa8-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.816201 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fzfg\" (UniqueName: \"kubernetes.io/projected/41674587-555a-4676-bd26-6732bdbb594b-kube-api-access-9fzfg\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.816218 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/41674587-555a-4676-bd26-6732bdbb594b-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.816414 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da1dab47-7e05-48f7-84dc-7747cfa50aa8-config" (OuterVolumeSpecName: "config") pod "da1dab47-7e05-48f7-84dc-7747cfa50aa8" (UID: "da1dab47-7e05-48f7-84dc-7747cfa50aa8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.817177 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84" (UID: "f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.817259 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84-config" (OuterVolumeSpecName: "config") pod "f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84" (UID: "f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.820115 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84-kube-api-access-9djd8" (OuterVolumeSpecName: "kube-api-access-9djd8") pod "f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84" (UID: "f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84"). InnerVolumeSpecName "kube-api-access-9djd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.825076 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da1dab47-7e05-48f7-84dc-7747cfa50aa8-kube-api-access-hqxm5" (OuterVolumeSpecName: "kube-api-access-hqxm5") pod "da1dab47-7e05-48f7-84dc-7747cfa50aa8" (UID: "da1dab47-7e05-48f7-84dc-7747cfa50aa8"). InnerVolumeSpecName "kube-api-access-hqxm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.918551 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da1dab47-7e05-48f7-84dc-7747cfa50aa8-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.918976 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqxm5\" (UniqueName: \"kubernetes.io/projected/da1dab47-7e05-48f7-84dc-7747cfa50aa8-kube-api-access-hqxm5\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.918994 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9djd8\" (UniqueName: \"kubernetes.io/projected/f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84-kube-api-access-9djd8\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.919006 4702 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:31 crc kubenswrapper[4702]: I1203 11:27:31.919020 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:32 crc kubenswrapper[4702]: I1203 11:27:31.999574 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-6p7s4" event={"ID":"41674587-555a-4676-bd26-6732bdbb594b","Type":"ContainerDied","Data":"ae691aa75a50ca76e3a2f18463e5581eac952b0f12320a755a338b7d751b2ef9"} Dec 03 11:27:32 crc kubenswrapper[4702]: I1203 11:27:31.999672 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-6p7s4" Dec 03 11:27:32 crc kubenswrapper[4702]: I1203 11:27:32.013427 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-f6zjj" event={"ID":"f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84","Type":"ContainerDied","Data":"bc69edc23de4e207e0a2102c082b674d105e7b86e73cde533e54c7a589e80221"} Dec 03 11:27:32 crc kubenswrapper[4702]: I1203 11:27:32.016704 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-t2plx" Dec 03 11:27:32 crc kubenswrapper[4702]: I1203 11:27:32.016706 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-t2plx" event={"ID":"da1dab47-7e05-48f7-84dc-7747cfa50aa8","Type":"ContainerDied","Data":"89462ce7db5dd805303fab58c2356a286025883a87b1ca94f4f32e186bdc42d0"} Dec 03 11:27:32 crc kubenswrapper[4702]: I1203 11:27:32.016861 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-f6zjj" Dec 03 11:27:32 crc kubenswrapper[4702]: E1203 11:27:32.018169 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified\\\"\"" pod="openstack/ovn-controller-ovs-zqdp5" podUID="163cc47f-d241-4b3e-bf62-07f49047de5d" Dec 03 11:27:32 crc kubenswrapper[4702]: I1203 11:27:32.109518 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6p7s4"] Dec 03 11:27:32 crc kubenswrapper[4702]: I1203 11:27:32.120962 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-6p7s4"] Dec 03 11:27:32 crc kubenswrapper[4702]: I1203 11:27:32.149692 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t2plx"] Dec 03 11:27:32 crc kubenswrapper[4702]: I1203 11:27:32.159435 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-t2plx"] Dec 03 11:27:32 crc kubenswrapper[4702]: I1203 11:27:32.180790 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-f6zjj"] Dec 03 11:27:32 crc kubenswrapper[4702]: I1203 11:27:32.192967 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-f6zjj"] Dec 03 11:27:32 crc kubenswrapper[4702]: I1203 11:27:32.201694 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-4jh4g"] Dec 03 11:27:32 crc kubenswrapper[4702]: I1203 11:27:32.943465 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41674587-555a-4676-bd26-6732bdbb594b" path="/var/lib/kubelet/pods/41674587-555a-4676-bd26-6732bdbb594b/volumes" Dec 03 11:27:32 crc kubenswrapper[4702]: I1203 11:27:32.944028 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da1dab47-7e05-48f7-84dc-7747cfa50aa8" path="/var/lib/kubelet/pods/da1dab47-7e05-48f7-84dc-7747cfa50aa8/volumes" Dec 03 11:27:32 crc kubenswrapper[4702]: I1203 11:27:32.944532 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84" path="/var/lib/kubelet/pods/f15ed5bd-2f9e-4033-9c6f-3471c0c8ac84/volumes" Dec 03 11:27:33 crc kubenswrapper[4702]: W1203 11:27:33.877738 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83e06f10_c712_4280_8d28_e8b44ac0810c.slice/crio-ce621d67b7d1b231de17cfee583521bcd32f877ac1fe68b6a39512c80cef8be3 WatchSource:0}: Error finding container ce621d67b7d1b231de17cfee583521bcd32f877ac1fe68b6a39512c80cef8be3: Status 404 returned error can't find the container with id ce621d67b7d1b231de17cfee583521bcd32f877ac1fe68b6a39512c80cef8be3 Dec 03 11:27:33 crc kubenswrapper[4702]: I1203 11:27:33.960599 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-769cf4984d-hvjnw_96a2e67f-2c70-4ac9-992f-b19afb498a1f/console/0.log" Dec 03 11:27:33 crc kubenswrapper[4702]: I1203 11:27:33.960699 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.046223 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" event={"ID":"83e06f10-c712-4280-8d28-e8b44ac0810c","Type":"ContainerStarted","Data":"ce621d67b7d1b231de17cfee583521bcd32f877ac1fe68b6a39512c80cef8be3"} Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.048030 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-769cf4984d-hvjnw_96a2e67f-2c70-4ac9-992f-b19afb498a1f/console/0.log" Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.048091 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-769cf4984d-hvjnw" event={"ID":"96a2e67f-2c70-4ac9-992f-b19afb498a1f","Type":"ContainerDied","Data":"161c87e3123360094b0f36b55228092535fdf10a8c53fdf02d94c214d53d3864"} Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.048153 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-769cf4984d-hvjnw" Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.048167 4702 scope.go:117] "RemoveContainer" containerID="8f133bb4e9ce8c7b68661c304c99af01d55c46b4502326c9667f89fc48aa1724" Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.072818 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-service-ca\") pod \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.072881 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96a2e67f-2c70-4ac9-992f-b19afb498a1f-console-oauth-config\") pod \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.072923 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-console-config\") pod \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.072963 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96a2e67f-2c70-4ac9-992f-b19afb498a1f-console-serving-cert\") pod \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.073074 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk49h\" (UniqueName: \"kubernetes.io/projected/96a2e67f-2c70-4ac9-992f-b19afb498a1f-kube-api-access-tk49h\") pod \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.073169 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-oauth-serving-cert\") pod \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.073235 4702 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-trusted-ca-bundle\") pod \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\" (UID: \"96a2e67f-2c70-4ac9-992f-b19afb498a1f\") " Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.074030 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-console-config" (OuterVolumeSpecName: "console-config") pod "96a2e67f-2c70-4ac9-992f-b19afb498a1f" (UID: "96a2e67f-2c70-4ac9-992f-b19afb498a1f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.074051 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "96a2e67f-2c70-4ac9-992f-b19afb498a1f" (UID: "96a2e67f-2c70-4ac9-992f-b19afb498a1f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.074061 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-service-ca" (OuterVolumeSpecName: "service-ca") pod "96a2e67f-2c70-4ac9-992f-b19afb498a1f" (UID: "96a2e67f-2c70-4ac9-992f-b19afb498a1f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.074292 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "96a2e67f-2c70-4ac9-992f-b19afb498a1f" (UID: "96a2e67f-2c70-4ac9-992f-b19afb498a1f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.080034 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96a2e67f-2c70-4ac9-992f-b19afb498a1f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "96a2e67f-2c70-4ac9-992f-b19afb498a1f" (UID: "96a2e67f-2c70-4ac9-992f-b19afb498a1f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.080288 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96a2e67f-2c70-4ac9-992f-b19afb498a1f-kube-api-access-tk49h" (OuterVolumeSpecName: "kube-api-access-tk49h") pod "96a2e67f-2c70-4ac9-992f-b19afb498a1f" (UID: "96a2e67f-2c70-4ac9-992f-b19afb498a1f"). InnerVolumeSpecName "kube-api-access-tk49h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.080788 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96a2e67f-2c70-4ac9-992f-b19afb498a1f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "96a2e67f-2c70-4ac9-992f-b19afb498a1f" (UID: "96a2e67f-2c70-4ac9-992f-b19afb498a1f"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.107314 4702 patch_prober.go:28] interesting pod/console-769cf4984d-hvjnw container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.89:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.107383 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-769cf4984d-hvjnw" podUID="96a2e67f-2c70-4ac9-992f-b19afb498a1f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.89:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.176076 4702 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.176113 4702 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.176123 4702 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/96a2e67f-2c70-4ac9-992f-b19afb498a1f-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.176132 4702 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.176140 4702 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/96a2e67f-2c70-4ac9-992f-b19afb498a1f-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.176151 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk49h\" (UniqueName: \"kubernetes.io/projected/96a2e67f-2c70-4ac9-992f-b19afb498a1f-kube-api-access-tk49h\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.176161 4702 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/96a2e67f-2c70-4ac9-992f-b19afb498a1f-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.388225 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-769cf4984d-hvjnw"] Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.396007 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-769cf4984d-hvjnw"] Dec 03 11:27:34 crc kubenswrapper[4702]: E1203 11:27:34.452907 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 03 11:27:34 crc kubenswrapper[4702]: E1203 11:27:34.453003 4702 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = 
copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Dec 03 11:27:34 crc kubenswrapper[4702]: E1203 11:27:34.453205 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4h9k2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(d7e1497f-e194-429b-add6-ee8e886fed8b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 11:27:34 crc kubenswrapper[4702]: E1203 11:27:34.454434 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="d7e1497f-e194-429b-add6-ee8e886fed8b" Dec 03 11:27:34 crc kubenswrapper[4702]: I1203 11:27:34.943612 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96a2e67f-2c70-4ac9-992f-b19afb498a1f" path="/var/lib/kubelet/pods/96a2e67f-2c70-4ac9-992f-b19afb498a1f/volumes" Dec 03 11:27:35 crc kubenswrapper[4702]: E1203 11:27:35.091699 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="d7e1497f-e194-429b-add6-ee8e886fed8b" Dec 03 11:27:35 crc kubenswrapper[4702]: E1203 11:27:35.484627 
4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659: Get \\\"https://quay.io/v2/podified-antelope-centos9/openstack-ovn-nb-db-server/blobs/sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659\\\": context canceled\"" pod="openstack/ovsdbserver-nb-0" podUID="406550ad-e61e-4af5-a42e-4e1437958f90" Dec 03 11:27:35 crc kubenswrapper[4702]: E1203 11:27:35.493894 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-sb\" with ErrImagePull: \"rpc error: code = Canceled desc = reading blob sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659: Get \\\"https://quay.io/v2/podified-antelope-centos9/openstack-ovn-sb-db-server/blobs/sha256:650ba1d1fb09dfb2a96481f2ebae84fbae4b3474d057dd6d1569ce094dc41659\\\": context canceled\"" pod="openstack/ovsdbserver-sb-0" podUID="bd806fd6-9deb-4a6d-8e73-e486e1b2cba7" Dec 03 11:27:36 crc kubenswrapper[4702]: I1203 11:27:36.092033 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3","Type":"ContainerStarted","Data":"8affc06ec563ea3872435075db71a24285dd187ff576b93bcb48ab670b0e4a15"} Dec 03 11:27:36 crc kubenswrapper[4702]: I1203 11:27:36.097686 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-7x6j5" event={"ID":"4201f396-8ff3-4b7b-82d2-f26cc129b3f9","Type":"ContainerStarted","Data":"393c366dc4a18a3abdc21eec46eebbe4a1c8579ac76a69bfe3db9615d7b27a55"} Dec 03 11:27:36 crc kubenswrapper[4702]: I1203 11:27:36.102053 4702 generic.go:334] "Generic (PLEG): container finished" podID="83e06f10-c712-4280-8d28-e8b44ac0810c" containerID="3bd4b710d4994e2a6d1881641c937942c1d2a9d9b52a68a171ff6690528475a6" exitCode=0 Dec 03 11:27:36 crc kubenswrapper[4702]: I1203 11:27:36.103046 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" event={"ID":"83e06f10-c712-4280-8d28-e8b44ac0810c","Type":"ContainerDied","Data":"3bd4b710d4994e2a6d1881641c937942c1d2a9d9b52a68a171ff6690528475a6"} Dec 03 11:27:36 crc kubenswrapper[4702]: I1203 11:27:36.105077 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"8ea851b4-124d-4472-9fd0-7b584da44ecc","Type":"ContainerStarted","Data":"b9134e491b2dbdb6d2212153bb349cfe82fadeb3e7bc36720dd9616f565ed807"} Dec 03 11:27:36 crc kubenswrapper[4702]: I1203 11:27:36.105323 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 03 11:27:36 crc kubenswrapper[4702]: I1203 11:27:36.110599 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7","Type":"ContainerStarted","Data":"c2cd2d8c8d10a9f835d86eefdf234df0464e58de68e87774ba32220081fabbb0"} Dec 03 11:27:36 crc kubenswrapper[4702]: I1203 11:27:36.112331 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"406550ad-e61e-4af5-a42e-4e1437958f90","Type":"ContainerStarted","Data":"dc316b57b5210b4f27bd7e9a9cbb5fe350f8d6ccc19f3099b1f88cd8111443a5"} Dec 03 11:27:36 crc kubenswrapper[4702]: I1203 11:27:36.204143 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" 
podStartSLOduration=3.417946347 podStartE2EDuration="55.204090642s" podCreationTimestamp="2025-12-03 11:26:41 +0000 UTC" firstStartedPulling="2025-12-03 11:26:43.189805571 +0000 UTC m=+1387.025734025" lastFinishedPulling="2025-12-03 11:27:34.975949856 +0000 UTC m=+1438.811878320" observedRunningTime="2025-12-03 11:27:36.19449948 +0000 UTC m=+1440.030427944" watchObservedRunningTime="2025-12-03 11:27:36.204090642 +0000 UTC m=+1440.040019106" Dec 03 11:27:36 crc kubenswrapper[4702]: I1203 11:27:36.226074 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-7d5fb4cbfb-7x6j5" podStartSLOduration=7.487816208 podStartE2EDuration="51.226047103s" podCreationTimestamp="2025-12-03 11:26:45 +0000 UTC" firstStartedPulling="2025-12-03 11:26:48.482044379 +0000 UTC m=+1392.317972843" lastFinishedPulling="2025-12-03 11:27:32.220275274 +0000 UTC m=+1436.056203738" observedRunningTime="2025-12-03 11:27:36.213820147 +0000 UTC m=+1440.049748611" watchObservedRunningTime="2025-12-03 11:27:36.226047103 +0000 UTC m=+1440.061975567" Dec 03 11:27:37 crc kubenswrapper[4702]: I1203 11:27:37.126569 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c91e1dc8-ef80-407f-ac34-4c9ab29026f7","Type":"ContainerStarted","Data":"f61b588e9a863fe95560ecb766779f12b517ef053b82188e823fef6a9863871f"} Dec 03 11:27:37 crc kubenswrapper[4702]: I1203 11:27:37.129636 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" event={"ID":"83e06f10-c712-4280-8d28-e8b44ac0810c","Type":"ContainerStarted","Data":"11bbef9daf6038bfee82c7bba48d9a26add9d4301b0815055b9da4b1a3a4c319"} Dec 03 11:27:37 crc kubenswrapper[4702]: I1203 11:27:37.130135 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" Dec 03 11:27:37 crc kubenswrapper[4702]: I1203 11:27:37.175489 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" podStartSLOduration=44.082475897 podStartE2EDuration="45.175404824s" podCreationTimestamp="2025-12-03 11:26:52 +0000 UTC" firstStartedPulling="2025-12-03 11:27:33.88303553 +0000 UTC m=+1437.718963994" lastFinishedPulling="2025-12-03 11:27:34.975964457 +0000 UTC m=+1438.811892921" observedRunningTime="2025-12-03 11:27:37.171146613 +0000 UTC m=+1441.007075077" watchObservedRunningTime="2025-12-03 11:27:37.175404824 +0000 UTC m=+1441.011333288" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.152626 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rwfzx" event={"ID":"59e4fe73-960d-4021-bbda-ce3ba11e72be","Type":"ContainerStarted","Data":"22964eacd12adcc7e900e13d3dcb9db7cccc315c2bc1eff479ca8913c22f04c7"} Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.155848 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9","Type":"ContainerStarted","Data":"0cdd14cd1c46fed0871c7e58f04bc827118bac9d8a9e97af12dfaff332296b46"} Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.162701 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"406550ad-e61e-4af5-a42e-4e1437958f90","Type":"ContainerStarted","Data":"035abe11a2f55a0be00517822e136e44890cb0e8b77ca56ebb563df782439eae"} Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.168136 4702 generic.go:334] "Generic 
(PLEG): container finished" podID="68d6c68f-856a-48b5-8ca3-e1165b430d65" containerID="c9eb393bbe9f6464638c208a660ae2aa932e62e4b08a0a2e16dc8f5d80f52869" exitCode=0 Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.168225 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9z8nk" event={"ID":"68d6c68f-856a-48b5-8ca3-e1165b430d65","Type":"ContainerDied","Data":"c9eb393bbe9f6464638c208a660ae2aa932e62e4b08a0a2e16dc8f5d80f52869"} Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.171411 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"bd806fd6-9deb-4a6d-8e73-e486e1b2cba7","Type":"ContainerStarted","Data":"8f3e4f1f88e26b6102664aa095d3cfe3a49aa24e9fc6f7affb416a85668cd944"} Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.185024 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-rwfzx" podStartSLOduration=-9223371989.669786 podStartE2EDuration="47.184990293s" podCreationTimestamp="2025-12-03 11:26:52 +0000 UTC" firstStartedPulling="2025-12-03 11:27:05.347321373 +0000 UTC m=+1409.183249837" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:27:39.172211871 +0000 UTC m=+1443.008140335" watchObservedRunningTime="2025-12-03 11:27:39.184990293 +0000 UTC m=+1443.020918757" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.213368 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=13.800462309 podStartE2EDuration="49.213337176s" podCreationTimestamp="2025-12-03 11:26:50 +0000 UTC" firstStartedPulling="2025-12-03 11:27:02.686788066 +0000 UTC m=+1406.522716530" lastFinishedPulling="2025-12-03 11:27:38.099662933 +0000 UTC m=+1441.935591397" observedRunningTime="2025-12-03 11:27:39.201320245 +0000 UTC m=+1443.037248709" watchObservedRunningTime="2025-12-03 11:27:39.213337176 +0000 UTC m=+1443.049265640" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.301936 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=6.5266716989999995 podStartE2EDuration="53.301894554s" podCreationTimestamp="2025-12-03 11:26:46 +0000 UTC" firstStartedPulling="2025-12-03 11:26:51.325824977 +0000 UTC m=+1395.161753441" lastFinishedPulling="2025-12-03 11:27:38.101047832 +0000 UTC m=+1441.936976296" observedRunningTime="2025-12-03 11:27:39.297447998 +0000 UTC m=+1443.133376552" watchObservedRunningTime="2025-12-03 11:27:39.301894554 +0000 UTC m=+1443.137823018" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.533315 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9z8nk"] Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.573237 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-dlgdr"] Dec 03 11:27:39 crc kubenswrapper[4702]: E1203 11:27:39.573968 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a2e67f-2c70-4ac9-992f-b19afb498a1f" containerName="console" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.573999 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a2e67f-2c70-4ac9-992f-b19afb498a1f" containerName="console" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.574334 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="96a2e67f-2c70-4ac9-992f-b19afb498a1f" containerName="console" Dec 03 11:27:39 crc 
kubenswrapper[4702]: I1203 11:27:39.575996 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.578247 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.585341 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-dlgdr"] Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.640544 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-config\") pod \"dnsmasq-dns-86db49b7ff-dlgdr\" (UID: \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\") " pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.640606 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-dlgdr\" (UID: \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\") " pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.640644 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-dlgdr\" (UID: \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\") " pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.640971 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-dlgdr\" (UID: \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\") " pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.641079 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knpqt\" (UniqueName: \"kubernetes.io/projected/3aafd673-c861-4aa3-bbe0-014cc1ad277a-kube-api-access-knpqt\") pod \"dnsmasq-dns-86db49b7ff-dlgdr\" (UID: \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\") " pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.703374 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.743354 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-dlgdr\" (UID: \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\") " pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.743405 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-config\") pod \"dnsmasq-dns-86db49b7ff-dlgdr\" (UID: \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\") " pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.743485 4702 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-dlgdr\" (UID: \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\") " pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.743595 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-dlgdr\" (UID: \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\") " pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.743672 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knpqt\" (UniqueName: \"kubernetes.io/projected/3aafd673-c861-4aa3-bbe0-014cc1ad277a-kube-api-access-knpqt\") pod \"dnsmasq-dns-86db49b7ff-dlgdr\" (UID: \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\") " pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.744440 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-config\") pod \"dnsmasq-dns-86db49b7ff-dlgdr\" (UID: \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\") " pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.744642 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-dlgdr\" (UID: \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\") " pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.744797 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-dlgdr\" (UID: \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\") " pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.745134 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-dlgdr\" (UID: \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\") " pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.763466 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knpqt\" (UniqueName: \"kubernetes.io/projected/3aafd673-c861-4aa3-bbe0-014cc1ad277a-kube-api-access-knpqt\") pod \"dnsmasq-dns-86db49b7ff-dlgdr\" (UID: \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\") " pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" Dec 03 11:27:39 crc kubenswrapper[4702]: I1203 11:27:39.912880 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" Dec 03 11:27:40 crc kubenswrapper[4702]: I1203 11:27:40.191256 4702 generic.go:334] "Generic (PLEG): container finished" podID="92266ac3-f0a6-4e68-9e88-9aa2900e1fe3" containerID="8affc06ec563ea3872435075db71a24285dd187ff576b93bcb48ab670b0e4a15" exitCode=0 Dec 03 11:27:40 crc kubenswrapper[4702]: I1203 11:27:40.191370 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3","Type":"ContainerDied","Data":"8affc06ec563ea3872435075db71a24285dd187ff576b93bcb48ab670b0e4a15"} Dec 03 11:27:40 crc kubenswrapper[4702]: I1203 11:27:40.194086 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"85f53e1b-50d1-4249-ba44-5b2e5982ae36","Type":"ContainerStarted","Data":"d163d9269091867cae17f00eb5509c72849aa1f770ab1a3c24d29e4611f08a54"} Dec 03 11:27:41 crc kubenswrapper[4702]: I1203 11:27:40.901948 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-dlgdr"] Dec 03 11:27:41 crc kubenswrapper[4702]: W1203 11:27:40.915018 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3aafd673_c861_4aa3_bbe0_014cc1ad277a.slice/crio-217c89c1ad4efe057b880e4689279fa2bd76ceebdd6ee4cabf642040a7193ee3 WatchSource:0}: Error finding container 217c89c1ad4efe057b880e4689279fa2bd76ceebdd6ee4cabf642040a7193ee3: Status 404 returned error can't find the container with id 217c89c1ad4efe057b880e4689279fa2bd76ceebdd6ee4cabf642040a7193ee3 Dec 03 11:27:41 crc kubenswrapper[4702]: I1203 11:27:41.287351 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3","Type":"ContainerStarted","Data":"c7ed9f1ef289cc5aacad80a6cda5674b5b2d84db5935b0d3f1326bc5cd93e425"} Dec 03 11:27:41 crc kubenswrapper[4702]: I1203 11:27:41.294857 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c","Type":"ContainerStarted","Data":"eebb24be9f7e8f7283478723e52c16d5a5d11bc79614ec5ce8495c886d468846"} Dec 03 11:27:41 crc kubenswrapper[4702]: I1203 11:27:41.299448 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9z8nk" event={"ID":"68d6c68f-856a-48b5-8ca3-e1165b430d65","Type":"ContainerStarted","Data":"bfe21e625b6a102304f9f3f2340b9b862eac37264bd044429e9d53e03f964bdc"} Dec 03 11:27:41 crc kubenswrapper[4702]: I1203 11:27:41.299685 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-9z8nk" podUID="68d6c68f-856a-48b5-8ca3-e1165b430d65" containerName="dnsmasq-dns" containerID="cri-o://bfe21e625b6a102304f9f3f2340b9b862eac37264bd044429e9d53e03f964bdc" gracePeriod=10 Dec 03 11:27:41 crc kubenswrapper[4702]: I1203 11:27:41.300033 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-9z8nk" Dec 03 11:27:41 crc kubenswrapper[4702]: I1203 11:27:41.306945 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5v7lw" event={"ID":"e77b1727-1835-42aa-a4f6-d902ff001d20","Type":"ContainerStarted","Data":"bf752d18db3eeebb7f0b200dd0dd1fd213fabdec3b2a824b1d0f18c174c07160"} Dec 03 11:27:41 crc kubenswrapper[4702]: I1203 11:27:41.307984 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/ovn-controller-5v7lw" Dec 03 11:27:41 crc kubenswrapper[4702]: I1203 11:27:41.315808 4702 generic.go:334] "Generic (PLEG): container finished" podID="c91e1dc8-ef80-407f-ac34-4c9ab29026f7" containerID="f61b588e9a863fe95560ecb766779f12b517ef053b82188e823fef6a9863871f" exitCode=0 Dec 03 11:27:41 crc kubenswrapper[4702]: I1203 11:27:41.315893 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c91e1dc8-ef80-407f-ac34-4c9ab29026f7","Type":"ContainerDied","Data":"f61b588e9a863fe95560ecb766779f12b517ef053b82188e823fef6a9863871f"} Dec 03 11:27:41 crc kubenswrapper[4702]: I1203 11:27:41.327508 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" event={"ID":"3aafd673-c861-4aa3-bbe0-014cc1ad277a","Type":"ContainerStarted","Data":"217c89c1ad4efe057b880e4689279fa2bd76ceebdd6ee4cabf642040a7193ee3"} Dec 03 11:27:41 crc kubenswrapper[4702]: I1203 11:27:41.340599 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=9.294577079 podStartE2EDuration="1m2.340579159s" podCreationTimestamp="2025-12-03 11:26:39 +0000 UTC" firstStartedPulling="2025-12-03 11:26:42.492405717 +0000 UTC m=+1386.328334181" lastFinishedPulling="2025-12-03 11:27:35.538407797 +0000 UTC m=+1439.374336261" observedRunningTime="2025-12-03 11:27:41.338477789 +0000 UTC m=+1445.174406253" watchObservedRunningTime="2025-12-03 11:27:41.340579159 +0000 UTC m=+1445.176507623" Dec 03 11:27:41 crc kubenswrapper[4702]: I1203 11:27:41.420574 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-5v7lw" podStartSLOduration=3.591852821 podStartE2EDuration="51.420550374s" podCreationTimestamp="2025-12-03 11:26:50 +0000 UTC" firstStartedPulling="2025-12-03 11:26:52.65482898 +0000 UTC m=+1396.490757444" lastFinishedPulling="2025-12-03 11:27:40.483526533 +0000 UTC m=+1444.319454997" observedRunningTime="2025-12-03 11:27:41.415608814 +0000 UTC m=+1445.251537278" watchObservedRunningTime="2025-12-03 11:27:41.420550374 +0000 UTC m=+1445.256478838" Dec 03 11:27:41 crc kubenswrapper[4702]: I1203 11:27:41.487600 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-9z8nk" podStartSLOduration=-9223371972.367203 podStartE2EDuration="1m4.487572802s" podCreationTimestamp="2025-12-03 11:26:37 +0000 UTC" firstStartedPulling="2025-12-03 11:26:38.788306631 +0000 UTC m=+1382.624235095" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:27:41.477279841 +0000 UTC m=+1445.313208315" watchObservedRunningTime="2025-12-03 11:27:41.487572802 +0000 UTC m=+1445.323501266" Dec 03 11:27:41 crc kubenswrapper[4702]: I1203 11:27:41.702867 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 03 11:27:41 crc kubenswrapper[4702]: I1203 11:27:41.833080 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9z8nk" Dec 03 11:27:41 crc kubenswrapper[4702]: I1203 11:27:41.901231 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s2fp\" (UniqueName: \"kubernetes.io/projected/68d6c68f-856a-48b5-8ca3-e1165b430d65-kube-api-access-7s2fp\") pod \"68d6c68f-856a-48b5-8ca3-e1165b430d65\" (UID: \"68d6c68f-856a-48b5-8ca3-e1165b430d65\") " Dec 03 11:27:41 crc kubenswrapper[4702]: I1203 11:27:41.901358 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68d6c68f-856a-48b5-8ca3-e1165b430d65-dns-svc\") pod \"68d6c68f-856a-48b5-8ca3-e1165b430d65\" (UID: \"68d6c68f-856a-48b5-8ca3-e1165b430d65\") " Dec 03 11:27:41 crc kubenswrapper[4702]: I1203 11:27:41.901426 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68d6c68f-856a-48b5-8ca3-e1165b430d65-config\") pod \"68d6c68f-856a-48b5-8ca3-e1165b430d65\" (UID: \"68d6c68f-856a-48b5-8ca3-e1165b430d65\") " Dec 03 11:27:41 crc kubenswrapper[4702]: I1203 11:27:41.908883 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d6c68f-856a-48b5-8ca3-e1165b430d65-kube-api-access-7s2fp" (OuterVolumeSpecName: "kube-api-access-7s2fp") pod "68d6c68f-856a-48b5-8ca3-e1165b430d65" (UID: "68d6c68f-856a-48b5-8ca3-e1165b430d65"). InnerVolumeSpecName "kube-api-access-7s2fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:27:41 crc kubenswrapper[4702]: I1203 11:27:41.953480 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68d6c68f-856a-48b5-8ca3-e1165b430d65-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "68d6c68f-856a-48b5-8ca3-e1165b430d65" (UID: "68d6c68f-856a-48b5-8ca3-e1165b430d65"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:27:41 crc kubenswrapper[4702]: I1203 11:27:41.954675 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68d6c68f-856a-48b5-8ca3-e1165b430d65-config" (OuterVolumeSpecName: "config") pod "68d6c68f-856a-48b5-8ca3-e1165b430d65" (UID: "68d6c68f-856a-48b5-8ca3-e1165b430d65"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.004079 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s2fp\" (UniqueName: \"kubernetes.io/projected/68d6c68f-856a-48b5-8ca3-e1165b430d65-kube-api-access-7s2fp\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.004115 4702 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/68d6c68f-856a-48b5-8ca3-e1165b430d65-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.004125 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68d6c68f-856a-48b5-8ca3-e1165b430d65-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.030277 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.073653 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.080257 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 03 11:27:42 crc kubenswrapper[4702]: E1203 11:27:42.280634 4702 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.176:40612->38.102.83.176:40897: read tcp 38.102.83.176:40612->38.102.83.176:40897: read: connection reset by peer Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.339914 4702 generic.go:334] "Generic (PLEG): container finished" podID="3aafd673-c861-4aa3-bbe0-014cc1ad277a" containerID="1dd88d67bb78af213143e51a11468ed825f0b88f155271f58874dad7cc53a243" exitCode=0 Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.339974 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" event={"ID":"3aafd673-c861-4aa3-bbe0-014cc1ad277a","Type":"ContainerDied","Data":"1dd88d67bb78af213143e51a11468ed825f0b88f155271f58874dad7cc53a243"} Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.342783 4702 generic.go:334] "Generic (PLEG): container finished" podID="68d6c68f-856a-48b5-8ca3-e1165b430d65" containerID="bfe21e625b6a102304f9f3f2340b9b862eac37264bd044429e9d53e03f964bdc" exitCode=0 Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.342857 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9z8nk" event={"ID":"68d6c68f-856a-48b5-8ca3-e1165b430d65","Type":"ContainerDied","Data":"bfe21e625b6a102304f9f3f2340b9b862eac37264bd044429e9d53e03f964bdc"} Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.342887 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-9z8nk" event={"ID":"68d6c68f-856a-48b5-8ca3-e1165b430d65","Type":"ContainerDied","Data":"ecfb4b6a7b0852b29e7110061d2061f5b53711f6bab565a025802becbf6ec20d"} Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.342910 4702 scope.go:117] "RemoveContainer" containerID="bfe21e625b6a102304f9f3f2340b9b862eac37264bd044429e9d53e03f964bdc" Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.342981 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-9z8nk" Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.357148 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c91e1dc8-ef80-407f-ac34-4c9ab29026f7","Type":"ContainerStarted","Data":"9d8dd452e7ea9b777cedcc2818d1bb8e2ae923333f5e8d79d15da1edf56fb290"} Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.358317 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.404917 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371974.449883 podStartE2EDuration="1m2.404893564s" podCreationTimestamp="2025-12-03 11:26:40 +0000 UTC" firstStartedPulling="2025-12-03 11:26:42.831642116 +0000 UTC m=+1386.667570580" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:27:42.393716737 +0000 UTC m=+1446.229645201" watchObservedRunningTime="2025-12-03 11:27:42.404893564 +0000 UTC m=+1446.240822028" Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.496711 4702 scope.go:117] "RemoveContainer" containerID="c9eb393bbe9f6464638c208a660ae2aa932e62e4b08a0a2e16dc8f5d80f52869" Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.529213 4702 scope.go:117] "RemoveContainer" containerID="bfe21e625b6a102304f9f3f2340b9b862eac37264bd044429e9d53e03f964bdc" Dec 03 11:27:42 crc kubenswrapper[4702]: E1203 11:27:42.529690 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfe21e625b6a102304f9f3f2340b9b862eac37264bd044429e9d53e03f964bdc\": container with ID starting with bfe21e625b6a102304f9f3f2340b9b862eac37264bd044429e9d53e03f964bdc not found: ID does not exist" containerID="bfe21e625b6a102304f9f3f2340b9b862eac37264bd044429e9d53e03f964bdc" Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.529785 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfe21e625b6a102304f9f3f2340b9b862eac37264bd044429e9d53e03f964bdc"} err="failed to get container status \"bfe21e625b6a102304f9f3f2340b9b862eac37264bd044429e9d53e03f964bdc\": rpc error: code = NotFound desc = could not find container \"bfe21e625b6a102304f9f3f2340b9b862eac37264bd044429e9d53e03f964bdc\": container with ID starting with bfe21e625b6a102304f9f3f2340b9b862eac37264bd044429e9d53e03f964bdc not found: ID does not exist" Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.529835 4702 scope.go:117] "RemoveContainer" containerID="c9eb393bbe9f6464638c208a660ae2aa932e62e4b08a0a2e16dc8f5d80f52869" Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.530277 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9z8nk"] Dec 03 11:27:42 crc kubenswrapper[4702]: E1203 11:27:42.530349 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9eb393bbe9f6464638c208a660ae2aa932e62e4b08a0a2e16dc8f5d80f52869\": container with ID starting with c9eb393bbe9f6464638c208a660ae2aa932e62e4b08a0a2e16dc8f5d80f52869 not found: ID does not exist" containerID="c9eb393bbe9f6464638c208a660ae2aa932e62e4b08a0a2e16dc8f5d80f52869" Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.530389 4702 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c9eb393bbe9f6464638c208a660ae2aa932e62e4b08a0a2e16dc8f5d80f52869"} err="failed to get container status \"c9eb393bbe9f6464638c208a660ae2aa932e62e4b08a0a2e16dc8f5d80f52869\": rpc error: code = NotFound desc = could not find container \"c9eb393bbe9f6464638c208a660ae2aa932e62e4b08a0a2e16dc8f5d80f52869\": container with ID starting with c9eb393bbe9f6464638c208a660ae2aa932e62e4b08a0a2e16dc8f5d80f52869 not found: ID does not exist" Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.540313 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-9z8nk"] Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.703969 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.800345 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 03 11:27:42 crc kubenswrapper[4702]: I1203 11:27:42.941492 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d6c68f-856a-48b5-8ca3-e1165b430d65" path="/var/lib/kubelet/pods/68d6c68f-856a-48b5-8ca3-e1165b430d65/volumes" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.084686 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.371084 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" event={"ID":"3aafd673-c861-4aa3-bbe0-014cc1ad277a","Type":"ContainerStarted","Data":"a5ee46b0f4b5b666f3de4933d8d80110ac0df8cbc458fccb7633c1e90340234a"} Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.397075 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" podStartSLOduration=4.397051776 podStartE2EDuration="4.397051776s" podCreationTimestamp="2025-12-03 11:27:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:27:43.394144084 +0000 UTC m=+1447.230072548" watchObservedRunningTime="2025-12-03 11:27:43.397051776 +0000 UTC m=+1447.232980240" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.419703 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.605269 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 03 11:27:43 crc kubenswrapper[4702]: E1203 11:27:43.605885 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d6c68f-856a-48b5-8ca3-e1165b430d65" containerName="init" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.605907 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d6c68f-856a-48b5-8ca3-e1165b430d65" containerName="init" Dec 03 11:27:43 crc kubenswrapper[4702]: E1203 11:27:43.605927 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d6c68f-856a-48b5-8ca3-e1165b430d65" containerName="dnsmasq-dns" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.605935 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d6c68f-856a-48b5-8ca3-e1165b430d65" containerName="dnsmasq-dns" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.606212 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="68d6c68f-856a-48b5-8ca3-e1165b430d65" 
containerName="dnsmasq-dns" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.607426 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.610483 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.610699 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.611720 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-g7hb5" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.620053 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.621145 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.745741 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9\") " pod="openstack/ovn-northd-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.745919 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9\") " pod="openstack/ovn-northd-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.746139 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr5zg\" (UniqueName: \"kubernetes.io/projected/6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9-kube-api-access-fr5zg\") pod \"ovn-northd-0\" (UID: \"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9\") " pod="openstack/ovn-northd-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.746179 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9\") " pod="openstack/ovn-northd-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.746217 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9-scripts\") pod \"ovn-northd-0\" (UID: \"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9\") " pod="openstack/ovn-northd-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.746238 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9-config\") pod \"ovn-northd-0\" (UID: \"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9\") " pod="openstack/ovn-northd-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.746365 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9\") " pod="openstack/ovn-northd-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.848481 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9\") " pod="openstack/ovn-northd-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.848550 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9\") " pod="openstack/ovn-northd-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.848620 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr5zg\" (UniqueName: \"kubernetes.io/projected/6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9-kube-api-access-fr5zg\") pod \"ovn-northd-0\" (UID: \"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9\") " pod="openstack/ovn-northd-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.848648 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9\") " pod="openstack/ovn-northd-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.848674 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9-scripts\") pod \"ovn-northd-0\" (UID: \"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9\") " pod="openstack/ovn-northd-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.848726 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9-config\") pod \"ovn-northd-0\" (UID: \"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9\") " pod="openstack/ovn-northd-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.848804 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9\") " pod="openstack/ovn-northd-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.849231 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9\") " pod="openstack/ovn-northd-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.849851 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9-config\") pod \"ovn-northd-0\" (UID: \"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9\") " pod="openstack/ovn-northd-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.849952 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9-scripts\") pod \"ovn-northd-0\" (UID: \"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9\") " pod="openstack/ovn-northd-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.856264 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9\") " pod="openstack/ovn-northd-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.856516 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9\") " pod="openstack/ovn-northd-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.860549 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9\") " pod="openstack/ovn-northd-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.875528 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr5zg\" (UniqueName: \"kubernetes.io/projected/6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9-kube-api-access-fr5zg\") pod \"ovn-northd-0\" (UID: \"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9\") " pod="openstack/ovn-northd-0" Dec 03 11:27:43 crc kubenswrapper[4702]: I1203 11:27:43.941651 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 11:27:44 crc kubenswrapper[4702]: I1203 11:27:44.380920 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" Dec 03 11:27:44 crc kubenswrapper[4702]: I1203 11:27:44.491449 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 11:27:44 crc kubenswrapper[4702]: I1203 11:27:44.857273 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-dlgdr"] Dec 03 11:27:44 crc kubenswrapper[4702]: I1203 11:27:44.916888 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-vsswz"] Dec 03 11:27:44 crc kubenswrapper[4702]: I1203 11:27:44.920064 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-vsswz" Dec 03 11:27:44 crc kubenswrapper[4702]: I1203 11:27:44.955807 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-vsswz"] Dec 03 11:27:44 crc kubenswrapper[4702]: I1203 11:27:44.974784 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-vsswz\" (UID: \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\") " pod="openstack/dnsmasq-dns-698758b865-vsswz" Dec 03 11:27:44 crc kubenswrapper[4702]: I1203 11:27:44.974916 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-dns-svc\") pod \"dnsmasq-dns-698758b865-vsswz\" (UID: \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\") " pod="openstack/dnsmasq-dns-698758b865-vsswz" Dec 03 11:27:44 crc kubenswrapper[4702]: I1203 11:27:44.976632 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-749hj\" (UniqueName: \"kubernetes.io/projected/07456a7d-9f55-45e0-a920-d6cf1d092c0c-kube-api-access-749hj\") pod \"dnsmasq-dns-698758b865-vsswz\" (UID: \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\") " pod="openstack/dnsmasq-dns-698758b865-vsswz" Dec 03 11:27:44 crc kubenswrapper[4702]: I1203 11:27:44.976729 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-config\") pod \"dnsmasq-dns-698758b865-vsswz\" (UID: \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\") " pod="openstack/dnsmasq-dns-698758b865-vsswz" Dec 03 11:27:44 crc kubenswrapper[4702]: I1203 11:27:44.976870 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-vsswz\" (UID: \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\") " pod="openstack/dnsmasq-dns-698758b865-vsswz" Dec 03 11:27:45 crc kubenswrapper[4702]: I1203 11:27:45.078858 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-749hj\" (UniqueName: \"kubernetes.io/projected/07456a7d-9f55-45e0-a920-d6cf1d092c0c-kube-api-access-749hj\") pod \"dnsmasq-dns-698758b865-vsswz\" (UID: \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\") " pod="openstack/dnsmasq-dns-698758b865-vsswz" Dec 03 11:27:45 crc kubenswrapper[4702]: I1203 11:27:45.079224 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-config\") pod \"dnsmasq-dns-698758b865-vsswz\" (UID: \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\") " pod="openstack/dnsmasq-dns-698758b865-vsswz" Dec 03 11:27:45 crc kubenswrapper[4702]: I1203 11:27:45.079301 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-vsswz\" (UID: \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\") " pod="openstack/dnsmasq-dns-698758b865-vsswz" Dec 03 11:27:45 crc kubenswrapper[4702]: I1203 11:27:45.079409 4702 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-vsswz\" (UID: \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\") " pod="openstack/dnsmasq-dns-698758b865-vsswz" Dec 03 11:27:45 crc kubenswrapper[4702]: I1203 11:27:45.079472 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-dns-svc\") pod \"dnsmasq-dns-698758b865-vsswz\" (UID: \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\") " pod="openstack/dnsmasq-dns-698758b865-vsswz" Dec 03 11:27:45 crc kubenswrapper[4702]: I1203 11:27:45.080557 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-vsswz\" (UID: \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\") " pod="openstack/dnsmasq-dns-698758b865-vsswz" Dec 03 11:27:45 crc kubenswrapper[4702]: I1203 11:27:45.080604 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-dns-svc\") pod \"dnsmasq-dns-698758b865-vsswz\" (UID: \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\") " pod="openstack/dnsmasq-dns-698758b865-vsswz" Dec 03 11:27:45 crc kubenswrapper[4702]: I1203 11:27:45.080689 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-vsswz\" (UID: \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\") " pod="openstack/dnsmasq-dns-698758b865-vsswz" Dec 03 11:27:45 crc kubenswrapper[4702]: I1203 11:27:45.080856 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-config\") pod \"dnsmasq-dns-698758b865-vsswz\" (UID: \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\") " pod="openstack/dnsmasq-dns-698758b865-vsswz" Dec 03 11:27:45 crc kubenswrapper[4702]: I1203 11:27:45.113731 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-749hj\" (UniqueName: \"kubernetes.io/projected/07456a7d-9f55-45e0-a920-d6cf1d092c0c-kube-api-access-749hj\") pod \"dnsmasq-dns-698758b865-vsswz\" (UID: \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\") " pod="openstack/dnsmasq-dns-698758b865-vsswz" Dec 03 11:27:45 crc kubenswrapper[4702]: I1203 11:27:45.260184 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-vsswz" Dec 03 11:27:45 crc kubenswrapper[4702]: I1203 11:27:45.391868 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9","Type":"ContainerStarted","Data":"bef77904a71971a52ac53c4e7bef7c7ae8bd1a290bbaa6e8d5d66898fd6a6e76"} Dec 03 11:27:45 crc kubenswrapper[4702]: I1203 11:27:45.967003 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-vsswz"] Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.039194 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.054954 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 03 11:27:46 crc kubenswrapper[4702]: W1203 11:27:46.056952 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07456a7d_9f55_45e0_a920_d6cf1d092c0c.slice/crio-d2cf77579eaaddc260551a4b925c52c83b25943033b74f2b37267d84fb3bb26c WatchSource:0}: Error finding container d2cf77579eaaddc260551a4b925c52c83b25943033b74f2b37267d84fb3bb26c: Status 404 returned error can't find the container with id d2cf77579eaaddc260551a4b925c52c83b25943033b74f2b37267d84fb3bb26c Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.057951 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.058251 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.058445 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-hz9vv" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.058616 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.078066 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.114368 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kkts\" (UniqueName: \"kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-kube-api-access-8kkts\") pod \"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") " pod="openstack/swift-storage-0" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.114509 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3892571c-86ff-4259-beaa-6033dcfda204-cache\") pod \"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") " pod="openstack/swift-storage-0" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.114670 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3892571c-86ff-4259-beaa-6033dcfda204-lock\") pod \"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") " pod="openstack/swift-storage-0" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.114709 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift\") pod \"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") " pod="openstack/swift-storage-0" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.114831 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") " pod="openstack/swift-storage-0" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.217251 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kkts\" (UniqueName: \"kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-kube-api-access-8kkts\") pod 
\"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") " pod="openstack/swift-storage-0" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.217975 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3892571c-86ff-4259-beaa-6033dcfda204-cache\") pod \"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") " pod="openstack/swift-storage-0" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.218717 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/3892571c-86ff-4259-beaa-6033dcfda204-cache\") pod \"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") " pod="openstack/swift-storage-0" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.219370 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3892571c-86ff-4259-beaa-6033dcfda204-lock\") pod \"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") " pod="openstack/swift-storage-0" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.219660 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/3892571c-86ff-4259-beaa-6033dcfda204-lock\") pod \"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") " pod="openstack/swift-storage-0" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.219726 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift\") pod \"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") " pod="openstack/swift-storage-0" Dec 03 11:27:46 crc kubenswrapper[4702]: E1203 11:27:46.219894 4702 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 11:27:46 crc kubenswrapper[4702]: E1203 11:27:46.219915 4702 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 11:27:46 crc kubenswrapper[4702]: E1203 11:27:46.219978 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift podName:3892571c-86ff-4259-beaa-6033dcfda204 nodeName:}" failed. No retries permitted until 2025-12-03 11:27:46.719952552 +0000 UTC m=+1450.555881006 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift") pod "swift-storage-0" (UID: "3892571c-86ff-4259-beaa-6033dcfda204") : configmap "swift-ring-files" not found Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.220255 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") " pod="openstack/swift-storage-0" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.220569 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/swift-storage-0" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.237829 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kkts\" (UniqueName: \"kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-kube-api-access-8kkts\") pod \"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") " pod="openstack/swift-storage-0" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.253796 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") " pod="openstack/swift-storage-0" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.406314 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-vsswz" event={"ID":"07456a7d-9f55-45e0-a920-d6cf1d092c0c","Type":"ContainerStarted","Data":"d2cf77579eaaddc260551a4b925c52c83b25943033b74f2b37267d84fb3bb26c"} Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.409798 4702 generic.go:334] "Generic (PLEG): container finished" podID="cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" containerID="0cdd14cd1c46fed0871c7e58f04bc827118bac9d8a9e97af12dfaff332296b46" exitCode=0 Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.409893 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9","Type":"ContainerDied","Data":"0cdd14cd1c46fed0871c7e58f04bc827118bac9d8a9e97af12dfaff332296b46"} Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.410073 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" podUID="3aafd673-c861-4aa3-bbe0-014cc1ad277a" containerName="dnsmasq-dns" containerID="cri-o://a5ee46b0f4b5b666f3de4933d8d80110ac0df8cbc458fccb7633c1e90340234a" gracePeriod=10 Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.701507 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-whljp"] Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.703621 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.710225 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.710614 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.710829 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.734619 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift\") pod \"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") " pod="openstack/swift-storage-0" Dec 03 11:27:46 crc kubenswrapper[4702]: E1203 11:27:46.734913 4702 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 11:27:46 crc kubenswrapper[4702]: E1203 11:27:46.734934 4702 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 11:27:46 crc kubenswrapper[4702]: E1203 11:27:46.734993 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift podName:3892571c-86ff-4259-beaa-6033dcfda204 nodeName:}" failed. No retries permitted until 2025-12-03 11:27:47.73497303 +0000 UTC m=+1451.570901494 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift") pod "swift-storage-0" (UID: "3892571c-86ff-4259-beaa-6033dcfda204") : configmap "swift-ring-files" not found Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.747309 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-whljp"] Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.836916 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcc463a3-c5e5-443e-98d1-306cc779e62e-scripts\") pod \"swift-ring-rebalance-whljp\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.836975 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcc463a3-c5e5-443e-98d1-306cc779e62e-combined-ca-bundle\") pod \"swift-ring-rebalance-whljp\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.837009 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dcc463a3-c5e5-443e-98d1-306cc779e62e-ring-data-devices\") pod \"swift-ring-rebalance-whljp\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.837094 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/dcc463a3-c5e5-443e-98d1-306cc779e62e-etc-swift\") pod \"swift-ring-rebalance-whljp\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.837132 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dcc463a3-c5e5-443e-98d1-306cc779e62e-swiftconf\") pod \"swift-ring-rebalance-whljp\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.837214 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck2qr\" (UniqueName: \"kubernetes.io/projected/dcc463a3-c5e5-443e-98d1-306cc779e62e-kube-api-access-ck2qr\") pod \"swift-ring-rebalance-whljp\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.837320 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dcc463a3-c5e5-443e-98d1-306cc779e62e-dispersionconf\") pod \"swift-ring-rebalance-whljp\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.939177 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dcc463a3-c5e5-443e-98d1-306cc779e62e-dispersionconf\") pod \"swift-ring-rebalance-whljp\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.939618 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcc463a3-c5e5-443e-98d1-306cc779e62e-scripts\") pod \"swift-ring-rebalance-whljp\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.939652 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcc463a3-c5e5-443e-98d1-306cc779e62e-combined-ca-bundle\") pod \"swift-ring-rebalance-whljp\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.939694 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dcc463a3-c5e5-443e-98d1-306cc779e62e-ring-data-devices\") pod \"swift-ring-rebalance-whljp\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.939792 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dcc463a3-c5e5-443e-98d1-306cc779e62e-etc-swift\") pod \"swift-ring-rebalance-whljp\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.939826 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/dcc463a3-c5e5-443e-98d1-306cc779e62e-swiftconf\") pod \"swift-ring-rebalance-whljp\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.939907 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck2qr\" (UniqueName: \"kubernetes.io/projected/dcc463a3-c5e5-443e-98d1-306cc779e62e-kube-api-access-ck2qr\") pod \"swift-ring-rebalance-whljp\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.941252 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcc463a3-c5e5-443e-98d1-306cc779e62e-scripts\") pod \"swift-ring-rebalance-whljp\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.947502 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dcc463a3-c5e5-443e-98d1-306cc779e62e-etc-swift\") pod \"swift-ring-rebalance-whljp\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.948153 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dcc463a3-c5e5-443e-98d1-306cc779e62e-ring-data-devices\") pod \"swift-ring-rebalance-whljp\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.955501 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcc463a3-c5e5-443e-98d1-306cc779e62e-combined-ca-bundle\") pod \"swift-ring-rebalance-whljp\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.956368 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dcc463a3-c5e5-443e-98d1-306cc779e62e-dispersionconf\") pod \"swift-ring-rebalance-whljp\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.963096 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck2qr\" (UniqueName: \"kubernetes.io/projected/dcc463a3-c5e5-443e-98d1-306cc779e62e-kube-api-access-ck2qr\") pod \"swift-ring-rebalance-whljp\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:46 crc kubenswrapper[4702]: I1203 11:27:46.970370 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dcc463a3-c5e5-443e-98d1-306cc779e62e-swiftconf\") pod \"swift-ring-rebalance-whljp\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.060344 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.350145 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.435411 4702 generic.go:334] "Generic (PLEG): container finished" podID="07456a7d-9f55-45e0-a920-d6cf1d092c0c" containerID="d8b5cfb6ce58911227d6d7a979a32b1c57ea50c992814dc47435900ca724b341" exitCode=0 Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.436162 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-vsswz" event={"ID":"07456a7d-9f55-45e0-a920-d6cf1d092c0c","Type":"ContainerDied","Data":"d8b5cfb6ce58911227d6d7a979a32b1c57ea50c992814dc47435900ca724b341"} Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.443268 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zqdp5" event={"ID":"163cc47f-d241-4b3e-bf62-07f49047de5d","Type":"ContainerStarted","Data":"1ae6d31c7aeed94f53dd7cd6e02f0a538f9be0b81f7787ac39c2728abbefc3e8"} Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.451691 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9","Type":"ContainerStarted","Data":"bde39fc99a021238f3a3a357718b1d7fd1af202cce22942a63df81a190a73e08"} Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.451773 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9","Type":"ContainerStarted","Data":"95c785befd846128dc84efe58e8b50215dbf5d7e298f228a2d566c89f10ec038"} Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.453072 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.457010 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knpqt\" (UniqueName: \"kubernetes.io/projected/3aafd673-c861-4aa3-bbe0-014cc1ad277a-kube-api-access-knpqt\") pod \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\" (UID: \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\") " Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.457076 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-dns-svc\") pod \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\" (UID: \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\") " Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.457204 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-ovsdbserver-nb\") pod \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\" (UID: \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\") " Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.457236 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-ovsdbserver-sb\") pod \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\" (UID: \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\") " Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.457257 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-config\") pod \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\" (UID: \"3aafd673-c861-4aa3-bbe0-014cc1ad277a\") " Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.459830 4702 generic.go:334] "Generic 
(PLEG): container finished" podID="3aafd673-c861-4aa3-bbe0-014cc1ad277a" containerID="a5ee46b0f4b5b666f3de4933d8d80110ac0df8cbc458fccb7633c1e90340234a" exitCode=0 Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.459894 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" event={"ID":"3aafd673-c861-4aa3-bbe0-014cc1ad277a","Type":"ContainerDied","Data":"a5ee46b0f4b5b666f3de4933d8d80110ac0df8cbc458fccb7633c1e90340234a"} Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.459929 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" event={"ID":"3aafd673-c861-4aa3-bbe0-014cc1ad277a","Type":"ContainerDied","Data":"217c89c1ad4efe057b880e4689279fa2bd76ceebdd6ee4cabf642040a7193ee3"} Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.459950 4702 scope.go:117] "RemoveContainer" containerID="a5ee46b0f4b5b666f3de4933d8d80110ac0df8cbc458fccb7633c1e90340234a" Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.460179 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-dlgdr" Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.463255 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3aafd673-c861-4aa3-bbe0-014cc1ad277a-kube-api-access-knpqt" (OuterVolumeSpecName: "kube-api-access-knpqt") pod "3aafd673-c861-4aa3-bbe0-014cc1ad277a" (UID: "3aafd673-c861-4aa3-bbe0-014cc1ad277a"). InnerVolumeSpecName "kube-api-access-knpqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.561510 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knpqt\" (UniqueName: \"kubernetes.io/projected/3aafd673-c861-4aa3-bbe0-014cc1ad277a-kube-api-access-knpqt\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.571554 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3aafd673-c861-4aa3-bbe0-014cc1ad277a" (UID: "3aafd673-c861-4aa3-bbe0-014cc1ad277a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.596484 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-config" (OuterVolumeSpecName: "config") pod "3aafd673-c861-4aa3-bbe0-014cc1ad277a" (UID: "3aafd673-c861-4aa3-bbe0-014cc1ad277a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.598171 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.969827568 podStartE2EDuration="4.598141748s" podCreationTimestamp="2025-12-03 11:27:43 +0000 UTC" firstStartedPulling="2025-12-03 11:27:44.487623626 +0000 UTC m=+1448.323552090" lastFinishedPulling="2025-12-03 11:27:46.115937806 +0000 UTC m=+1449.951866270" observedRunningTime="2025-12-03 11:27:47.518826612 +0000 UTC m=+1451.354755096" watchObservedRunningTime="2025-12-03 11:27:47.598141748 +0000 UTC m=+1451.434070202" Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.605506 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3aafd673-c861-4aa3-bbe0-014cc1ad277a" (UID: "3aafd673-c861-4aa3-bbe0-014cc1ad277a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.659988 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3aafd673-c861-4aa3-bbe0-014cc1ad277a" (UID: "3aafd673-c861-4aa3-bbe0-014cc1ad277a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.664178 4702 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.664216 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.664228 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.664237 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3aafd673-c861-4aa3-bbe0-014cc1ad277a-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.669423 4702 scope.go:117] "RemoveContainer" containerID="1dd88d67bb78af213143e51a11468ed825f0b88f155271f58874dad7cc53a243" Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.843172 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift\") pod \"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") " pod="openstack/swift-storage-0" Dec 03 11:27:47 crc kubenswrapper[4702]: E1203 11:27:47.844737 4702 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 11:27:47 crc kubenswrapper[4702]: E1203 11:27:47.844850 4702 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 11:27:47 crc kubenswrapper[4702]: E1203 11:27:47.844920 
4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift podName:3892571c-86ff-4259-beaa-6033dcfda204 nodeName:}" failed. No retries permitted until 2025-12-03 11:27:49.844896938 +0000 UTC m=+1453.680825402 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift") pod "swift-storage-0" (UID: "3892571c-86ff-4259-beaa-6033dcfda204") : configmap "swift-ring-files" not found Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.853559 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-whljp"] Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.928848 4702 scope.go:117] "RemoveContainer" containerID="a5ee46b0f4b5b666f3de4933d8d80110ac0df8cbc458fccb7633c1e90340234a" Dec 03 11:27:47 crc kubenswrapper[4702]: E1203 11:27:47.931009 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5ee46b0f4b5b666f3de4933d8d80110ac0df8cbc458fccb7633c1e90340234a\": container with ID starting with a5ee46b0f4b5b666f3de4933d8d80110ac0df8cbc458fccb7633c1e90340234a not found: ID does not exist" containerID="a5ee46b0f4b5b666f3de4933d8d80110ac0df8cbc458fccb7633c1e90340234a" Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.931075 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5ee46b0f4b5b666f3de4933d8d80110ac0df8cbc458fccb7633c1e90340234a"} err="failed to get container status \"a5ee46b0f4b5b666f3de4933d8d80110ac0df8cbc458fccb7633c1e90340234a\": rpc error: code = NotFound desc = could not find container \"a5ee46b0f4b5b666f3de4933d8d80110ac0df8cbc458fccb7633c1e90340234a\": container with ID starting with a5ee46b0f4b5b666f3de4933d8d80110ac0df8cbc458fccb7633c1e90340234a not found: ID does not exist" Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.931114 4702 scope.go:117] "RemoveContainer" containerID="1dd88d67bb78af213143e51a11468ed825f0b88f155271f58874dad7cc53a243" Dec 03 11:27:47 crc kubenswrapper[4702]: E1203 11:27:47.932793 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dd88d67bb78af213143e51a11468ed825f0b88f155271f58874dad7cc53a243\": container with ID starting with 1dd88d67bb78af213143e51a11468ed825f0b88f155271f58874dad7cc53a243 not found: ID does not exist" containerID="1dd88d67bb78af213143e51a11468ed825f0b88f155271f58874dad7cc53a243" Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.932837 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dd88d67bb78af213143e51a11468ed825f0b88f155271f58874dad7cc53a243"} err="failed to get container status \"1dd88d67bb78af213143e51a11468ed825f0b88f155271f58874dad7cc53a243\": rpc error: code = NotFound desc = could not find container \"1dd88d67bb78af213143e51a11468ed825f0b88f155271f58874dad7cc53a243\": container with ID starting with 1dd88d67bb78af213143e51a11468ed825f0b88f155271f58874dad7cc53a243 not found: ID does not exist" Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.941364 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-dlgdr"] Dec 03 11:27:47 crc kubenswrapper[4702]: I1203 11:27:47.964375 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-dlgdr"] Dec 03 11:27:48 crc 
kubenswrapper[4702]: E1203 11:27:48.210589 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3aafd673_c861_4aa3_bbe0_014cc1ad277a.slice\": RecentStats: unable to find data in memory cache]" Dec 03 11:27:48 crc kubenswrapper[4702]: I1203 11:27:48.476306 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-whljp" event={"ID":"dcc463a3-c5e5-443e-98d1-306cc779e62e","Type":"ContainerStarted","Data":"f5dcae6599828509458bbbd7430ca1237c1eea00f5de6d4d06631aaaf5c6b2af"} Dec 03 11:27:48 crc kubenswrapper[4702]: I1203 11:27:48.480326 4702 generic.go:334] "Generic (PLEG): container finished" podID="163cc47f-d241-4b3e-bf62-07f49047de5d" containerID="1ae6d31c7aeed94f53dd7cd6e02f0a538f9be0b81f7787ac39c2728abbefc3e8" exitCode=0 Dec 03 11:27:48 crc kubenswrapper[4702]: I1203 11:27:48.480570 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zqdp5" event={"ID":"163cc47f-d241-4b3e-bf62-07f49047de5d","Type":"ContainerDied","Data":"1ae6d31c7aeed94f53dd7cd6e02f0a538f9be0b81f7787ac39c2728abbefc3e8"} Dec 03 11:27:48 crc kubenswrapper[4702]: I1203 11:27:48.480746 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zqdp5" event={"ID":"163cc47f-d241-4b3e-bf62-07f49047de5d","Type":"ContainerStarted","Data":"fe79f81287bb08d8f8b7b5288c698cc3b531a5d718ea6d60dad6ce21a30ea1db"} Dec 03 11:27:48 crc kubenswrapper[4702]: I1203 11:27:48.489980 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-vsswz" event={"ID":"07456a7d-9f55-45e0-a920-d6cf1d092c0c","Type":"ContainerStarted","Data":"ad7cabe93a2e69ec27f0611b2889856e663aa31a9cd3777405c044cfaf455192"} Dec 03 11:27:48 crc kubenswrapper[4702]: I1203 11:27:48.490254 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-vsswz" Dec 03 11:27:48 crc kubenswrapper[4702]: I1203 11:27:48.497947 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d7e1497f-e194-429b-add6-ee8e886fed8b","Type":"ContainerStarted","Data":"9a23dba3e4129629423cd8076d76daa7f2d38ac156b44dd7bcb69f84f91ae987"} Dec 03 11:27:48 crc kubenswrapper[4702]: I1203 11:27:48.498868 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 03 11:27:48 crc kubenswrapper[4702]: I1203 11:27:48.524810 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-vsswz" podStartSLOduration=4.524786045 podStartE2EDuration="4.524786045s" podCreationTimestamp="2025-12-03 11:27:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:27:48.512162367 +0000 UTC m=+1452.348090831" watchObservedRunningTime="2025-12-03 11:27:48.524786045 +0000 UTC m=+1452.360714509" Dec 03 11:27:48 crc kubenswrapper[4702]: I1203 11:27:48.944821 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3aafd673-c861-4aa3-bbe0-014cc1ad277a" path="/var/lib/kubelet/pods/3aafd673-c861-4aa3-bbe0-014cc1ad277a/volumes" Dec 03 11:27:49 crc kubenswrapper[4702]: I1203 11:27:49.735398 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-zqdp5" 
event={"ID":"163cc47f-d241-4b3e-bf62-07f49047de5d","Type":"ContainerStarted","Data":"145426bca38b3ae42c0b7df2add8fb2616e5f41af4e819e3b81376e35593ff6c"} Dec 03 11:27:49 crc kubenswrapper[4702]: I1203 11:27:49.736743 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-zqdp5" Dec 03 11:27:49 crc kubenswrapper[4702]: I1203 11:27:49.736877 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-zqdp5" Dec 03 11:27:49 crc kubenswrapper[4702]: I1203 11:27:49.823116 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-zqdp5" podStartSLOduration=6.416752593 podStartE2EDuration="59.823080918s" podCreationTimestamp="2025-12-03 11:26:50 +0000 UTC" firstStartedPulling="2025-12-03 11:26:52.65485242 +0000 UTC m=+1396.490780884" lastFinishedPulling="2025-12-03 11:27:46.061180745 +0000 UTC m=+1449.897109209" observedRunningTime="2025-12-03 11:27:49.811919022 +0000 UTC m=+1453.647847506" watchObservedRunningTime="2025-12-03 11:27:49.823080918 +0000 UTC m=+1453.659009392" Dec 03 11:27:49 crc kubenswrapper[4702]: I1203 11:27:49.834713 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=4.36838669 podStartE2EDuration="1m5.834685397s" podCreationTimestamp="2025-12-03 11:26:44 +0000 UTC" firstStartedPulling="2025-12-03 11:26:45.95667493 +0000 UTC m=+1389.792603394" lastFinishedPulling="2025-12-03 11:27:47.422973637 +0000 UTC m=+1451.258902101" observedRunningTime="2025-12-03 11:27:48.549910037 +0000 UTC m=+1452.385838521" watchObservedRunningTime="2025-12-03 11:27:49.834685397 +0000 UTC m=+1453.670613861" Dec 03 11:27:49 crc kubenswrapper[4702]: I1203 11:27:49.897294 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift\") pod \"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") " pod="openstack/swift-storage-0" Dec 03 11:27:49 crc kubenswrapper[4702]: E1203 11:27:49.899089 4702 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 11:27:49 crc kubenswrapper[4702]: E1203 11:27:49.899117 4702 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 11:27:49 crc kubenswrapper[4702]: E1203 11:27:49.899546 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift podName:3892571c-86ff-4259-beaa-6033dcfda204 nodeName:}" failed. No retries permitted until 2025-12-03 11:27:53.899145942 +0000 UTC m=+1457.735074406 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift") pod "swift-storage-0" (UID: "3892571c-86ff-4259-beaa-6033dcfda204") : configmap "swift-ring-files" not found Dec 03 11:27:51 crc kubenswrapper[4702]: I1203 11:27:51.226120 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 03 11:27:51 crc kubenswrapper[4702]: I1203 11:27:51.226216 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 03 11:27:51 crc kubenswrapper[4702]: I1203 11:27:51.328804 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 03 11:27:51 crc kubenswrapper[4702]: I1203 11:27:51.782031 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 03 11:27:51 crc kubenswrapper[4702]: I1203 11:27:51.782680 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 03 11:27:51 crc kubenswrapper[4702]: I1203 11:27:51.857726 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 03 11:27:51 crc kubenswrapper[4702]: I1203 11:27:51.918637 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.360704 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-c7c0-account-create-update-qf8t8"] Dec 03 11:27:52 crc kubenswrapper[4702]: E1203 11:27:52.361206 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aafd673-c861-4aa3-bbe0-014cc1ad277a" containerName="dnsmasq-dns" Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.361222 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aafd673-c861-4aa3-bbe0-014cc1ad277a" containerName="dnsmasq-dns" Dec 03 11:27:52 crc kubenswrapper[4702]: E1203 11:27:52.361237 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3aafd673-c861-4aa3-bbe0-014cc1ad277a" containerName="init" Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.361243 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="3aafd673-c861-4aa3-bbe0-014cc1ad277a" containerName="init" Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.361434 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="3aafd673-c861-4aa3-bbe0-014cc1ad277a" containerName="dnsmasq-dns" Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.362273 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c7c0-account-create-update-qf8t8" Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.369203 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.376228 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-f62wv"] Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.377818 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-f62wv" Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.399339 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c7c0-account-create-update-qf8t8"] Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.407884 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-f62wv"] Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.465932 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jljpn\" (UniqueName: \"kubernetes.io/projected/04333214-a420-40c6-bcd4-0d50544955ec-kube-api-access-jljpn\") pod \"glance-c7c0-account-create-update-qf8t8\" (UID: \"04333214-a420-40c6-bcd4-0d50544955ec\") " pod="openstack/glance-c7c0-account-create-update-qf8t8" Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.466009 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04333214-a420-40c6-bcd4-0d50544955ec-operator-scripts\") pod \"glance-c7c0-account-create-update-qf8t8\" (UID: \"04333214-a420-40c6-bcd4-0d50544955ec\") " pod="openstack/glance-c7c0-account-create-update-qf8t8" Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.466076 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt2vn\" (UniqueName: \"kubernetes.io/projected/9e688d41-fb86-42a2-ae40-57d585b44357-kube-api-access-pt2vn\") pod \"glance-db-create-f62wv\" (UID: \"9e688d41-fb86-42a2-ae40-57d585b44357\") " pod="openstack/glance-db-create-f62wv" Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.466157 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e688d41-fb86-42a2-ae40-57d585b44357-operator-scripts\") pod \"glance-db-create-f62wv\" (UID: \"9e688d41-fb86-42a2-ae40-57d585b44357\") " pod="openstack/glance-db-create-f62wv" Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.611235 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jljpn\" (UniqueName: \"kubernetes.io/projected/04333214-a420-40c6-bcd4-0d50544955ec-kube-api-access-jljpn\") pod \"glance-c7c0-account-create-update-qf8t8\" (UID: \"04333214-a420-40c6-bcd4-0d50544955ec\") " pod="openstack/glance-c7c0-account-create-update-qf8t8" Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.611299 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04333214-a420-40c6-bcd4-0d50544955ec-operator-scripts\") pod \"glance-c7c0-account-create-update-qf8t8\" (UID: \"04333214-a420-40c6-bcd4-0d50544955ec\") " pod="openstack/glance-c7c0-account-create-update-qf8t8" Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.611358 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt2vn\" (UniqueName: \"kubernetes.io/projected/9e688d41-fb86-42a2-ae40-57d585b44357-kube-api-access-pt2vn\") pod \"glance-db-create-f62wv\" (UID: \"9e688d41-fb86-42a2-ae40-57d585b44357\") " pod="openstack/glance-db-create-f62wv" Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.611419 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9e688d41-fb86-42a2-ae40-57d585b44357-operator-scripts\") pod \"glance-db-create-f62wv\" (UID: \"9e688d41-fb86-42a2-ae40-57d585b44357\") " pod="openstack/glance-db-create-f62wv" Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.612357 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e688d41-fb86-42a2-ae40-57d585b44357-operator-scripts\") pod \"glance-db-create-f62wv\" (UID: \"9e688d41-fb86-42a2-ae40-57d585b44357\") " pod="openstack/glance-db-create-f62wv" Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.612395 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04333214-a420-40c6-bcd4-0d50544955ec-operator-scripts\") pod \"glance-c7c0-account-create-update-qf8t8\" (UID: \"04333214-a420-40c6-bcd4-0d50544955ec\") " pod="openstack/glance-c7c0-account-create-update-qf8t8" Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.640978 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt2vn\" (UniqueName: \"kubernetes.io/projected/9e688d41-fb86-42a2-ae40-57d585b44357-kube-api-access-pt2vn\") pod \"glance-db-create-f62wv\" (UID: \"9e688d41-fb86-42a2-ae40-57d585b44357\") " pod="openstack/glance-db-create-f62wv" Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.644419 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jljpn\" (UniqueName: \"kubernetes.io/projected/04333214-a420-40c6-bcd4-0d50544955ec-kube-api-access-jljpn\") pod \"glance-c7c0-account-create-update-qf8t8\" (UID: \"04333214-a420-40c6-bcd4-0d50544955ec\") " pod="openstack/glance-c7c0-account-create-update-qf8t8" Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.686334 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c7c0-account-create-update-qf8t8" Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.702859 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-f62wv" Dec 03 11:27:52 crc kubenswrapper[4702]: I1203 11:27:52.882529 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 03 11:27:53 crc kubenswrapper[4702]: I1203 11:27:53.999028 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift\") pod \"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") " pod="openstack/swift-storage-0" Dec 03 11:27:54 crc kubenswrapper[4702]: E1203 11:27:53.999207 4702 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 11:27:54 crc kubenswrapper[4702]: E1203 11:27:53.999665 4702 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 11:27:54 crc kubenswrapper[4702]: E1203 11:27:53.999789 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift podName:3892571c-86ff-4259-beaa-6033dcfda204 nodeName:}" failed. No retries permitted until 2025-12-03 11:28:01.999720428 +0000 UTC m=+1465.835648892 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift") pod "swift-storage-0" (UID: "3892571c-86ff-4259-beaa-6033dcfda204") : configmap "swift-ring-files" not found Dec 03 11:27:54 crc kubenswrapper[4702]: I1203 11:27:54.321179 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-kg7cx"] Dec 03 11:27:54 crc kubenswrapper[4702]: I1203 11:27:54.322693 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-kg7cx" Dec 03 11:27:54 crc kubenswrapper[4702]: I1203 11:27:54.350829 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-kg7cx"] Dec 03 11:27:54 crc kubenswrapper[4702]: I1203 11:27:54.510618 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l9cg\" (UniqueName: \"kubernetes.io/projected/1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d-kube-api-access-7l9cg\") pod \"mysqld-exporter-openstack-db-create-kg7cx\" (UID: \"1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d\") " pod="openstack/mysqld-exporter-openstack-db-create-kg7cx" Dec 03 11:27:54 crc kubenswrapper[4702]: I1203 11:27:54.511718 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-kg7cx\" (UID: \"1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d\") " pod="openstack/mysqld-exporter-openstack-db-create-kg7cx" Dec 03 11:27:54 crc kubenswrapper[4702]: I1203 11:27:54.521914 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 03 11:27:54 crc kubenswrapper[4702]: I1203 11:27:54.621595 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-kg7cx\" (UID: \"1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d\") " pod="openstack/mysqld-exporter-openstack-db-create-kg7cx" Dec 03 11:27:54 crc kubenswrapper[4702]: I1203 11:27:54.621741 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l9cg\" (UniqueName: \"kubernetes.io/projected/1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d-kube-api-access-7l9cg\") pod \"mysqld-exporter-openstack-db-create-kg7cx\" (UID: \"1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d\") " pod="openstack/mysqld-exporter-openstack-db-create-kg7cx" Dec 03 11:27:54 crc kubenswrapper[4702]: I1203 11:27:54.622857 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-kg7cx\" (UID: \"1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d\") " pod="openstack/mysqld-exporter-openstack-db-create-kg7cx" Dec 03 11:27:54 crc kubenswrapper[4702]: I1203 11:27:54.665033 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l9cg\" (UniqueName: \"kubernetes.io/projected/1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d-kube-api-access-7l9cg\") pod \"mysqld-exporter-openstack-db-create-kg7cx\" (UID: \"1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d\") " pod="openstack/mysqld-exporter-openstack-db-create-kg7cx" Dec 03 11:27:54 
crc kubenswrapper[4702]: I1203 11:27:54.711131 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-e7ea-account-create-update-mb6qb"] Dec 03 11:27:54 crc kubenswrapper[4702]: I1203 11:27:54.712929 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-e7ea-account-create-update-mb6qb" Dec 03 11:27:54 crc kubenswrapper[4702]: I1203 11:27:54.715270 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Dec 03 11:27:54 crc kubenswrapper[4702]: I1203 11:27:54.724969 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cdc33be-f901-4659-a687-a547acd212c0-operator-scripts\") pod \"mysqld-exporter-e7ea-account-create-update-mb6qb\" (UID: \"6cdc33be-f901-4659-a687-a547acd212c0\") " pod="openstack/mysqld-exporter-e7ea-account-create-update-mb6qb" Dec 03 11:27:54 crc kubenswrapper[4702]: I1203 11:27:54.725020 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcnhj\" (UniqueName: \"kubernetes.io/projected/6cdc33be-f901-4659-a687-a547acd212c0-kube-api-access-pcnhj\") pod \"mysqld-exporter-e7ea-account-create-update-mb6qb\" (UID: \"6cdc33be-f901-4659-a687-a547acd212c0\") " pod="openstack/mysqld-exporter-e7ea-account-create-update-mb6qb" Dec 03 11:27:54 crc kubenswrapper[4702]: I1203 11:27:54.736341 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-e7ea-account-create-update-mb6qb"] Dec 03 11:27:54 crc kubenswrapper[4702]: I1203 11:27:54.829794 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cdc33be-f901-4659-a687-a547acd212c0-operator-scripts\") pod \"mysqld-exporter-e7ea-account-create-update-mb6qb\" (UID: \"6cdc33be-f901-4659-a687-a547acd212c0\") " pod="openstack/mysqld-exporter-e7ea-account-create-update-mb6qb" Dec 03 11:27:54 crc kubenswrapper[4702]: I1203 11:27:54.829857 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcnhj\" (UniqueName: \"kubernetes.io/projected/6cdc33be-f901-4659-a687-a547acd212c0-kube-api-access-pcnhj\") pod \"mysqld-exporter-e7ea-account-create-update-mb6qb\" (UID: \"6cdc33be-f901-4659-a687-a547acd212c0\") " pod="openstack/mysqld-exporter-e7ea-account-create-update-mb6qb" Dec 03 11:27:54 crc kubenswrapper[4702]: I1203 11:27:54.831021 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cdc33be-f901-4659-a687-a547acd212c0-operator-scripts\") pod \"mysqld-exporter-e7ea-account-create-update-mb6qb\" (UID: \"6cdc33be-f901-4659-a687-a547acd212c0\") " pod="openstack/mysqld-exporter-e7ea-account-create-update-mb6qb" Dec 03 11:27:54 crc kubenswrapper[4702]: I1203 11:27:54.850656 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcnhj\" (UniqueName: \"kubernetes.io/projected/6cdc33be-f901-4659-a687-a547acd212c0-kube-api-access-pcnhj\") pod \"mysqld-exporter-e7ea-account-create-update-mb6qb\" (UID: \"6cdc33be-f901-4659-a687-a547acd212c0\") " pod="openstack/mysqld-exporter-e7ea-account-create-update-mb6qb" Dec 03 11:27:54 crc kubenswrapper[4702]: I1203 11:27:54.951932 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-kg7cx" Dec 03 11:27:55 crc kubenswrapper[4702]: I1203 11:27:55.054724 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-e7ea-account-create-update-mb6qb" Dec 03 11:27:55 crc kubenswrapper[4702]: I1203 11:27:55.263976 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-vsswz" Dec 03 11:27:55 crc kubenswrapper[4702]: I1203 11:27:55.334136 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-4jh4g"] Dec 03 11:27:55 crc kubenswrapper[4702]: I1203 11:27:55.335374 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" podUID="83e06f10-c712-4280-8d28-e8b44ac0810c" containerName="dnsmasq-dns" containerID="cri-o://11bbef9daf6038bfee82c7bba48d9a26add9d4301b0815055b9da4b1a3a4c319" gracePeriod=10 Dec 03 11:27:55 crc kubenswrapper[4702]: I1203 11:27:55.880467 4702 generic.go:334] "Generic (PLEG): container finished" podID="83e06f10-c712-4280-8d28-e8b44ac0810c" containerID="11bbef9daf6038bfee82c7bba48d9a26add9d4301b0815055b9da4b1a3a4c319" exitCode=0 Dec 03 11:27:55 crc kubenswrapper[4702]: I1203 11:27:55.880508 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" event={"ID":"83e06f10-c712-4280-8d28-e8b44ac0810c","Type":"ContainerDied","Data":"11bbef9daf6038bfee82c7bba48d9a26add9d4301b0815055b9da4b1a3a4c319"} Dec 03 11:27:55 crc kubenswrapper[4702]: I1203 11:27:55.909267 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:27:55 crc kubenswrapper[4702]: I1203 11:27:55.909349 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:27:57 crc kubenswrapper[4702]: I1203 11:27:57.702642 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" podUID="83e06f10-c712-4280-8d28-e8b44ac0810c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.142:5353: connect: connection refused" Dec 03 11:27:59 crc kubenswrapper[4702]: I1203 11:27:59.010804 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 03 11:28:01 crc kubenswrapper[4702]: I1203 11:28:01.537523 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-k57x9"] Dec 03 11:28:01 crc kubenswrapper[4702]: I1203 11:28:01.539545 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-k57x9" Dec 03 11:28:01 crc kubenswrapper[4702]: I1203 11:28:01.552042 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-k57x9"] Dec 03 11:28:01 crc kubenswrapper[4702]: I1203 11:28:01.623518 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69bpg\" (UniqueName: \"kubernetes.io/projected/4302e064-fe40-4f25-aeb5-44b7e6449131-kube-api-access-69bpg\") pod \"keystone-db-create-k57x9\" (UID: \"4302e064-fe40-4f25-aeb5-44b7e6449131\") " pod="openstack/keystone-db-create-k57x9" Dec 03 11:28:01 crc kubenswrapper[4702]: I1203 11:28:01.623599 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4302e064-fe40-4f25-aeb5-44b7e6449131-operator-scripts\") pod \"keystone-db-create-k57x9\" (UID: \"4302e064-fe40-4f25-aeb5-44b7e6449131\") " pod="openstack/keystone-db-create-k57x9" Dec 03 11:28:01 crc kubenswrapper[4702]: I1203 11:28:01.725737 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69bpg\" (UniqueName: \"kubernetes.io/projected/4302e064-fe40-4f25-aeb5-44b7e6449131-kube-api-access-69bpg\") pod \"keystone-db-create-k57x9\" (UID: \"4302e064-fe40-4f25-aeb5-44b7e6449131\") " pod="openstack/keystone-db-create-k57x9" Dec 03 11:28:01 crc kubenswrapper[4702]: I1203 11:28:01.725836 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4302e064-fe40-4f25-aeb5-44b7e6449131-operator-scripts\") pod \"keystone-db-create-k57x9\" (UID: \"4302e064-fe40-4f25-aeb5-44b7e6449131\") " pod="openstack/keystone-db-create-k57x9" Dec 03 11:28:01 crc kubenswrapper[4702]: I1203 11:28:01.727151 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4302e064-fe40-4f25-aeb5-44b7e6449131-operator-scripts\") pod \"keystone-db-create-k57x9\" (UID: \"4302e064-fe40-4f25-aeb5-44b7e6449131\") " pod="openstack/keystone-db-create-k57x9" Dec 03 11:28:01 crc kubenswrapper[4702]: I1203 11:28:01.741138 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6217-account-create-update-tst4w"] Dec 03 11:28:01 crc kubenswrapper[4702]: I1203 11:28:01.746515 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6217-account-create-update-tst4w" Dec 03 11:28:01 crc kubenswrapper[4702]: I1203 11:28:01.750973 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 03 11:28:01 crc kubenswrapper[4702]: I1203 11:28:01.763354 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6217-account-create-update-tst4w"] Dec 03 11:28:01 crc kubenswrapper[4702]: I1203 11:28:01.769362 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69bpg\" (UniqueName: \"kubernetes.io/projected/4302e064-fe40-4f25-aeb5-44b7e6449131-kube-api-access-69bpg\") pod \"keystone-db-create-k57x9\" (UID: \"4302e064-fe40-4f25-aeb5-44b7e6449131\") " pod="openstack/keystone-db-create-k57x9" Dec 03 11:28:01 crc kubenswrapper[4702]: I1203 11:28:01.862563 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-k57x9" Dec 03 11:28:01 crc kubenswrapper[4702]: I1203 11:28:01.924676 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-fm24s"] Dec 03 11:28:01 crc kubenswrapper[4702]: I1203 11:28:01.926366 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-fm24s" Dec 03 11:28:01 crc kubenswrapper[4702]: I1203 11:28:01.934509 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c1760e7-8de5-48fc-af90-9e8dedf53a3c-operator-scripts\") pod \"keystone-6217-account-create-update-tst4w\" (UID: \"8c1760e7-8de5-48fc-af90-9e8dedf53a3c\") " pod="openstack/keystone-6217-account-create-update-tst4w" Dec 03 11:28:01 crc kubenswrapper[4702]: I1203 11:28:01.934823 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ptdq\" (UniqueName: \"kubernetes.io/projected/8c1760e7-8de5-48fc-af90-9e8dedf53a3c-kube-api-access-2ptdq\") pod \"keystone-6217-account-create-update-tst4w\" (UID: \"8c1760e7-8de5-48fc-af90-9e8dedf53a3c\") " pod="openstack/keystone-6217-account-create-update-tst4w" Dec 03 11:28:01 crc kubenswrapper[4702]: I1203 11:28:01.938339 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-fm24s"] Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.033520 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d28a-account-create-update-25ttw"] Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.035651 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d28a-account-create-update-25ttw" Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.037025 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift\") pod \"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") " pod="openstack/swift-storage-0" Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.037121 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/910db445-4971-456d-8d38-099ce65627ba-operator-scripts\") pod \"placement-db-create-fm24s\" (UID: \"910db445-4971-456d-8d38-099ce65627ba\") " pod="openstack/placement-db-create-fm24s" Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.037244 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ptdq\" (UniqueName: \"kubernetes.io/projected/8c1760e7-8de5-48fc-af90-9e8dedf53a3c-kube-api-access-2ptdq\") pod \"keystone-6217-account-create-update-tst4w\" (UID: \"8c1760e7-8de5-48fc-af90-9e8dedf53a3c\") " pod="openstack/keystone-6217-account-create-update-tst4w" Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.037323 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c1760e7-8de5-48fc-af90-9e8dedf53a3c-operator-scripts\") pod \"keystone-6217-account-create-update-tst4w\" (UID: \"8c1760e7-8de5-48fc-af90-9e8dedf53a3c\") " pod="openstack/keystone-6217-account-create-update-tst4w" Dec 03 11:28:02 crc kubenswrapper[4702]: E1203 11:28:02.037331 4702 projected.go:288] Couldn't get configMap 
openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 11:28:02 crc kubenswrapper[4702]: E1203 11:28:02.037373 4702 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.037398 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7txw\" (UniqueName: \"kubernetes.io/projected/910db445-4971-456d-8d38-099ce65627ba-kube-api-access-t7txw\") pod \"placement-db-create-fm24s\" (UID: \"910db445-4971-456d-8d38-099ce65627ba\") " pod="openstack/placement-db-create-fm24s" Dec 03 11:28:02 crc kubenswrapper[4702]: E1203 11:28:02.037441 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift podName:3892571c-86ff-4259-beaa-6033dcfda204 nodeName:}" failed. No retries permitted until 2025-12-03 11:28:18.037417169 +0000 UTC m=+1481.873345633 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift") pod "swift-storage-0" (UID: "3892571c-86ff-4259-beaa-6033dcfda204") : configmap "swift-ring-files" not found Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.038358 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c1760e7-8de5-48fc-af90-9e8dedf53a3c-operator-scripts\") pod \"keystone-6217-account-create-update-tst4w\" (UID: \"8c1760e7-8de5-48fc-af90-9e8dedf53a3c\") " pod="openstack/keystone-6217-account-create-update-tst4w" Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.040824 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.066286 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d28a-account-create-update-25ttw"] Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.077452 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ptdq\" (UniqueName: \"kubernetes.io/projected/8c1760e7-8de5-48fc-af90-9e8dedf53a3c-kube-api-access-2ptdq\") pod \"keystone-6217-account-create-update-tst4w\" (UID: \"8c1760e7-8de5-48fc-af90-9e8dedf53a3c\") " pod="openstack/keystone-6217-account-create-update-tst4w" Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.135011 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6217-account-create-update-tst4w" Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.140677 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/910db445-4971-456d-8d38-099ce65627ba-operator-scripts\") pod \"placement-db-create-fm24s\" (UID: \"910db445-4971-456d-8d38-099ce65627ba\") " pod="openstack/placement-db-create-fm24s" Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.141082 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpx7k\" (UniqueName: \"kubernetes.io/projected/b7d72b70-ab4b-44db-a489-5daf18efbf68-kube-api-access-wpx7k\") pod \"placement-d28a-account-create-update-25ttw\" (UID: \"b7d72b70-ab4b-44db-a489-5daf18efbf68\") " pod="openstack/placement-d28a-account-create-update-25ttw" Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.141175 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7txw\" (UniqueName: \"kubernetes.io/projected/910db445-4971-456d-8d38-099ce65627ba-kube-api-access-t7txw\") pod \"placement-db-create-fm24s\" (UID: \"910db445-4971-456d-8d38-099ce65627ba\") " pod="openstack/placement-db-create-fm24s" Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.141229 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7d72b70-ab4b-44db-a489-5daf18efbf68-operator-scripts\") pod \"placement-d28a-account-create-update-25ttw\" (UID: \"b7d72b70-ab4b-44db-a489-5daf18efbf68\") " pod="openstack/placement-d28a-account-create-update-25ttw" Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.141639 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/910db445-4971-456d-8d38-099ce65627ba-operator-scripts\") pod \"placement-db-create-fm24s\" (UID: \"910db445-4971-456d-8d38-099ce65627ba\") " pod="openstack/placement-db-create-fm24s" Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.159591 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7txw\" (UniqueName: \"kubernetes.io/projected/910db445-4971-456d-8d38-099ce65627ba-kube-api-access-t7txw\") pod \"placement-db-create-fm24s\" (UID: \"910db445-4971-456d-8d38-099ce65627ba\") " pod="openstack/placement-db-create-fm24s" Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.243451 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpx7k\" (UniqueName: \"kubernetes.io/projected/b7d72b70-ab4b-44db-a489-5daf18efbf68-kube-api-access-wpx7k\") pod \"placement-d28a-account-create-update-25ttw\" (UID: \"b7d72b70-ab4b-44db-a489-5daf18efbf68\") " pod="openstack/placement-d28a-account-create-update-25ttw" Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.243521 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7d72b70-ab4b-44db-a489-5daf18efbf68-operator-scripts\") pod \"placement-d28a-account-create-update-25ttw\" (UID: \"b7d72b70-ab4b-44db-a489-5daf18efbf68\") " pod="openstack/placement-d28a-account-create-update-25ttw" Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.244431 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b7d72b70-ab4b-44db-a489-5daf18efbf68-operator-scripts\") pod \"placement-d28a-account-create-update-25ttw\" (UID: \"b7d72b70-ab4b-44db-a489-5daf18efbf68\") " pod="openstack/placement-d28a-account-create-update-25ttw" Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.253498 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-fm24s" Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.261915 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpx7k\" (UniqueName: \"kubernetes.io/projected/b7d72b70-ab4b-44db-a489-5daf18efbf68-kube-api-access-wpx7k\") pod \"placement-d28a-account-create-update-25ttw\" (UID: \"b7d72b70-ab4b-44db-a489-5daf18efbf68\") " pod="openstack/placement-d28a-account-create-update-25ttw" Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.357581 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d28a-account-create-update-25ttw" Dec 03 11:28:02 crc kubenswrapper[4702]: I1203 11:28:02.702683 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" podUID="83e06f10-c712-4280-8d28-e8b44ac0810c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.142:5353: connect: connection refused" Dec 03 11:28:07 crc kubenswrapper[4702]: I1203 11:28:07.703312 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" podUID="83e06f10-c712-4280-8d28-e8b44ac0810c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.142:5353: connect: connection refused" Dec 03 11:28:07 crc kubenswrapper[4702]: I1203 11:28:07.703797 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" Dec 03 11:28:09 crc kubenswrapper[4702]: E1203 11:28:09.162636 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b" Dec 03 11:28:09 crc kubenswrapper[4702]: E1203 11:28:09.163162 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus,Image:registry.redhat.io/cluster-observability-operator/prometheus-rhel9@sha256:17ea20be390a94ab39f5cdd7f0cbc2498046eebcf77fe3dec9aa288d5c2cf46b,Command:[],Args:[--config.file=/etc/prometheus/config_out/prometheus.env.yaml --web.enable-lifecycle --web.route-prefix=/ --storage.tsdb.retention.time=24h --storage.tsdb.path=/prometheus 
--web.config.file=/etc/prometheus/web_config/web-config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:web,HostPort:0,ContainerPort:9090,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-out,ReadOnly:true,MountPath:/etc/prometheus/config_out,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tls-assets,ReadOnly:true,MountPath:/etc/prometheus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-db,ReadOnly:false,MountPath:/prometheus,SubPath:prometheus-db,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:prometheus-metric-storage-rulefiles-0,ReadOnly:false,MountPath:/etc/prometheus/rules/prometheus-metric-storage-rulefiles-0,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:web-config,ReadOnly:true,MountPath:/etc/prometheus/web_config/web-config.yaml,SubPath:web-config.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-prkgf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/healthy,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/-/ready,Port:{1 0 web},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:3,PeriodSeconds:15,SuccessThreshold:1,FailureThreshold:60,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod prometheus-metric-storage-0_openstack(cf7bd44e-b3b4-4812-b7ee-512fb948d8f9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 11:28:10 crc kubenswrapper[4702]: I1203 11:28:10.134226 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" Dec 03 11:28:10 crc kubenswrapper[4702]: I1203 11:28:10.334476 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e06f10-c712-4280-8d28-e8b44ac0810c-config\") pod \"83e06f10-c712-4280-8d28-e8b44ac0810c\" (UID: \"83e06f10-c712-4280-8d28-e8b44ac0810c\") " Dec 03 11:28:10 crc kubenswrapper[4702]: I1203 11:28:10.334984 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m86xq\" (UniqueName: \"kubernetes.io/projected/83e06f10-c712-4280-8d28-e8b44ac0810c-kube-api-access-m86xq\") pod \"83e06f10-c712-4280-8d28-e8b44ac0810c\" (UID: \"83e06f10-c712-4280-8d28-e8b44ac0810c\") " Dec 03 11:28:10 crc kubenswrapper[4702]: I1203 11:28:10.335132 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83e06f10-c712-4280-8d28-e8b44ac0810c-dns-svc\") pod \"83e06f10-c712-4280-8d28-e8b44ac0810c\" (UID: \"83e06f10-c712-4280-8d28-e8b44ac0810c\") " Dec 03 11:28:10 crc kubenswrapper[4702]: I1203 11:28:10.335245 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83e06f10-c712-4280-8d28-e8b44ac0810c-ovsdbserver-nb\") pod \"83e06f10-c712-4280-8d28-e8b44ac0810c\" (UID: \"83e06f10-c712-4280-8d28-e8b44ac0810c\") " Dec 03 11:28:10 crc kubenswrapper[4702]: I1203 11:28:10.343109 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83e06f10-c712-4280-8d28-e8b44ac0810c-kube-api-access-m86xq" (OuterVolumeSpecName: "kube-api-access-m86xq") pod "83e06f10-c712-4280-8d28-e8b44ac0810c" (UID: "83e06f10-c712-4280-8d28-e8b44ac0810c"). InnerVolumeSpecName "kube-api-access-m86xq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:10 crc kubenswrapper[4702]: W1203 11:28:10.380296 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cdc33be_f901_4659_a687_a547acd212c0.slice/crio-6f6aa722f51e7563a99ceb3dd51bd087d003162376e5eae36118a5fe1b2d81ca WatchSource:0}: Error finding container 6f6aa722f51e7563a99ceb3dd51bd087d003162376e5eae36118a5fe1b2d81ca: Status 404 returned error can't find the container with id 6f6aa722f51e7563a99ceb3dd51bd087d003162376e5eae36118a5fe1b2d81ca Dec 03 11:28:10 crc kubenswrapper[4702]: I1203 11:28:10.384006 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-e7ea-account-create-update-mb6qb"] Dec 03 11:28:10 crc kubenswrapper[4702]: I1203 11:28:10.399862 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83e06f10-c712-4280-8d28-e8b44ac0810c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "83e06f10-c712-4280-8d28-e8b44ac0810c" (UID: "83e06f10-c712-4280-8d28-e8b44ac0810c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:10 crc kubenswrapper[4702]: I1203 11:28:10.408231 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83e06f10-c712-4280-8d28-e8b44ac0810c-config" (OuterVolumeSpecName: "config") pod "83e06f10-c712-4280-8d28-e8b44ac0810c" (UID: "83e06f10-c712-4280-8d28-e8b44ac0810c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:10 crc kubenswrapper[4702]: I1203 11:28:10.432648 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83e06f10-c712-4280-8d28-e8b44ac0810c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "83e06f10-c712-4280-8d28-e8b44ac0810c" (UID: "83e06f10-c712-4280-8d28-e8b44ac0810c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:10 crc kubenswrapper[4702]: I1203 11:28:10.441409 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e06f10-c712-4280-8d28-e8b44ac0810c-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:10 crc kubenswrapper[4702]: I1203 11:28:10.441443 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m86xq\" (UniqueName: \"kubernetes.io/projected/83e06f10-c712-4280-8d28-e8b44ac0810c-kube-api-access-m86xq\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:10 crc kubenswrapper[4702]: I1203 11:28:10.441455 4702 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83e06f10-c712-4280-8d28-e8b44ac0810c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:10 crc kubenswrapper[4702]: I1203 11:28:10.441463 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83e06f10-c712-4280-8d28-e8b44ac0810c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:10 crc kubenswrapper[4702]: I1203 11:28:10.538155 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-fm24s"] Dec 03 11:28:10 crc kubenswrapper[4702]: I1203 11:28:10.576030 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-c7c0-account-create-update-qf8t8"] Dec 03 11:28:10 crc kubenswrapper[4702]: I1203 11:28:10.836284 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-5v7lw" podUID="e77b1727-1835-42aa-a4f6-d902ff001d20" containerName="ovn-controller" probeResult="failure" output=< Dec 03 11:28:10 crc kubenswrapper[4702]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 03 11:28:10 crc kubenswrapper[4702]: > Dec 03 11:28:10 crc kubenswrapper[4702]: I1203 11:28:10.987625 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-k57x9"] Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.002824 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-f62wv"] Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.021146 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6217-account-create-update-tst4w"] Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.028481 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d28a-account-create-update-25ttw"] Dec 03 11:28:11 crc kubenswrapper[4702]: W1203 11:28:11.029367 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e7f42b9_f1fb_49b2_a5f9_ffab3fc3ae3d.slice/crio-1474b06476fbb73cb827ca00a0646b9dfb77eaf437506d5c63fbb332c4d8280b WatchSource:0}: Error finding container 1474b06476fbb73cb827ca00a0646b9dfb77eaf437506d5c63fbb332c4d8280b: Status 404 returned error can't find the container with id 1474b06476fbb73cb827ca00a0646b9dfb77eaf437506d5c63fbb332c4d8280b Dec 03 
11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.038492 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-kg7cx"] Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.101782 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c7c0-account-create-update-qf8t8" event={"ID":"04333214-a420-40c6-bcd4-0d50544955ec","Type":"ContainerStarted","Data":"d524df9e1fc3053d38f3ba2633d6aa33df90732620b04d416fc9589069f31f93"} Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.101849 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c7c0-account-create-update-qf8t8" event={"ID":"04333214-a420-40c6-bcd4-0d50544955ec","Type":"ContainerStarted","Data":"f9b99aaf77dbdbd1eb9c1365c7d43f4a61a5c45166fdb8358f78e0f77565d40d"} Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.116682 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-whljp" event={"ID":"dcc463a3-c5e5-443e-98d1-306cc779e62e","Type":"ContainerStarted","Data":"28224786c89d5db7db660f5c395137af53230e70a07c75b094ef15a9446b3f8a"} Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.121849 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-c7c0-account-create-update-qf8t8" podStartSLOduration=19.121825897 podStartE2EDuration="19.121825897s" podCreationTimestamp="2025-12-03 11:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:28:11.119976265 +0000 UTC m=+1474.955904729" watchObservedRunningTime="2025-12-03 11:28:11.121825897 +0000 UTC m=+1474.957754361" Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.125833 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-fm24s" event={"ID":"910db445-4971-456d-8d38-099ce65627ba","Type":"ContainerStarted","Data":"d200e8f2820fd0a42d99e646911887c1b26aaf9263be632e0d32ee7b56e5a041"} Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.125884 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-fm24s" event={"ID":"910db445-4971-456d-8d38-099ce65627ba","Type":"ContainerStarted","Data":"fe53cda21dfec02a089c494f41b0018b4b2dbc647e866d04e07a5e6f0d9d66ea"} Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.135541 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" event={"ID":"83e06f10-c712-4280-8d28-e8b44ac0810c","Type":"ContainerDied","Data":"ce621d67b7d1b231de17cfee583521bcd32f877ac1fe68b6a39512c80cef8be3"} Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.135611 4702 scope.go:117] "RemoveContainer" containerID="11bbef9daf6038bfee82c7bba48d9a26add9d4301b0815055b9da4b1a3a4c319" Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.135844 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-4jh4g" Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.146799 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6217-account-create-update-tst4w" event={"ID":"8c1760e7-8de5-48fc-af90-9e8dedf53a3c","Type":"ContainerStarted","Data":"57464916f58e1bc7cb43dc2922d31dcbe54d693fa5e19648fb96f8963ff4b020"} Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.147055 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-whljp" podStartSLOduration=3.320587144 podStartE2EDuration="25.147033791s" podCreationTimestamp="2025-12-03 11:27:46 +0000 UTC" firstStartedPulling="2025-12-03 11:27:47.865969574 +0000 UTC m=+1451.701898038" lastFinishedPulling="2025-12-03 11:28:09.692416221 +0000 UTC m=+1473.528344685" observedRunningTime="2025-12-03 11:28:11.142556035 +0000 UTC m=+1474.978484499" watchObservedRunningTime="2025-12-03 11:28:11.147033791 +0000 UTC m=+1474.982962255" Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.149415 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-e7ea-account-create-update-mb6qb" event={"ID":"6cdc33be-f901-4659-a687-a547acd212c0","Type":"ContainerStarted","Data":"00be2d9c2d5b19e1b0781c7d0b2359e2733ea150a4e65bfad6ea6f85acf5c3d0"} Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.149449 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-e7ea-account-create-update-mb6qb" event={"ID":"6cdc33be-f901-4659-a687-a547acd212c0","Type":"ContainerStarted","Data":"6f6aa722f51e7563a99ceb3dd51bd087d003162376e5eae36118a5fe1b2d81ca"} Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.151905 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-k57x9" event={"ID":"4302e064-fe40-4f25-aeb5-44b7e6449131","Type":"ContainerStarted","Data":"7c5c14aada330397467a36538548aa1529eb7d1f67f1a597bcc3372b105361c5"} Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.153486 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-f62wv" event={"ID":"9e688d41-fb86-42a2-ae40-57d585b44357","Type":"ContainerStarted","Data":"e7dfceda79839b9fa4398961125cb0c9846c102cf8b38a4be799879ea904717e"} Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.161933 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d28a-account-create-update-25ttw" event={"ID":"b7d72b70-ab4b-44db-a489-5daf18efbf68","Type":"ContainerStarted","Data":"91718785eb96600b8afbeed54035f25e502c541ac601556fcd9e0bd06116e330"} Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.168442 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-kg7cx" event={"ID":"1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d","Type":"ContainerStarted","Data":"1474b06476fbb73cb827ca00a0646b9dfb77eaf437506d5c63fbb332c4d8280b"} Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.188083 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-fm24s" podStartSLOduration=10.188050633 podStartE2EDuration="10.188050633s" podCreationTimestamp="2025-12-03 11:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:28:11.161734908 +0000 UTC m=+1474.997663382" watchObservedRunningTime="2025-12-03 11:28:11.188050633 +0000 UTC m=+1475.023979097" Dec 03 11:28:11 
crc kubenswrapper[4702]: I1203 11:28:11.190492 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-e7ea-account-create-update-mb6qb" podStartSLOduration=17.190463852 podStartE2EDuration="17.190463852s" podCreationTimestamp="2025-12-03 11:27:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:28:11.181485077 +0000 UTC m=+1475.017413541" watchObservedRunningTime="2025-12-03 11:28:11.190463852 +0000 UTC m=+1475.026392316" Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.204582 4702 scope.go:117] "RemoveContainer" containerID="3bd4b710d4994e2a6d1881641c937942c1d2a9d9b52a68a171ff6690528475a6" Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.227493 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-4jh4g"] Dec 03 11:28:11 crc kubenswrapper[4702]: I1203 11:28:11.245573 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-4jh4g"] Dec 03 11:28:12 crc kubenswrapper[4702]: I1203 11:28:12.187131 4702 generic.go:334] "Generic (PLEG): container finished" podID="4302e064-fe40-4f25-aeb5-44b7e6449131" containerID="1b8a6d044dac9d7e14b452174d79dc85aceba5ff4df34d6f9693c9c16ee6f437" exitCode=0 Dec 03 11:28:12 crc kubenswrapper[4702]: I1203 11:28:12.187247 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-k57x9" event={"ID":"4302e064-fe40-4f25-aeb5-44b7e6449131","Type":"ContainerDied","Data":"1b8a6d044dac9d7e14b452174d79dc85aceba5ff4df34d6f9693c9c16ee6f437"} Dec 03 11:28:12 crc kubenswrapper[4702]: I1203 11:28:12.195498 4702 generic.go:334] "Generic (PLEG): container finished" podID="9e688d41-fb86-42a2-ae40-57d585b44357" containerID="5f305242b6a0f1936c26115468933c1744dc34f1280701658a7afd5a519f77e1" exitCode=0 Dec 03 11:28:12 crc kubenswrapper[4702]: I1203 11:28:12.195574 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-f62wv" event={"ID":"9e688d41-fb86-42a2-ae40-57d585b44357","Type":"ContainerDied","Data":"5f305242b6a0f1936c26115468933c1744dc34f1280701658a7afd5a519f77e1"} Dec 03 11:28:12 crc kubenswrapper[4702]: I1203 11:28:12.197286 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d28a-account-create-update-25ttw" event={"ID":"b7d72b70-ab4b-44db-a489-5daf18efbf68","Type":"ContainerStarted","Data":"c6e3a0909696a8b53bdedf99ff8321d41401e621eba80823eac63dd8fffdcd19"} Dec 03 11:28:12 crc kubenswrapper[4702]: I1203 11:28:12.202343 4702 generic.go:334] "Generic (PLEG): container finished" podID="910db445-4971-456d-8d38-099ce65627ba" containerID="d200e8f2820fd0a42d99e646911887c1b26aaf9263be632e0d32ee7b56e5a041" exitCode=0 Dec 03 11:28:12 crc kubenswrapper[4702]: I1203 11:28:12.202586 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-fm24s" event={"ID":"910db445-4971-456d-8d38-099ce65627ba","Type":"ContainerDied","Data":"d200e8f2820fd0a42d99e646911887c1b26aaf9263be632e0d32ee7b56e5a041"} Dec 03 11:28:12 crc kubenswrapper[4702]: I1203 11:28:12.204524 4702 generic.go:334] "Generic (PLEG): container finished" podID="1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d" containerID="11ac08548549e0f609d14ea3b47fc17f9628c005ad96b2cdef45524263932043" exitCode=0 Dec 03 11:28:12 crc kubenswrapper[4702]: I1203 11:28:12.204621 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-kg7cx" 
event={"ID":"1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d","Type":"ContainerDied","Data":"11ac08548549e0f609d14ea3b47fc17f9628c005ad96b2cdef45524263932043"} Dec 03 11:28:12 crc kubenswrapper[4702]: I1203 11:28:12.212468 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6217-account-create-update-tst4w" event={"ID":"8c1760e7-8de5-48fc-af90-9e8dedf53a3c","Type":"ContainerStarted","Data":"38cd2cfc39731f342fb3349689ca00d5bc30d60097d0344fb6d532517716b8a9"} Dec 03 11:28:12 crc kubenswrapper[4702]: I1203 11:28:12.224497 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-d28a-account-create-update-25ttw" podStartSLOduration=10.22447705 podStartE2EDuration="10.22447705s" podCreationTimestamp="2025-12-03 11:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:28:12.219320664 +0000 UTC m=+1476.055249138" watchObservedRunningTime="2025-12-03 11:28:12.22447705 +0000 UTC m=+1476.060405514" Dec 03 11:28:12 crc kubenswrapper[4702]: I1203 11:28:12.308458 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6217-account-create-update-tst4w" podStartSLOduration=11.308428128 podStartE2EDuration="11.308428128s" podCreationTimestamp="2025-12-03 11:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:28:12.305600237 +0000 UTC m=+1476.141528701" watchObservedRunningTime="2025-12-03 11:28:12.308428128 +0000 UTC m=+1476.144356592" Dec 03 11:28:12 crc kubenswrapper[4702]: I1203 11:28:12.943063 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83e06f10-c712-4280-8d28-e8b44ac0810c" path="/var/lib/kubelet/pods/83e06f10-c712-4280-8d28-e8b44ac0810c/volumes" Dec 03 11:28:13 crc kubenswrapper[4702]: I1203 11:28:13.223130 4702 generic.go:334] "Generic (PLEG): container finished" podID="6cdc33be-f901-4659-a687-a547acd212c0" containerID="00be2d9c2d5b19e1b0781c7d0b2359e2733ea150a4e65bfad6ea6f85acf5c3d0" exitCode=0 Dec 03 11:28:13 crc kubenswrapper[4702]: I1203 11:28:13.223182 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-e7ea-account-create-update-mb6qb" event={"ID":"6cdc33be-f901-4659-a687-a547acd212c0","Type":"ContainerDied","Data":"00be2d9c2d5b19e1b0781c7d0b2359e2733ea150a4e65bfad6ea6f85acf5c3d0"} Dec 03 11:28:13 crc kubenswrapper[4702]: I1203 11:28:13.225802 4702 generic.go:334] "Generic (PLEG): container finished" podID="04333214-a420-40c6-bcd4-0d50544955ec" containerID="d524df9e1fc3053d38f3ba2633d6aa33df90732620b04d416fc9589069f31f93" exitCode=0 Dec 03 11:28:13 crc kubenswrapper[4702]: I1203 11:28:13.225873 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c7c0-account-create-update-qf8t8" event={"ID":"04333214-a420-40c6-bcd4-0d50544955ec","Type":"ContainerDied","Data":"d524df9e1fc3053d38f3ba2633d6aa33df90732620b04d416fc9589069f31f93"} Dec 03 11:28:13 crc kubenswrapper[4702]: I1203 11:28:13.229134 4702 generic.go:334] "Generic (PLEG): container finished" podID="b7d72b70-ab4b-44db-a489-5daf18efbf68" containerID="c6e3a0909696a8b53bdedf99ff8321d41401e621eba80823eac63dd8fffdcd19" exitCode=0 Dec 03 11:28:13 crc kubenswrapper[4702]: I1203 11:28:13.229219 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d28a-account-create-update-25ttw" 
event={"ID":"b7d72b70-ab4b-44db-a489-5daf18efbf68","Type":"ContainerDied","Data":"c6e3a0909696a8b53bdedf99ff8321d41401e621eba80823eac63dd8fffdcd19"} Dec 03 11:28:13 crc kubenswrapper[4702]: I1203 11:28:13.234480 4702 generic.go:334] "Generic (PLEG): container finished" podID="1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" containerID="eebb24be9f7e8f7283478723e52c16d5a5d11bc79614ec5ce8495c886d468846" exitCode=0 Dec 03 11:28:13 crc kubenswrapper[4702]: I1203 11:28:13.234582 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c","Type":"ContainerDied","Data":"eebb24be9f7e8f7283478723e52c16d5a5d11bc79614ec5ce8495c886d468846"} Dec 03 11:28:13 crc kubenswrapper[4702]: I1203 11:28:13.236582 4702 generic.go:334] "Generic (PLEG): container finished" podID="85f53e1b-50d1-4249-ba44-5b2e5982ae36" containerID="d163d9269091867cae17f00eb5509c72849aa1f770ab1a3c24d29e4611f08a54" exitCode=0 Dec 03 11:28:13 crc kubenswrapper[4702]: I1203 11:28:13.236673 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"85f53e1b-50d1-4249-ba44-5b2e5982ae36","Type":"ContainerDied","Data":"d163d9269091867cae17f00eb5509c72849aa1f770ab1a3c24d29e4611f08a54"} Dec 03 11:28:13 crc kubenswrapper[4702]: I1203 11:28:13.242259 4702 generic.go:334] "Generic (PLEG): container finished" podID="8c1760e7-8de5-48fc-af90-9e8dedf53a3c" containerID="38cd2cfc39731f342fb3349689ca00d5bc30d60097d0344fb6d532517716b8a9" exitCode=0 Dec 03 11:28:13 crc kubenswrapper[4702]: I1203 11:28:13.242571 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6217-account-create-update-tst4w" event={"ID":"8c1760e7-8de5-48fc-af90-9e8dedf53a3c","Type":"ContainerDied","Data":"38cd2cfc39731f342fb3349689ca00d5bc30d60097d0344fb6d532517716b8a9"} Dec 03 11:28:13 crc kubenswrapper[4702]: I1203 11:28:13.849164 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-f62wv" Dec 03 11:28:13 crc kubenswrapper[4702]: I1203 11:28:13.953261 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e688d41-fb86-42a2-ae40-57d585b44357-operator-scripts\") pod \"9e688d41-fb86-42a2-ae40-57d585b44357\" (UID: \"9e688d41-fb86-42a2-ae40-57d585b44357\") " Dec 03 11:28:13 crc kubenswrapper[4702]: I1203 11:28:13.953517 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt2vn\" (UniqueName: \"kubernetes.io/projected/9e688d41-fb86-42a2-ae40-57d585b44357-kube-api-access-pt2vn\") pod \"9e688d41-fb86-42a2-ae40-57d585b44357\" (UID: \"9e688d41-fb86-42a2-ae40-57d585b44357\") " Dec 03 11:28:13 crc kubenswrapper[4702]: I1203 11:28:13.955500 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e688d41-fb86-42a2-ae40-57d585b44357-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9e688d41-fb86-42a2-ae40-57d585b44357" (UID: "9e688d41-fb86-42a2-ae40-57d585b44357"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:13 crc kubenswrapper[4702]: I1203 11:28:13.964186 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e688d41-fb86-42a2-ae40-57d585b44357-kube-api-access-pt2vn" (OuterVolumeSpecName: "kube-api-access-pt2vn") pod "9e688d41-fb86-42a2-ae40-57d585b44357" (UID: "9e688d41-fb86-42a2-ae40-57d585b44357"). InnerVolumeSpecName "kube-api-access-pt2vn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.056526 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt2vn\" (UniqueName: \"kubernetes.io/projected/9e688d41-fb86-42a2-ae40-57d585b44357-kube-api-access-pt2vn\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.056580 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e688d41-fb86-42a2-ae40-57d585b44357-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.143309 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-kg7cx" Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.157716 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7l9cg\" (UniqueName: \"kubernetes.io/projected/1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d-kube-api-access-7l9cg\") pod \"1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d\" (UID: \"1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d\") " Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.158132 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d-operator-scripts\") pod \"1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d\" (UID: \"1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d\") " Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.158894 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d" (UID: "1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.159307 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-fm24s" Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.165558 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d-kube-api-access-7l9cg" (OuterVolumeSpecName: "kube-api-access-7l9cg") pod "1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d" (UID: "1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d"). InnerVolumeSpecName "kube-api-access-7l9cg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.188287 4702 util.go:48] "No ready sandbox for pod can be found. 
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.257654 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-k57x9" event={"ID":"4302e064-fe40-4f25-aeb5-44b7e6449131","Type":"ContainerDied","Data":"7c5c14aada330397467a36538548aa1529eb7d1f67f1a597bcc3372b105361c5"}
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.258038 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c5c14aada330397467a36538548aa1529eb7d1f67f1a597bcc3372b105361c5"
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.258397 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-k57x9"
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.260627 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-f62wv" event={"ID":"9e688d41-fb86-42a2-ae40-57d585b44357","Type":"ContainerDied","Data":"e7dfceda79839b9fa4398961125cb0c9846c102cf8b38a4be799879ea904717e"}
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.260648 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7dfceda79839b9fa4398961125cb0c9846c102cf8b38a4be799879ea904717e"
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.260687 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-f62wv"
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.266325 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.266419 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7l9cg\" (UniqueName: \"kubernetes.io/projected/1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d-kube-api-access-7l9cg\") on node \"crc\" DevicePath \"\""
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.281794 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9","Type":"ContainerStarted","Data":"61826dc676438e99394cad546f4a8ecb4ae28b3a7e111ba1949826bc217d5a6c"}
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.297400 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-fm24s" event={"ID":"910db445-4971-456d-8d38-099ce65627ba","Type":"ContainerDied","Data":"fe53cda21dfec02a089c494f41b0018b4b2dbc647e866d04e07a5e6f0d9d66ea"}
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.297451 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe53cda21dfec02a089c494f41b0018b4b2dbc647e866d04e07a5e6f0d9d66ea"
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.297532 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-fm24s"
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.300143 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-kg7cx" event={"ID":"1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d","Type":"ContainerDied","Data":"1474b06476fbb73cb827ca00a0646b9dfb77eaf437506d5c63fbb332c4d8280b"}
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.300177 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1474b06476fbb73cb827ca00a0646b9dfb77eaf437506d5c63fbb332c4d8280b"
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.300235 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-kg7cx"
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.312119 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c","Type":"ContainerStarted","Data":"8f3a164c405e3bcb963ecdafe18b9c7d08e2a67a14287fd0dd6a702abcc9f3d8"}
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.312425 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.317428 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"85f53e1b-50d1-4249-ba44-5b2e5982ae36","Type":"ContainerStarted","Data":"ede35561174b293853d63b2d4a6b0b2d7cbdbdf59506eb2c648bf64fa62ebb67"}
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.318092 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.362876 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.16612807 podStartE2EDuration="1m37.362844578s" podCreationTimestamp="2025-12-03 11:26:37 +0000 UTC" firstStartedPulling="2025-12-03 11:26:40.461508614 +0000 UTC m=+1384.297437088" lastFinishedPulling="2025-12-03 11:27:38.658225132 +0000 UTC m=+1442.494153596" observedRunningTime="2025-12-03 11:28:14.354115561 +0000 UTC m=+1478.190044045" watchObservedRunningTime="2025-12-03 11:28:14.362844578 +0000 UTC m=+1478.198773042"
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.369158 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7txw\" (UniqueName: \"kubernetes.io/projected/910db445-4971-456d-8d38-099ce65627ba-kube-api-access-t7txw\") pod \"910db445-4971-456d-8d38-099ce65627ba\" (UID: \"910db445-4971-456d-8d38-099ce65627ba\") "
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.369243 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4302e064-fe40-4f25-aeb5-44b7e6449131-operator-scripts\") pod \"4302e064-fe40-4f25-aeb5-44b7e6449131\" (UID: \"4302e064-fe40-4f25-aeb5-44b7e6449131\") "
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.369489 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69bpg\" (UniqueName: \"kubernetes.io/projected/4302e064-fe40-4f25-aeb5-44b7e6449131-kube-api-access-69bpg\") pod \"4302e064-fe40-4f25-aeb5-44b7e6449131\" (UID: \"4302e064-fe40-4f25-aeb5-44b7e6449131\") "
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.369551 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/910db445-4971-456d-8d38-099ce65627ba-operator-scripts\") pod \"910db445-4971-456d-8d38-099ce65627ba\" (UID: \"910db445-4971-456d-8d38-099ce65627ba\") "
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.375117 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/910db445-4971-456d-8d38-099ce65627ba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "910db445-4971-456d-8d38-099ce65627ba" (UID: "910db445-4971-456d-8d38-099ce65627ba"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.378008 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4302e064-fe40-4f25-aeb5-44b7e6449131-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4302e064-fe40-4f25-aeb5-44b7e6449131" (UID: "4302e064-fe40-4f25-aeb5-44b7e6449131"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.391067 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/910db445-4971-456d-8d38-099ce65627ba-kube-api-access-t7txw" (OuterVolumeSpecName: "kube-api-access-t7txw") pod "910db445-4971-456d-8d38-099ce65627ba" (UID: "910db445-4971-456d-8d38-099ce65627ba"). InnerVolumeSpecName "kube-api-access-t7txw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.403925 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4302e064-fe40-4f25-aeb5-44b7e6449131-kube-api-access-69bpg" (OuterVolumeSpecName: "kube-api-access-69bpg") pod "4302e064-fe40-4f25-aeb5-44b7e6449131" (UID: "4302e064-fe40-4f25-aeb5-44b7e6449131"). InnerVolumeSpecName "kube-api-access-69bpg". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.411121 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=39.315570169 podStartE2EDuration="1m37.411090613s" podCreationTimestamp="2025-12-03 11:26:37 +0000 UTC" firstStartedPulling="2025-12-03 11:26:40.005517738 +0000 UTC m=+1383.841446202" lastFinishedPulling="2025-12-03 11:27:38.101038182 +0000 UTC m=+1441.936966646" observedRunningTime="2025-12-03 11:28:14.391664583 +0000 UTC m=+1478.227593057" watchObservedRunningTime="2025-12-03 11:28:14.411090613 +0000 UTC m=+1478.247019077" Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.472724 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7txw\" (UniqueName: \"kubernetes.io/projected/910db445-4971-456d-8d38-099ce65627ba-kube-api-access-t7txw\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.472792 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4302e064-fe40-4f25-aeb5-44b7e6449131-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.472807 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69bpg\" (UniqueName: \"kubernetes.io/projected/4302e064-fe40-4f25-aeb5-44b7e6449131-kube-api-access-69bpg\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.472820 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/910db445-4971-456d-8d38-099ce65627ba-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.914420 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d28a-account-create-update-25ttw" Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.984037 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7d72b70-ab4b-44db-a489-5daf18efbf68-operator-scripts\") pod \"b7d72b70-ab4b-44db-a489-5daf18efbf68\" (UID: \"b7d72b70-ab4b-44db-a489-5daf18efbf68\") " Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.984271 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpx7k\" (UniqueName: \"kubernetes.io/projected/b7d72b70-ab4b-44db-a489-5daf18efbf68-kube-api-access-wpx7k\") pod \"b7d72b70-ab4b-44db-a489-5daf18efbf68\" (UID: \"b7d72b70-ab4b-44db-a489-5daf18efbf68\") " Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.985355 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7d72b70-ab4b-44db-a489-5daf18efbf68-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b7d72b70-ab4b-44db-a489-5daf18efbf68" (UID: "b7d72b70-ab4b-44db-a489-5daf18efbf68"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:14 crc kubenswrapper[4702]: I1203 11:28:14.992008 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7d72b70-ab4b-44db-a489-5daf18efbf68-kube-api-access-wpx7k" (OuterVolumeSpecName: "kube-api-access-wpx7k") pod "b7d72b70-ab4b-44db-a489-5daf18efbf68" (UID: "b7d72b70-ab4b-44db-a489-5daf18efbf68"). 
InnerVolumeSpecName "kube-api-access-wpx7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.063090 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c7c0-account-create-update-qf8t8" Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.068072 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-e7ea-account-create-update-mb6qb" Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.086946 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7d72b70-ab4b-44db-a489-5daf18efbf68-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.086990 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpx7k\" (UniqueName: \"kubernetes.io/projected/b7d72b70-ab4b-44db-a489-5daf18efbf68-kube-api-access-wpx7k\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.135437 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6217-account-create-update-tst4w" Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.188123 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jljpn\" (UniqueName: \"kubernetes.io/projected/04333214-a420-40c6-bcd4-0d50544955ec-kube-api-access-jljpn\") pod \"04333214-a420-40c6-bcd4-0d50544955ec\" (UID: \"04333214-a420-40c6-bcd4-0d50544955ec\") " Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.188258 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04333214-a420-40c6-bcd4-0d50544955ec-operator-scripts\") pod \"04333214-a420-40c6-bcd4-0d50544955ec\" (UID: \"04333214-a420-40c6-bcd4-0d50544955ec\") " Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.188417 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcnhj\" (UniqueName: \"kubernetes.io/projected/6cdc33be-f901-4659-a687-a547acd212c0-kube-api-access-pcnhj\") pod \"6cdc33be-f901-4659-a687-a547acd212c0\" (UID: \"6cdc33be-f901-4659-a687-a547acd212c0\") " Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.188467 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cdc33be-f901-4659-a687-a547acd212c0-operator-scripts\") pod \"6cdc33be-f901-4659-a687-a547acd212c0\" (UID: \"6cdc33be-f901-4659-a687-a547acd212c0\") " Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.189135 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04333214-a420-40c6-bcd4-0d50544955ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "04333214-a420-40c6-bcd4-0d50544955ec" (UID: "04333214-a420-40c6-bcd4-0d50544955ec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.189511 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cdc33be-f901-4659-a687-a547acd212c0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6cdc33be-f901-4659-a687-a547acd212c0" (UID: "6cdc33be-f901-4659-a687-a547acd212c0"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.209851 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04333214-a420-40c6-bcd4-0d50544955ec-kube-api-access-jljpn" (OuterVolumeSpecName: "kube-api-access-jljpn") pod "04333214-a420-40c6-bcd4-0d50544955ec" (UID: "04333214-a420-40c6-bcd4-0d50544955ec"). InnerVolumeSpecName "kube-api-access-jljpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.209984 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cdc33be-f901-4659-a687-a547acd212c0-kube-api-access-pcnhj" (OuterVolumeSpecName: "kube-api-access-pcnhj") pod "6cdc33be-f901-4659-a687-a547acd212c0" (UID: "6cdc33be-f901-4659-a687-a547acd212c0"). InnerVolumeSpecName "kube-api-access-pcnhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.290577 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ptdq\" (UniqueName: \"kubernetes.io/projected/8c1760e7-8de5-48fc-af90-9e8dedf53a3c-kube-api-access-2ptdq\") pod \"8c1760e7-8de5-48fc-af90-9e8dedf53a3c\" (UID: \"8c1760e7-8de5-48fc-af90-9e8dedf53a3c\") " Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.291559 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c1760e7-8de5-48fc-af90-9e8dedf53a3c-operator-scripts\") pod \"8c1760e7-8de5-48fc-af90-9e8dedf53a3c\" (UID: \"8c1760e7-8de5-48fc-af90-9e8dedf53a3c\") " Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.292206 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c1760e7-8de5-48fc-af90-9e8dedf53a3c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c1760e7-8de5-48fc-af90-9e8dedf53a3c" (UID: "8c1760e7-8de5-48fc-af90-9e8dedf53a3c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.292626 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcnhj\" (UniqueName: \"kubernetes.io/projected/6cdc33be-f901-4659-a687-a547acd212c0-kube-api-access-pcnhj\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.294719 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6cdc33be-f901-4659-a687-a547acd212c0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.295869 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jljpn\" (UniqueName: \"kubernetes.io/projected/04333214-a420-40c6-bcd4-0d50544955ec-kube-api-access-jljpn\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.295968 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04333214-a420-40c6-bcd4-0d50544955ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.301873 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c1760e7-8de5-48fc-af90-9e8dedf53a3c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.302108 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c1760e7-8de5-48fc-af90-9e8dedf53a3c-kube-api-access-2ptdq" (OuterVolumeSpecName: "kube-api-access-2ptdq") pod "8c1760e7-8de5-48fc-af90-9e8dedf53a3c" (UID: "8c1760e7-8de5-48fc-af90-9e8dedf53a3c"). InnerVolumeSpecName "kube-api-access-2ptdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.366575 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-c7c0-account-create-update-qf8t8" event={"ID":"04333214-a420-40c6-bcd4-0d50544955ec","Type":"ContainerDied","Data":"f9b99aaf77dbdbd1eb9c1365c7d43f4a61a5c45166fdb8358f78e0f77565d40d"} Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.366603 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-c7c0-account-create-update-qf8t8" Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.366618 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9b99aaf77dbdbd1eb9c1365c7d43f4a61a5c45166fdb8358f78e0f77565d40d" Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.368325 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d28a-account-create-update-25ttw" event={"ID":"b7d72b70-ab4b-44db-a489-5daf18efbf68","Type":"ContainerDied","Data":"91718785eb96600b8afbeed54035f25e502c541ac601556fcd9e0bd06116e330"} Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.368357 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91718785eb96600b8afbeed54035f25e502c541ac601556fcd9e0bd06116e330" Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.368415 4702 util.go:48] "No ready sandbox for pod can be found. 
Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.370635 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6217-account-create-update-tst4w" event={"ID":"8c1760e7-8de5-48fc-af90-9e8dedf53a3c","Type":"ContainerDied","Data":"57464916f58e1bc7cb43dc2922d31dcbe54d693fa5e19648fb96f8963ff4b020"}
Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.370658 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57464916f58e1bc7cb43dc2922d31dcbe54d693fa5e19648fb96f8963ff4b020"
Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.370702 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6217-account-create-update-tst4w"
Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.404513 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ptdq\" (UniqueName: \"kubernetes.io/projected/8c1760e7-8de5-48fc-af90-9e8dedf53a3c-kube-api-access-2ptdq\") on node \"crc\" DevicePath \"\""
Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.406857 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-e7ea-account-create-update-mb6qb"
Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.412586 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-e7ea-account-create-update-mb6qb" event={"ID":"6cdc33be-f901-4659-a687-a547acd212c0","Type":"ContainerDied","Data":"6f6aa722f51e7563a99ceb3dd51bd087d003162376e5eae36118a5fe1b2d81ca"}
Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.412625 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f6aa722f51e7563a99ceb3dd51bd087d003162376e5eae36118a5fe1b2d81ca"
Dec 03 11:28:15 crc kubenswrapper[4702]: I1203 11:28:15.811511 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-5v7lw" podUID="e77b1727-1835-42aa-a4f6-d902ff001d20" containerName="ovn-controller" probeResult="failure" output=<
Dec 03 11:28:15 crc kubenswrapper[4702]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Dec 03 11:28:15 crc kubenswrapper[4702]: >
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.520810 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-hq9rm"]
Dec 03 11:28:17 crc kubenswrapper[4702]: E1203 11:28:17.521601 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d72b70-ab4b-44db-a489-5daf18efbf68" containerName="mariadb-account-create-update"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.521617 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d72b70-ab4b-44db-a489-5daf18efbf68" containerName="mariadb-account-create-update"
Dec 03 11:28:17 crc kubenswrapper[4702]: E1203 11:28:17.521627 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4302e064-fe40-4f25-aeb5-44b7e6449131" containerName="mariadb-database-create"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.521633 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="4302e064-fe40-4f25-aeb5-44b7e6449131" containerName="mariadb-database-create"
Dec 03 11:28:17 crc kubenswrapper[4702]: E1203 11:28:17.521659 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d" containerName="mariadb-database-create"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.521666 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d" containerName="mariadb-database-create"
Dec 03 11:28:17 crc kubenswrapper[4702]: E1203 11:28:17.521683 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83e06f10-c712-4280-8d28-e8b44ac0810c" containerName="dnsmasq-dns"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.521688 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e06f10-c712-4280-8d28-e8b44ac0810c" containerName="dnsmasq-dns"
Dec 03 11:28:17 crc kubenswrapper[4702]: E1203 11:28:17.521700 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="910db445-4971-456d-8d38-099ce65627ba" containerName="mariadb-database-create"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.521706 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="910db445-4971-456d-8d38-099ce65627ba" containerName="mariadb-database-create"
Dec 03 11:28:17 crc kubenswrapper[4702]: E1203 11:28:17.521718 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e688d41-fb86-42a2-ae40-57d585b44357" containerName="mariadb-database-create"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.521724 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e688d41-fb86-42a2-ae40-57d585b44357" containerName="mariadb-database-create"
Dec 03 11:28:17 crc kubenswrapper[4702]: E1203 11:28:17.521736 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83e06f10-c712-4280-8d28-e8b44ac0810c" containerName="init"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.521742 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e06f10-c712-4280-8d28-e8b44ac0810c" containerName="init"
Dec 03 11:28:17 crc kubenswrapper[4702]: E1203 11:28:17.521770 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cdc33be-f901-4659-a687-a547acd212c0" containerName="mariadb-account-create-update"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.521777 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cdc33be-f901-4659-a687-a547acd212c0" containerName="mariadb-account-create-update"
Dec 03 11:28:17 crc kubenswrapper[4702]: E1203 11:28:17.521796 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c1760e7-8de5-48fc-af90-9e8dedf53a3c" containerName="mariadb-account-create-update"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.521804 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c1760e7-8de5-48fc-af90-9e8dedf53a3c" containerName="mariadb-account-create-update"
Dec 03 11:28:17 crc kubenswrapper[4702]: E1203 11:28:17.521815 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04333214-a420-40c6-bcd4-0d50544955ec" containerName="mariadb-account-create-update"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.521823 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="04333214-a420-40c6-bcd4-0d50544955ec" containerName="mariadb-account-create-update"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.522086 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="04333214-a420-40c6-bcd4-0d50544955ec" containerName="mariadb-account-create-update"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.522106 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="910db445-4971-456d-8d38-099ce65627ba" containerName="mariadb-database-create"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.522119 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c1760e7-8de5-48fc-af90-9e8dedf53a3c" containerName="mariadb-account-create-update"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.522130 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d72b70-ab4b-44db-a489-5daf18efbf68" containerName="mariadb-account-create-update"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.522141 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d" containerName="mariadb-database-create"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.522150 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="4302e064-fe40-4f25-aeb5-44b7e6449131" containerName="mariadb-database-create"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.522159 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="83e06f10-c712-4280-8d28-e8b44ac0810c" containerName="dnsmasq-dns"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.522171 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e688d41-fb86-42a2-ae40-57d585b44357" containerName="mariadb-database-create"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.522180 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cdc33be-f901-4659-a687-a547acd212c0" containerName="mariadb-account-create-update"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.522983 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hq9rm"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.525299 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wngfr"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.531678 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.542700 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hq9rm"]
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.653833 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-db-sync-config-data\") pod \"glance-db-sync-hq9rm\" (UID: \"34d869ae-eae9-4bf6-b05e-cf40504ccdb6\") " pod="openstack/glance-db-sync-hq9rm"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.654014 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-combined-ca-bundle\") pod \"glance-db-sync-hq9rm\" (UID: \"34d869ae-eae9-4bf6-b05e-cf40504ccdb6\") " pod="openstack/glance-db-sync-hq9rm"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.654109 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhj4j\" (UniqueName: \"kubernetes.io/projected/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-kube-api-access-lhj4j\") pod \"glance-db-sync-hq9rm\" (UID: \"34d869ae-eae9-4bf6-b05e-cf40504ccdb6\") " pod="openstack/glance-db-sync-hq9rm"
Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.654303 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-config-data\") pod \"glance-db-sync-hq9rm\" (UID: \"34d869ae-eae9-4bf6-b05e-cf40504ccdb6\") " pod="openstack/glance-db-sync-hq9rm"
pod \"glance-db-sync-hq9rm\" (UID: \"34d869ae-eae9-4bf6-b05e-cf40504ccdb6\") " pod="openstack/glance-db-sync-hq9rm" Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.757143 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-combined-ca-bundle\") pod \"glance-db-sync-hq9rm\" (UID: \"34d869ae-eae9-4bf6-b05e-cf40504ccdb6\") " pod="openstack/glance-db-sync-hq9rm" Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.757341 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhj4j\" (UniqueName: \"kubernetes.io/projected/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-kube-api-access-lhj4j\") pod \"glance-db-sync-hq9rm\" (UID: \"34d869ae-eae9-4bf6-b05e-cf40504ccdb6\") " pod="openstack/glance-db-sync-hq9rm" Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.757484 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-config-data\") pod \"glance-db-sync-hq9rm\" (UID: \"34d869ae-eae9-4bf6-b05e-cf40504ccdb6\") " pod="openstack/glance-db-sync-hq9rm" Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.757667 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-db-sync-config-data\") pod \"glance-db-sync-hq9rm\" (UID: \"34d869ae-eae9-4bf6-b05e-cf40504ccdb6\") " pod="openstack/glance-db-sync-hq9rm" Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.762488 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-combined-ca-bundle\") pod \"glance-db-sync-hq9rm\" (UID: \"34d869ae-eae9-4bf6-b05e-cf40504ccdb6\") " pod="openstack/glance-db-sync-hq9rm" Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.762954 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-db-sync-config-data\") pod \"glance-db-sync-hq9rm\" (UID: \"34d869ae-eae9-4bf6-b05e-cf40504ccdb6\") " pod="openstack/glance-db-sync-hq9rm" Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.782651 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhj4j\" (UniqueName: \"kubernetes.io/projected/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-kube-api-access-lhj4j\") pod \"glance-db-sync-hq9rm\" (UID: \"34d869ae-eae9-4bf6-b05e-cf40504ccdb6\") " pod="openstack/glance-db-sync-hq9rm" Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.782851 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-config-data\") pod \"glance-db-sync-hq9rm\" (UID: \"34d869ae-eae9-4bf6-b05e-cf40504ccdb6\") " pod="openstack/glance-db-sync-hq9rm" Dec 03 11:28:17 crc kubenswrapper[4702]: I1203 11:28:17.847973 4702 util.go:30] "No sandbox for pod can be found. 
Dec 03 11:28:18 crc kubenswrapper[4702]: I1203 11:28:18.063198 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift\") pod \"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") " pod="openstack/swift-storage-0"
Dec 03 11:28:18 crc kubenswrapper[4702]: E1203 11:28:18.063517 4702 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Dec 03 11:28:18 crc kubenswrapper[4702]: E1203 11:28:18.063556 4702 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Dec 03 11:28:18 crc kubenswrapper[4702]: E1203 11:28:18.063633 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift podName:3892571c-86ff-4259-beaa-6033dcfda204 nodeName:}" failed. No retries permitted until 2025-12-03 11:28:50.063602798 +0000 UTC m=+1513.899531262 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift") pod "swift-storage-0" (UID: "3892571c-86ff-4259-beaa-6033dcfda204") : configmap "swift-ring-files" not found
Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.453481 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-gfmp5"]
Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.456097 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-gfmp5"
Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.469328 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-gfmp5"]
Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.576534 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17e712d5-563a-475e-9811-f8e005cee912-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-gfmp5\" (UID: \"17e712d5-563a-475e-9811-f8e005cee912\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-gfmp5"
Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.576592 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwd4n\" (UniqueName: \"kubernetes.io/projected/17e712d5-563a-475e-9811-f8e005cee912-kube-api-access-hwd4n\") pod \"mysqld-exporter-openstack-cell1-db-create-gfmp5\" (UID: \"17e712d5-563a-475e-9811-f8e005cee912\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-gfmp5"
Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.679183 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17e712d5-563a-475e-9811-f8e005cee912-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-gfmp5\" (UID: \"17e712d5-563a-475e-9811-f8e005cee912\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-gfmp5"
Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.679248 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwd4n\" (UniqueName: \"kubernetes.io/projected/17e712d5-563a-475e-9811-f8e005cee912-kube-api-access-hwd4n\") pod \"mysqld-exporter-openstack-cell1-db-create-gfmp5\" (UID: \"17e712d5-563a-475e-9811-f8e005cee912\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-gfmp5"
\"kubernetes.io/projected/17e712d5-563a-475e-9811-f8e005cee912-kube-api-access-hwd4n\") pod \"mysqld-exporter-openstack-cell1-db-create-gfmp5\" (UID: \"17e712d5-563a-475e-9811-f8e005cee912\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-gfmp5" Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.680479 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17e712d5-563a-475e-9811-f8e005cee912-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-gfmp5\" (UID: \"17e712d5-563a-475e-9811-f8e005cee912\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-gfmp5" Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.704215 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-b754-account-create-update-k7lzh"] Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.705623 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-b754-account-create-update-k7lzh" Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.716043 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-b754-account-create-update-k7lzh"] Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.716892 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.732124 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwd4n\" (UniqueName: \"kubernetes.io/projected/17e712d5-563a-475e-9811-f8e005cee912-kube-api-access-hwd4n\") pod \"mysqld-exporter-openstack-cell1-db-create-gfmp5\" (UID: \"17e712d5-563a-475e-9811-f8e005cee912\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-gfmp5" Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.781452 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b687a4-b074-4490-bb5a-c218ad3f6c9c-operator-scripts\") pod \"mysqld-exporter-b754-account-create-update-k7lzh\" (UID: \"e3b687a4-b074-4490-bb5a-c218ad3f6c9c\") " pod="openstack/mysqld-exporter-b754-account-create-update-k7lzh" Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.781664 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxp8h\" (UniqueName: \"kubernetes.io/projected/e3b687a4-b074-4490-bb5a-c218ad3f6c9c-kube-api-access-dxp8h\") pod \"mysqld-exporter-b754-account-create-update-k7lzh\" (UID: \"e3b687a4-b074-4490-bb5a-c218ad3f6c9c\") " pod="openstack/mysqld-exporter-b754-account-create-update-k7lzh" Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.786475 4702 util.go:30] "No sandbox for pod can be found. 
Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.846228 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-5v7lw" podUID="e77b1727-1835-42aa-a4f6-d902ff001d20" containerName="ovn-controller" probeResult="failure" output=<
Dec 03 11:28:20 crc kubenswrapper[4702]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Dec 03 11:28:20 crc kubenswrapper[4702]: >
Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.863965 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-zqdp5"
Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.883866 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxp8h\" (UniqueName: \"kubernetes.io/projected/e3b687a4-b074-4490-bb5a-c218ad3f6c9c-kube-api-access-dxp8h\") pod \"mysqld-exporter-b754-account-create-update-k7lzh\" (UID: \"e3b687a4-b074-4490-bb5a-c218ad3f6c9c\") " pod="openstack/mysqld-exporter-b754-account-create-update-k7lzh"
Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.884019 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b687a4-b074-4490-bb5a-c218ad3f6c9c-operator-scripts\") pod \"mysqld-exporter-b754-account-create-update-k7lzh\" (UID: \"e3b687a4-b074-4490-bb5a-c218ad3f6c9c\") " pod="openstack/mysqld-exporter-b754-account-create-update-k7lzh"
Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.884905 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b687a4-b074-4490-bb5a-c218ad3f6c9c-operator-scripts\") pod \"mysqld-exporter-b754-account-create-update-k7lzh\" (UID: \"e3b687a4-b074-4490-bb5a-c218ad3f6c9c\") " pod="openstack/mysqld-exporter-b754-account-create-update-k7lzh"
Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.886258 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-zqdp5"
Dec 03 11:28:20 crc kubenswrapper[4702]: I1203 11:28:20.909506 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxp8h\" (UniqueName: \"kubernetes.io/projected/e3b687a4-b074-4490-bb5a-c218ad3f6c9c-kube-api-access-dxp8h\") pod \"mysqld-exporter-b754-account-create-update-k7lzh\" (UID: \"e3b687a4-b074-4490-bb5a-c218ad3f6c9c\") " pod="openstack/mysqld-exporter-b754-account-create-update-k7lzh"
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.054787 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hq9rm"]
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.132550 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-5v7lw-config-bzwss"]
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.134280 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5v7lw-config-bzwss"
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.137581 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.154938 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-b754-account-create-update-k7lzh"
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.156011 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5v7lw-config-bzwss"]
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.192816 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78118123-98b7-46ee-bd4c-21d79041a179-scripts\") pod \"ovn-controller-5v7lw-config-bzwss\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " pod="openstack/ovn-controller-5v7lw-config-bzwss"
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.192966 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/78118123-98b7-46ee-bd4c-21d79041a179-var-run\") pod \"ovn-controller-5v7lw-config-bzwss\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " pod="openstack/ovn-controller-5v7lw-config-bzwss"
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.193030 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/78118123-98b7-46ee-bd4c-21d79041a179-var-run-ovn\") pod \"ovn-controller-5v7lw-config-bzwss\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " pod="openstack/ovn-controller-5v7lw-config-bzwss"
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.193249 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/78118123-98b7-46ee-bd4c-21d79041a179-var-log-ovn\") pod \"ovn-controller-5v7lw-config-bzwss\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " pod="openstack/ovn-controller-5v7lw-config-bzwss"
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.193308 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dtwh\" (UniqueName: \"kubernetes.io/projected/78118123-98b7-46ee-bd4c-21d79041a179-kube-api-access-5dtwh\") pod \"ovn-controller-5v7lw-config-bzwss\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " pod="openstack/ovn-controller-5v7lw-config-bzwss"
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.193343 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/78118123-98b7-46ee-bd4c-21d79041a179-additional-scripts\") pod \"ovn-controller-5v7lw-config-bzwss\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " pod="openstack/ovn-controller-5v7lw-config-bzwss"
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.352108 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/78118123-98b7-46ee-bd4c-21d79041a179-var-log-ovn\") pod \"ovn-controller-5v7lw-config-bzwss\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " pod="openstack/ovn-controller-5v7lw-config-bzwss"
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.352164 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dtwh\" (UniqueName: \"kubernetes.io/projected/78118123-98b7-46ee-bd4c-21d79041a179-kube-api-access-5dtwh\") pod \"ovn-controller-5v7lw-config-bzwss\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " pod="openstack/ovn-controller-5v7lw-config-bzwss"
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.352193 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/78118123-98b7-46ee-bd4c-21d79041a179-additional-scripts\") pod \"ovn-controller-5v7lw-config-bzwss\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " pod="openstack/ovn-controller-5v7lw-config-bzwss"
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.352253 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78118123-98b7-46ee-bd4c-21d79041a179-scripts\") pod \"ovn-controller-5v7lw-config-bzwss\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " pod="openstack/ovn-controller-5v7lw-config-bzwss"
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.352284 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/78118123-98b7-46ee-bd4c-21d79041a179-var-run\") pod \"ovn-controller-5v7lw-config-bzwss\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " pod="openstack/ovn-controller-5v7lw-config-bzwss"
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.352332 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/78118123-98b7-46ee-bd4c-21d79041a179-var-run-ovn\") pod \"ovn-controller-5v7lw-config-bzwss\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " pod="openstack/ovn-controller-5v7lw-config-bzwss"
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.352663 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/78118123-98b7-46ee-bd4c-21d79041a179-var-run-ovn\") pod \"ovn-controller-5v7lw-config-bzwss\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " pod="openstack/ovn-controller-5v7lw-config-bzwss"
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.352736 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/78118123-98b7-46ee-bd4c-21d79041a179-var-log-ovn\") pod \"ovn-controller-5v7lw-config-bzwss\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " pod="openstack/ovn-controller-5v7lw-config-bzwss"
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.353189 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/78118123-98b7-46ee-bd4c-21d79041a179-additional-scripts\") pod \"ovn-controller-5v7lw-config-bzwss\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " pod="openstack/ovn-controller-5v7lw-config-bzwss"
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.353275 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/78118123-98b7-46ee-bd4c-21d79041a179-var-run\") pod \"ovn-controller-5v7lw-config-bzwss\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " pod="openstack/ovn-controller-5v7lw-config-bzwss"
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.354553 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78118123-98b7-46ee-bd4c-21d79041a179-scripts\") pod \"ovn-controller-5v7lw-config-bzwss\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " pod="openstack/ovn-controller-5v7lw-config-bzwss"
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.396705 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dtwh\" (UniqueName: \"kubernetes.io/projected/78118123-98b7-46ee-bd4c-21d79041a179-kube-api-access-5dtwh\") pod \"ovn-controller-5v7lw-config-bzwss\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " pod="openstack/ovn-controller-5v7lw-config-bzwss"
Dec 03 11:28:21 crc kubenswrapper[4702]: I1203 11:28:21.455876 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5v7lw-config-bzwss"
Dec 03 11:28:22 crc kubenswrapper[4702]: I1203 11:28:22.513623 4702 generic.go:334] "Generic (PLEG): container finished" podID="dcc463a3-c5e5-443e-98d1-306cc779e62e" containerID="28224786c89d5db7db660f5c395137af53230e70a07c75b094ef15a9446b3f8a" exitCode=0
Dec 03 11:28:22 crc kubenswrapper[4702]: I1203 11:28:22.513712 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-whljp" event={"ID":"dcc463a3-c5e5-443e-98d1-306cc779e62e","Type":"ContainerDied","Data":"28224786c89d5db7db660f5c395137af53230e70a07c75b094ef15a9446b3f8a"}
Dec 03 11:28:23 crc kubenswrapper[4702]: E1203 11:28:23.367453 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/prometheus-metric-storage-0" podUID="cf7bd44e-b3b4-4812-b7ee-512fb948d8f9"
Dec 03 11:28:23 crc kubenswrapper[4702]: I1203 11:28:23.486620 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-gfmp5"]
Dec 03 11:28:23 crc kubenswrapper[4702]: W1203 11:28:23.487610 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod17e712d5_563a_475e_9811_f8e005cee912.slice/crio-2c867f8dcc40f09f71cd267d84056114865c7d8fd1f229fde80dd6e880a16aa5 WatchSource:0}: Error finding container 2c867f8dcc40f09f71cd267d84056114865c7d8fd1f229fde80dd6e880a16aa5: Status 404 returned error can't find the container with id 2c867f8dcc40f09f71cd267d84056114865c7d8fd1f229fde80dd6e880a16aa5
Dec 03 11:28:23 crc kubenswrapper[4702]: I1203 11:28:23.524711 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-gfmp5" event={"ID":"17e712d5-563a-475e-9811-f8e005cee912","Type":"ContainerStarted","Data":"2c867f8dcc40f09f71cd267d84056114865c7d8fd1f229fde80dd6e880a16aa5"}
Dec 03 11:28:23 crc kubenswrapper[4702]: I1203 11:28:23.543065 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hq9rm" event={"ID":"34d869ae-eae9-4bf6-b05e-cf40504ccdb6","Type":"ContainerStarted","Data":"8d478e150cefee053fd0fab217dda46aeb338f521b07fcbe667c939d0bd9d2f3"}
Dec 03 11:28:23 crc kubenswrapper[4702]: I1203 11:28:23.546311 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9","Type":"ContainerStarted","Data":"88f3286e58a1dc879bcffc00fdf2a430841087a91991b7ba4a0d1818edda3fba"}
Dec 03 11:28:23 crc kubenswrapper[4702]: I1203 11:28:23.621315 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-b754-account-create-update-k7lzh"]
Dec 03 11:28:23 crc kubenswrapper[4702]: I1203 11:28:23.627902 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5v7lw-config-bzwss"]
Dec 03 11:28:23 crc kubenswrapper[4702]: I1203 11:28:23.919046 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-whljp"
crc kubenswrapper[4702]: I1203 11:28:23.919046 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.035860 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dcc463a3-c5e5-443e-98d1-306cc779e62e-ring-data-devices\") pod \"dcc463a3-c5e5-443e-98d1-306cc779e62e\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.036043 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcc463a3-c5e5-443e-98d1-306cc779e62e-combined-ca-bundle\") pod \"dcc463a3-c5e5-443e-98d1-306cc779e62e\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.036124 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dcc463a3-c5e5-443e-98d1-306cc779e62e-dispersionconf\") pod \"dcc463a3-c5e5-443e-98d1-306cc779e62e\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.036149 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck2qr\" (UniqueName: \"kubernetes.io/projected/dcc463a3-c5e5-443e-98d1-306cc779e62e-kube-api-access-ck2qr\") pod \"dcc463a3-c5e5-443e-98d1-306cc779e62e\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.036221 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dcc463a3-c5e5-443e-98d1-306cc779e62e-etc-swift\") pod \"dcc463a3-c5e5-443e-98d1-306cc779e62e\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.036274 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcc463a3-c5e5-443e-98d1-306cc779e62e-scripts\") pod \"dcc463a3-c5e5-443e-98d1-306cc779e62e\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.036333 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dcc463a3-c5e5-443e-98d1-306cc779e62e-swiftconf\") pod \"dcc463a3-c5e5-443e-98d1-306cc779e62e\" (UID: \"dcc463a3-c5e5-443e-98d1-306cc779e62e\") " Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.036546 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcc463a3-c5e5-443e-98d1-306cc779e62e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "dcc463a3-c5e5-443e-98d1-306cc779e62e" (UID: "dcc463a3-c5e5-443e-98d1-306cc779e62e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.037470 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcc463a3-c5e5-443e-98d1-306cc779e62e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "dcc463a3-c5e5-443e-98d1-306cc779e62e" (UID: "dcc463a3-c5e5-443e-98d1-306cc779e62e"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.038876 4702 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dcc463a3-c5e5-443e-98d1-306cc779e62e-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.038909 4702 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dcc463a3-c5e5-443e-98d1-306cc779e62e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.059111 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcc463a3-c5e5-443e-98d1-306cc779e62e-kube-api-access-ck2qr" (OuterVolumeSpecName: "kube-api-access-ck2qr") pod "dcc463a3-c5e5-443e-98d1-306cc779e62e" (UID: "dcc463a3-c5e5-443e-98d1-306cc779e62e"). InnerVolumeSpecName "kube-api-access-ck2qr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.063745 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcc463a3-c5e5-443e-98d1-306cc779e62e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "dcc463a3-c5e5-443e-98d1-306cc779e62e" (UID: "dcc463a3-c5e5-443e-98d1-306cc779e62e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.065464 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcc463a3-c5e5-443e-98d1-306cc779e62e-scripts" (OuterVolumeSpecName: "scripts") pod "dcc463a3-c5e5-443e-98d1-306cc779e62e" (UID: "dcc463a3-c5e5-443e-98d1-306cc779e62e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.065711 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcc463a3-c5e5-443e-98d1-306cc779e62e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dcc463a3-c5e5-443e-98d1-306cc779e62e" (UID: "dcc463a3-c5e5-443e-98d1-306cc779e62e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.090011 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcc463a3-c5e5-443e-98d1-306cc779e62e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "dcc463a3-c5e5-443e-98d1-306cc779e62e" (UID: "dcc463a3-c5e5-443e-98d1-306cc779e62e"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.141005 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcc463a3-c5e5-443e-98d1-306cc779e62e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.141046 4702 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dcc463a3-c5e5-443e-98d1-306cc779e62e-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.141061 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck2qr\" (UniqueName: \"kubernetes.io/projected/dcc463a3-c5e5-443e-98d1-306cc779e62e-kube-api-access-ck2qr\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.141076 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcc463a3-c5e5-443e-98d1-306cc779e62e-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.141086 4702 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dcc463a3-c5e5-443e-98d1-306cc779e62e-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.560622 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-b754-account-create-update-k7lzh" event={"ID":"e3b687a4-b074-4490-bb5a-c218ad3f6c9c","Type":"ContainerStarted","Data":"c357dac314c39eed2710ff6a6c4eefee442389e5f1ab6a469d9eb71728c632c4"} Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.560692 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-b754-account-create-update-k7lzh" event={"ID":"e3b687a4-b074-4490-bb5a-c218ad3f6c9c","Type":"ContainerStarted","Data":"71ce4a04133dc62e558e7651f800385f73b33a793c897905442160e766a4ec0e"} Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.695262 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5v7lw-config-bzwss" event={"ID":"78118123-98b7-46ee-bd4c-21d79041a179","Type":"ContainerStarted","Data":"d4b436d57fea09c21957601613ce4d8daa58090505dc302743cb7d66267f683d"} Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.695326 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5v7lw-config-bzwss" event={"ID":"78118123-98b7-46ee-bd4c-21d79041a179","Type":"ContainerStarted","Data":"cb87f37e81d3fc727dd33bcf0d78b26608468d2b89d81382ffd5ae618e9fdc0c"} Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.697319 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-b754-account-create-update-k7lzh" podStartSLOduration=4.697294563 podStartE2EDuration="4.697294563s" podCreationTimestamp="2025-12-03 11:28:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:28:24.696344096 +0000 UTC m=+1488.532272560" watchObservedRunningTime="2025-12-03 11:28:24.697294563 +0000 UTC m=+1488.533223027" Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.697716 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-gfmp5" 
event={"ID":"17e712d5-563a-475e-9811-f8e005cee912","Type":"ContainerStarted","Data":"9237a9ded8ef52a118f139125f96cd624fb1a2f0f71f756cd1828e5742393505"} Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.705141 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-whljp" Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.705126 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-whljp" event={"ID":"dcc463a3-c5e5-443e-98d1-306cc779e62e","Type":"ContainerDied","Data":"f5dcae6599828509458bbbd7430ca1237c1eea00f5de6d4d06631aaaf5c6b2af"} Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.705299 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5dcae6599828509458bbbd7430ca1237c1eea00f5de6d4d06631aaaf5c6b2af" Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.727197 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-5v7lw-config-bzwss" podStartSLOduration=3.727170869 podStartE2EDuration="3.727170869s" podCreationTimestamp="2025-12-03 11:28:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:28:24.720536651 +0000 UTC m=+1488.556465135" watchObservedRunningTime="2025-12-03 11:28:24.727170869 +0000 UTC m=+1488.563099333" Dec 03 11:28:24 crc kubenswrapper[4702]: I1203 11:28:24.749581 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-openstack-cell1-db-create-gfmp5" podStartSLOduration=4.749557923 podStartE2EDuration="4.749557923s" podCreationTimestamp="2025-12-03 11:28:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:28:24.73816198 +0000 UTC m=+1488.574090444" watchObservedRunningTime="2025-12-03 11:28:24.749557923 +0000 UTC m=+1488.585486387" Dec 03 11:28:25 crc kubenswrapper[4702]: I1203 11:28:25.717199 4702 generic.go:334] "Generic (PLEG): container finished" podID="78118123-98b7-46ee-bd4c-21d79041a179" containerID="d4b436d57fea09c21957601613ce4d8daa58090505dc302743cb7d66267f683d" exitCode=0 Dec 03 11:28:25 crc kubenswrapper[4702]: I1203 11:28:25.717276 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5v7lw-config-bzwss" event={"ID":"78118123-98b7-46ee-bd4c-21d79041a179","Type":"ContainerDied","Data":"d4b436d57fea09c21957601613ce4d8daa58090505dc302743cb7d66267f683d"} Dec 03 11:28:25 crc kubenswrapper[4702]: I1203 11:28:25.719191 4702 generic.go:334] "Generic (PLEG): container finished" podID="17e712d5-563a-475e-9811-f8e005cee912" containerID="9237a9ded8ef52a118f139125f96cd624fb1a2f0f71f756cd1828e5742393505" exitCode=0 Dec 03 11:28:25 crc kubenswrapper[4702]: I1203 11:28:25.719560 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-gfmp5" event={"ID":"17e712d5-563a-475e-9811-f8e005cee912","Type":"ContainerDied","Data":"9237a9ded8ef52a118f139125f96cd624fb1a2f0f71f756cd1828e5742393505"} Dec 03 11:28:25 crc kubenswrapper[4702]: I1203 11:28:25.809429 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-5v7lw" Dec 03 11:28:25 crc kubenswrapper[4702]: I1203 11:28:25.908492 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:28:25 crc kubenswrapper[4702]: I1203 11:28:25.908566 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.219939 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5v7lw-config-bzwss" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.237047 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-gfmp5" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.336876 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17e712d5-563a-475e-9811-f8e005cee912-operator-scripts\") pod \"17e712d5-563a-475e-9811-f8e005cee912\" (UID: \"17e712d5-563a-475e-9811-f8e005cee912\") " Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.336987 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/78118123-98b7-46ee-bd4c-21d79041a179-additional-scripts\") pod \"78118123-98b7-46ee-bd4c-21d79041a179\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.337012 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/78118123-98b7-46ee-bd4c-21d79041a179-var-log-ovn\") pod \"78118123-98b7-46ee-bd4c-21d79041a179\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.337066 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/78118123-98b7-46ee-bd4c-21d79041a179-var-run\") pod \"78118123-98b7-46ee-bd4c-21d79041a179\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.337095 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dtwh\" (UniqueName: \"kubernetes.io/projected/78118123-98b7-46ee-bd4c-21d79041a179-kube-api-access-5dtwh\") pod \"78118123-98b7-46ee-bd4c-21d79041a179\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.337164 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78118123-98b7-46ee-bd4c-21d79041a179-scripts\") pod \"78118123-98b7-46ee-bd4c-21d79041a179\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.337194 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwd4n\" (UniqueName: \"kubernetes.io/projected/17e712d5-563a-475e-9811-f8e005cee912-kube-api-access-hwd4n\") pod \"17e712d5-563a-475e-9811-f8e005cee912\" (UID: \"17e712d5-563a-475e-9811-f8e005cee912\") " Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.337218 4702 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/78118123-98b7-46ee-bd4c-21d79041a179-var-run-ovn\") pod \"78118123-98b7-46ee-bd4c-21d79041a179\" (UID: \"78118123-98b7-46ee-bd4c-21d79041a179\") " Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.337783 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78118123-98b7-46ee-bd4c-21d79041a179-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "78118123-98b7-46ee-bd4c-21d79041a179" (UID: "78118123-98b7-46ee-bd4c-21d79041a179"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.338255 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78118123-98b7-46ee-bd4c-21d79041a179-var-run" (OuterVolumeSpecName: "var-run") pod "78118123-98b7-46ee-bd4c-21d79041a179" (UID: "78118123-98b7-46ee-bd4c-21d79041a179"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.339419 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78118123-98b7-46ee-bd4c-21d79041a179-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "78118123-98b7-46ee-bd4c-21d79041a179" (UID: "78118123-98b7-46ee-bd4c-21d79041a179"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.339500 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78118123-98b7-46ee-bd4c-21d79041a179-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "78118123-98b7-46ee-bd4c-21d79041a179" (UID: "78118123-98b7-46ee-bd4c-21d79041a179"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.339521 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78118123-98b7-46ee-bd4c-21d79041a179-scripts" (OuterVolumeSpecName: "scripts") pod "78118123-98b7-46ee-bd4c-21d79041a179" (UID: "78118123-98b7-46ee-bd4c-21d79041a179"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.339749 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17e712d5-563a-475e-9811-f8e005cee912-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "17e712d5-563a-475e-9811-f8e005cee912" (UID: "17e712d5-563a-475e-9811-f8e005cee912"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.347271 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17e712d5-563a-475e-9811-f8e005cee912-kube-api-access-hwd4n" (OuterVolumeSpecName: "kube-api-access-hwd4n") pod "17e712d5-563a-475e-9811-f8e005cee912" (UID: "17e712d5-563a-475e-9811-f8e005cee912"). InnerVolumeSpecName "kube-api-access-hwd4n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.346924 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78118123-98b7-46ee-bd4c-21d79041a179-kube-api-access-5dtwh" (OuterVolumeSpecName: "kube-api-access-5dtwh") pod "78118123-98b7-46ee-bd4c-21d79041a179" (UID: "78118123-98b7-46ee-bd4c-21d79041a179"). InnerVolumeSpecName "kube-api-access-5dtwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.438865 4702 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/78118123-98b7-46ee-bd4c-21d79041a179-var-run\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.438908 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dtwh\" (UniqueName: \"kubernetes.io/projected/78118123-98b7-46ee-bd4c-21d79041a179-kube-api-access-5dtwh\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.438925 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78118123-98b7-46ee-bd4c-21d79041a179-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.438936 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwd4n\" (UniqueName: \"kubernetes.io/projected/17e712d5-563a-475e-9811-f8e005cee912-kube-api-access-hwd4n\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.438947 4702 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/78118123-98b7-46ee-bd4c-21d79041a179-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.438963 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17e712d5-563a-475e-9811-f8e005cee912-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.438974 4702 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/78118123-98b7-46ee-bd4c-21d79041a179-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.438987 4702 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/78118123-98b7-46ee-bd4c-21d79041a179-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.751915 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5v7lw-config-bzwss" event={"ID":"78118123-98b7-46ee-bd4c-21d79041a179","Type":"ContainerDied","Data":"cb87f37e81d3fc727dd33bcf0d78b26608468d2b89d81382ffd5ae618e9fdc0c"} Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.751977 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb87f37e81d3fc727dd33bcf0d78b26608468d2b89d81382ffd5ae618e9fdc0c" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.751938 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5v7lw-config-bzwss" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.755178 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-gfmp5" event={"ID":"17e712d5-563a-475e-9811-f8e005cee912","Type":"ContainerDied","Data":"2c867f8dcc40f09f71cd267d84056114865c7d8fd1f229fde80dd6e880a16aa5"} Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.755218 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-gfmp5" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.755230 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c867f8dcc40f09f71cd267d84056114865c7d8fd1f229fde80dd6e880a16aa5" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.759014 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9","Type":"ContainerStarted","Data":"b6ec0beee3d91f801d459872b27f814b8294b3c0226d5e80def62cd95e4b099b"} Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.764742 4702 generic.go:334] "Generic (PLEG): container finished" podID="e3b687a4-b074-4490-bb5a-c218ad3f6c9c" containerID="c357dac314c39eed2710ff6a6c4eefee442389e5f1ab6a469d9eb71728c632c4" exitCode=0 Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.764796 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-b754-account-create-update-k7lzh" event={"ID":"e3b687a4-b074-4490-bb5a-c218ad3f6c9c","Type":"ContainerDied","Data":"c357dac314c39eed2710ff6a6c4eefee442389e5f1ab6a469d9eb71728c632c4"} Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.797964 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=5.338056404 podStartE2EDuration="1m44.7979377s" podCreationTimestamp="2025-12-03 11:26:44 +0000 UTC" firstStartedPulling="2025-12-03 11:26:48.482160302 +0000 UTC m=+1392.318088766" lastFinishedPulling="2025-12-03 11:28:27.942041598 +0000 UTC m=+1491.777970062" observedRunningTime="2025-12-03 11:28:28.78417452 +0000 UTC m=+1492.620103004" watchObservedRunningTime="2025-12-03 11:28:28.7979377 +0000 UTC m=+1492.633866164" Dec 03 11:28:28 crc kubenswrapper[4702]: I1203 11:28:28.947821 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.335457 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-nccll"] Dec 03 11:28:29 crc kubenswrapper[4702]: E1203 11:28:29.336056 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcc463a3-c5e5-443e-98d1-306cc779e62e" containerName="swift-ring-rebalance" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.336078 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcc463a3-c5e5-443e-98d1-306cc779e62e" containerName="swift-ring-rebalance" Dec 03 11:28:29 crc kubenswrapper[4702]: E1203 11:28:29.336093 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e712d5-563a-475e-9811-f8e005cee912" containerName="mariadb-database-create" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.336101 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e712d5-563a-475e-9811-f8e005cee912" containerName="mariadb-database-create" Dec 03 11:28:29 crc kubenswrapper[4702]: E1203 
11:28:29.336116 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78118123-98b7-46ee-bd4c-21d79041a179" containerName="ovn-config" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.336129 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="78118123-98b7-46ee-bd4c-21d79041a179" containerName="ovn-config" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.336412 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="17e712d5-563a-475e-9811-f8e005cee912" containerName="mariadb-database-create" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.336432 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcc463a3-c5e5-443e-98d1-306cc779e62e" containerName="swift-ring-rebalance" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.336445 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="78118123-98b7-46ee-bd4c-21d79041a179" containerName="ovn-config" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.337341 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-nccll" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.348334 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-nccll"] Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.406571 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-5v7lw-config-bzwss"] Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.432559 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-5v7lw-config-bzwss"] Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.442708 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-nd8sv"] Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.444461 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nd8sv" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.464153 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-nd8sv"] Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.476699 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb0e1ee0-9045-4d23-85fb-1a79b18242c4-operator-scripts\") pod \"heat-db-create-nccll\" (UID: \"eb0e1ee0-9045-4d23-85fb-1a79b18242c4\") " pod="openstack/heat-db-create-nccll" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.476925 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kls48\" (UniqueName: \"kubernetes.io/projected/eb0e1ee0-9045-4d23-85fb-1a79b18242c4-kube-api-access-kls48\") pod \"heat-db-create-nccll\" (UID: \"eb0e1ee0-9045-4d23-85fb-1a79b18242c4\") " pod="openstack/heat-db-create-nccll" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.512878 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-5v7lw-config-bt5gr"] Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.517739 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5v7lw-config-bt5gr" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.530726 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.534133 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.545469 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-pdhc7"] Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.547669 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pdhc7" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.574467 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5v7lw-config-bt5gr"] Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.580453 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kls48\" (UniqueName: \"kubernetes.io/projected/eb0e1ee0-9045-4d23-85fb-1a79b18242c4-kube-api-access-kls48\") pod \"heat-db-create-nccll\" (UID: \"eb0e1ee0-9045-4d23-85fb-1a79b18242c4\") " pod="openstack/heat-db-create-nccll" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.580578 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb0e1ee0-9045-4d23-85fb-1a79b18242c4-operator-scripts\") pod \"heat-db-create-nccll\" (UID: \"eb0e1ee0-9045-4d23-85fb-1a79b18242c4\") " pod="openstack/heat-db-create-nccll" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.580632 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/400f59d1-8b01-45e9-84b0-173a5fa761d5-operator-scripts\") pod \"cinder-db-create-nd8sv\" (UID: \"400f59d1-8b01-45e9-84b0-173a5fa761d5\") " pod="openstack/cinder-db-create-nd8sv" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.580654 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae1887ec-19fc-43a3-ab93-481f83e4a190-operator-scripts\") pod \"barbican-db-create-pdhc7\" (UID: \"ae1887ec-19fc-43a3-ab93-481f83e4a190\") " pod="openstack/barbican-db-create-pdhc7" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.580680 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slbbp\" (UniqueName: \"kubernetes.io/projected/ae1887ec-19fc-43a3-ab93-481f83e4a190-kube-api-access-slbbp\") pod \"barbican-db-create-pdhc7\" (UID: \"ae1887ec-19fc-43a3-ab93-481f83e4a190\") " pod="openstack/barbican-db-create-pdhc7" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.580787 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxplh\" (UniqueName: \"kubernetes.io/projected/400f59d1-8b01-45e9-84b0-173a5fa761d5-kube-api-access-mxplh\") pod \"cinder-db-create-nd8sv\" (UID: \"400f59d1-8b01-45e9-84b0-173a5fa761d5\") " pod="openstack/cinder-db-create-nd8sv" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.581912 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/eb0e1ee0-9045-4d23-85fb-1a79b18242c4-operator-scripts\") pod \"heat-db-create-nccll\" (UID: \"eb0e1ee0-9045-4d23-85fb-1a79b18242c4\") " pod="openstack/heat-db-create-nccll" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.624096 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kls48\" (UniqueName: \"kubernetes.io/projected/eb0e1ee0-9045-4d23-85fb-1a79b18242c4-kube-api-access-kls48\") pod \"heat-db-create-nccll\" (UID: \"eb0e1ee0-9045-4d23-85fb-1a79b18242c4\") " pod="openstack/heat-db-create-nccll" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.677119 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-pdhc7"] Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.684039 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slbbp\" (UniqueName: \"kubernetes.io/projected/ae1887ec-19fc-43a3-ab93-481f83e4a190-kube-api-access-slbbp\") pod \"barbican-db-create-pdhc7\" (UID: \"ae1887ec-19fc-43a3-ab93-481f83e4a190\") " pod="openstack/barbican-db-create-pdhc7" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.684108 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5qbn\" (UniqueName: \"kubernetes.io/projected/583afa7e-d99c-4768-af3b-018884bde7a9-kube-api-access-p5qbn\") pod \"ovn-controller-5v7lw-config-bt5gr\" (UID: \"583afa7e-d99c-4768-af3b-018884bde7a9\") " pod="openstack/ovn-controller-5v7lw-config-bt5gr" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.684170 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxplh\" (UniqueName: \"kubernetes.io/projected/400f59d1-8b01-45e9-84b0-173a5fa761d5-kube-api-access-mxplh\") pod \"cinder-db-create-nd8sv\" (UID: \"400f59d1-8b01-45e9-84b0-173a5fa761d5\") " pod="openstack/cinder-db-create-nd8sv" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.684453 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/583afa7e-d99c-4768-af3b-018884bde7a9-var-run-ovn\") pod \"ovn-controller-5v7lw-config-bt5gr\" (UID: \"583afa7e-d99c-4768-af3b-018884bde7a9\") " pod="openstack/ovn-controller-5v7lw-config-bt5gr" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.684595 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/583afa7e-d99c-4768-af3b-018884bde7a9-var-log-ovn\") pod \"ovn-controller-5v7lw-config-bt5gr\" (UID: \"583afa7e-d99c-4768-af3b-018884bde7a9\") " pod="openstack/ovn-controller-5v7lw-config-bt5gr" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.684773 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/583afa7e-d99c-4768-af3b-018884bde7a9-scripts\") pod \"ovn-controller-5v7lw-config-bt5gr\" (UID: \"583afa7e-d99c-4768-af3b-018884bde7a9\") " pod="openstack/ovn-controller-5v7lw-config-bt5gr" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.684869 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/583afa7e-d99c-4768-af3b-018884bde7a9-additional-scripts\") pod \"ovn-controller-5v7lw-config-bt5gr\" (UID: 
\"583afa7e-d99c-4768-af3b-018884bde7a9\") " pod="openstack/ovn-controller-5v7lw-config-bt5gr" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.684945 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/583afa7e-d99c-4768-af3b-018884bde7a9-var-run\") pod \"ovn-controller-5v7lw-config-bt5gr\" (UID: \"583afa7e-d99c-4768-af3b-018884bde7a9\") " pod="openstack/ovn-controller-5v7lw-config-bt5gr" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.685054 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/400f59d1-8b01-45e9-84b0-173a5fa761d5-operator-scripts\") pod \"cinder-db-create-nd8sv\" (UID: \"400f59d1-8b01-45e9-84b0-173a5fa761d5\") " pod="openstack/cinder-db-create-nd8sv" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.685097 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae1887ec-19fc-43a3-ab93-481f83e4a190-operator-scripts\") pod \"barbican-db-create-pdhc7\" (UID: \"ae1887ec-19fc-43a3-ab93-481f83e4a190\") " pod="openstack/barbican-db-create-pdhc7" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.686869 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/400f59d1-8b01-45e9-84b0-173a5fa761d5-operator-scripts\") pod \"cinder-db-create-nd8sv\" (UID: \"400f59d1-8b01-45e9-84b0-173a5fa761d5\") " pod="openstack/cinder-db-create-nd8sv" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.687138 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae1887ec-19fc-43a3-ab93-481f83e4a190-operator-scripts\") pod \"barbican-db-create-pdhc7\" (UID: \"ae1887ec-19fc-43a3-ab93-481f83e4a190\") " pod="openstack/barbican-db-create-pdhc7" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.691900 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-5cea-account-create-update-mjzq9"] Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.700717 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-5cea-account-create-update-mjzq9" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.703882 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.727662 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxplh\" (UniqueName: \"kubernetes.io/projected/400f59d1-8b01-45e9-84b0-173a5fa761d5-kube-api-access-mxplh\") pod \"cinder-db-create-nd8sv\" (UID: \"400f59d1-8b01-45e9-84b0-173a5fa761d5\") " pod="openstack/cinder-db-create-nd8sv" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.733957 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5cea-account-create-update-mjzq9"] Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.768900 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slbbp\" (UniqueName: \"kubernetes.io/projected/ae1887ec-19fc-43a3-ab93-481f83e4a190-kube-api-access-slbbp\") pod \"barbican-db-create-pdhc7\" (UID: \"ae1887ec-19fc-43a3-ab93-481f83e4a190\") " pod="openstack/barbican-db-create-pdhc7" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.786968 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/583afa7e-d99c-4768-af3b-018884bde7a9-additional-scripts\") pod \"ovn-controller-5v7lw-config-bt5gr\" (UID: \"583afa7e-d99c-4768-af3b-018884bde7a9\") " pod="openstack/ovn-controller-5v7lw-config-bt5gr" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.787632 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/583afa7e-d99c-4768-af3b-018884bde7a9-additional-scripts\") pod \"ovn-controller-5v7lw-config-bt5gr\" (UID: \"583afa7e-d99c-4768-af3b-018884bde7a9\") " pod="openstack/ovn-controller-5v7lw-config-bt5gr" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.787788 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/583afa7e-d99c-4768-af3b-018884bde7a9-var-run\") pod \"ovn-controller-5v7lw-config-bt5gr\" (UID: \"583afa7e-d99c-4768-af3b-018884bde7a9\") " pod="openstack/ovn-controller-5v7lw-config-bt5gr" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.788209 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/583afa7e-d99c-4768-af3b-018884bde7a9-var-run\") pod \"ovn-controller-5v7lw-config-bt5gr\" (UID: \"583afa7e-d99c-4768-af3b-018884bde7a9\") " pod="openstack/ovn-controller-5v7lw-config-bt5gr" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.788346 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjrkg\" (UniqueName: \"kubernetes.io/projected/5d7de17e-a277-4ec1-b0ca-d45be9fd054c-kube-api-access-fjrkg\") pod \"cinder-5cea-account-create-update-mjzq9\" (UID: \"5d7de17e-a277-4ec1-b0ca-d45be9fd054c\") " pod="openstack/cinder-5cea-account-create-update-mjzq9" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.788370 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5qbn\" (UniqueName: \"kubernetes.io/projected/583afa7e-d99c-4768-af3b-018884bde7a9-kube-api-access-p5qbn\") pod \"ovn-controller-5v7lw-config-bt5gr\" (UID: 
\"583afa7e-d99c-4768-af3b-018884bde7a9\") " pod="openstack/ovn-controller-5v7lw-config-bt5gr" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.788957 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d7de17e-a277-4ec1-b0ca-d45be9fd054c-operator-scripts\") pod \"cinder-5cea-account-create-update-mjzq9\" (UID: \"5d7de17e-a277-4ec1-b0ca-d45be9fd054c\") " pod="openstack/cinder-5cea-account-create-update-mjzq9" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.789140 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/583afa7e-d99c-4768-af3b-018884bde7a9-var-run-ovn\") pod \"ovn-controller-5v7lw-config-bt5gr\" (UID: \"583afa7e-d99c-4768-af3b-018884bde7a9\") " pod="openstack/ovn-controller-5v7lw-config-bt5gr" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.789275 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/583afa7e-d99c-4768-af3b-018884bde7a9-var-run-ovn\") pod \"ovn-controller-5v7lw-config-bt5gr\" (UID: \"583afa7e-d99c-4768-af3b-018884bde7a9\") " pod="openstack/ovn-controller-5v7lw-config-bt5gr" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.790710 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/583afa7e-d99c-4768-af3b-018884bde7a9-var-log-ovn\") pod \"ovn-controller-5v7lw-config-bt5gr\" (UID: \"583afa7e-d99c-4768-af3b-018884bde7a9\") " pod="openstack/ovn-controller-5v7lw-config-bt5gr" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.790854 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/583afa7e-d99c-4768-af3b-018884bde7a9-scripts\") pod \"ovn-controller-5v7lw-config-bt5gr\" (UID: \"583afa7e-d99c-4768-af3b-018884bde7a9\") " pod="openstack/ovn-controller-5v7lw-config-bt5gr" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.791960 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/583afa7e-d99c-4768-af3b-018884bde7a9-var-log-ovn\") pod \"ovn-controller-5v7lw-config-bt5gr\" (UID: \"583afa7e-d99c-4768-af3b-018884bde7a9\") " pod="openstack/ovn-controller-5v7lw-config-bt5gr" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.794779 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/583afa7e-d99c-4768-af3b-018884bde7a9-scripts\") pod \"ovn-controller-5v7lw-config-bt5gr\" (UID: \"583afa7e-d99c-4768-af3b-018884bde7a9\") " pod="openstack/ovn-controller-5v7lw-config-bt5gr" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.810232 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-nccll" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.812151 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-3858-account-create-update-58hz9"] Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.830088 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-3858-account-create-update-58hz9" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.837807 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5qbn\" (UniqueName: \"kubernetes.io/projected/583afa7e-d99c-4768-af3b-018884bde7a9-kube-api-access-p5qbn\") pod \"ovn-controller-5v7lw-config-bt5gr\" (UID: \"583afa7e-d99c-4768-af3b-018884bde7a9\") " pod="openstack/ovn-controller-5v7lw-config-bt5gr" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.838199 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nd8sv" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.852363 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.875036 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-3858-account-create-update-58hz9"] Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.877400 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5v7lw-config-bt5gr" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.892195 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pdhc7" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.894403 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjrkg\" (UniqueName: \"kubernetes.io/projected/5d7de17e-a277-4ec1-b0ca-d45be9fd054c-kube-api-access-fjrkg\") pod \"cinder-5cea-account-create-update-mjzq9\" (UID: \"5d7de17e-a277-4ec1-b0ca-d45be9fd054c\") " pod="openstack/cinder-5cea-account-create-update-mjzq9" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.894564 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d7de17e-a277-4ec1-b0ca-d45be9fd054c-operator-scripts\") pod \"cinder-5cea-account-create-update-mjzq9\" (UID: \"5d7de17e-a277-4ec1-b0ca-d45be9fd054c\") " pod="openstack/cinder-5cea-account-create-update-mjzq9" Dec 03 11:28:29 crc kubenswrapper[4702]: I1203 11:28:29.896586 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d7de17e-a277-4ec1-b0ca-d45be9fd054c-operator-scripts\") pod \"cinder-5cea-account-create-update-mjzq9\" (UID: \"5d7de17e-a277-4ec1-b0ca-d45be9fd054c\") " pod="openstack/cinder-5cea-account-create-update-mjzq9" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.107624 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjrkg\" (UniqueName: \"kubernetes.io/projected/5d7de17e-a277-4ec1-b0ca-d45be9fd054c-kube-api-access-fjrkg\") pod \"cinder-5cea-account-create-update-mjzq9\" (UID: \"5d7de17e-a277-4ec1-b0ca-d45be9fd054c\") " pod="openstack/cinder-5cea-account-create-update-mjzq9" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.114095 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-cae4-account-create-update-l5sfj"] Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.116048 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-cae4-account-create-update-l5sfj" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.135285 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.169215 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-jm662"] Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.177227 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/654aa8a0-852c-4aae-b72d-0ca4eb991a77-operator-scripts\") pod \"heat-3858-account-create-update-58hz9\" (UID: \"654aa8a0-852c-4aae-b72d-0ca4eb991a77\") " pod="openstack/heat-3858-account-create-update-58hz9" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.177354 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88hdl\" (UniqueName: \"kubernetes.io/projected/654aa8a0-852c-4aae-b72d-0ca4eb991a77-kube-api-access-88hdl\") pod \"heat-3858-account-create-update-58hz9\" (UID: \"654aa8a0-852c-4aae-b72d-0ca4eb991a77\") " pod="openstack/heat-3858-account-create-update-58hz9" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.196902 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-cae4-account-create-update-l5sfj"] Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.197069 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jm662" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.201913 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jnt45" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.202144 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.202383 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.214686 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.268269 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jm662"] Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.280055 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/654aa8a0-852c-4aae-b72d-0ca4eb991a77-operator-scripts\") pod \"heat-3858-account-create-update-58hz9\" (UID: \"654aa8a0-852c-4aae-b72d-0ca4eb991a77\") " pod="openstack/heat-3858-account-create-update-58hz9" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.280170 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ccf575d-baf1-476f-bcba-41c45119c970-operator-scripts\") pod \"barbican-cae4-account-create-update-l5sfj\" (UID: \"7ccf575d-baf1-476f-bcba-41c45119c970\") " pod="openstack/barbican-cae4-account-create-update-l5sfj" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.280225 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88hdl\" (UniqueName: 
\"kubernetes.io/projected/654aa8a0-852c-4aae-b72d-0ca4eb991a77-kube-api-access-88hdl\") pod \"heat-3858-account-create-update-58hz9\" (UID: \"654aa8a0-852c-4aae-b72d-0ca4eb991a77\") " pod="openstack/heat-3858-account-create-update-58hz9" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.280271 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsm2r\" (UniqueName: \"kubernetes.io/projected/7ccf575d-baf1-476f-bcba-41c45119c970-kube-api-access-hsm2r\") pod \"barbican-cae4-account-create-update-l5sfj\" (UID: \"7ccf575d-baf1-476f-bcba-41c45119c970\") " pod="openstack/barbican-cae4-account-create-update-l5sfj" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.281105 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/654aa8a0-852c-4aae-b72d-0ca4eb991a77-operator-scripts\") pod \"heat-3858-account-create-update-58hz9\" (UID: \"654aa8a0-852c-4aae-b72d-0ca4eb991a77\") " pod="openstack/heat-3858-account-create-update-58hz9" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.303908 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88hdl\" (UniqueName: \"kubernetes.io/projected/654aa8a0-852c-4aae-b72d-0ca4eb991a77-kube-api-access-88hdl\") pod \"heat-3858-account-create-update-58hz9\" (UID: \"654aa8a0-852c-4aae-b72d-0ca4eb991a77\") " pod="openstack/heat-3858-account-create-update-58hz9" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.329229 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-fv4ks"] Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.338804 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fv4ks" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.344128 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fv4ks"] Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.354536 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-890e-account-create-update-krpq7"] Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.356330 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-890e-account-create-update-krpq7" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.359063 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.365621 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-890e-account-create-update-krpq7"] Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.382909 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e2598e5-2410-4a5b-b8b5-f41239c131d1-combined-ca-bundle\") pod \"keystone-db-sync-jm662\" (UID: \"3e2598e5-2410-4a5b-b8b5-f41239c131d1\") " pod="openstack/keystone-db-sync-jm662" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.383006 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsm2r\" (UniqueName: \"kubernetes.io/projected/7ccf575d-baf1-476f-bcba-41c45119c970-kube-api-access-hsm2r\") pod \"barbican-cae4-account-create-update-l5sfj\" (UID: \"7ccf575d-baf1-476f-bcba-41c45119c970\") " pod="openstack/barbican-cae4-account-create-update-l5sfj" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.383260 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqf5m\" (UniqueName: \"kubernetes.io/projected/3e2598e5-2410-4a5b-b8b5-f41239c131d1-kube-api-access-rqf5m\") pod \"keystone-db-sync-jm662\" (UID: \"3e2598e5-2410-4a5b-b8b5-f41239c131d1\") " pod="openstack/keystone-db-sync-jm662" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.383309 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ccf575d-baf1-476f-bcba-41c45119c970-operator-scripts\") pod \"barbican-cae4-account-create-update-l5sfj\" (UID: \"7ccf575d-baf1-476f-bcba-41c45119c970\") " pod="openstack/barbican-cae4-account-create-update-l5sfj" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.383348 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e2598e5-2410-4a5b-b8b5-f41239c131d1-config-data\") pod \"keystone-db-sync-jm662\" (UID: \"3e2598e5-2410-4a5b-b8b5-f41239c131d1\") " pod="openstack/keystone-db-sync-jm662" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.388698 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ccf575d-baf1-476f-bcba-41c45119c970-operator-scripts\") pod \"barbican-cae4-account-create-update-l5sfj\" (UID: \"7ccf575d-baf1-476f-bcba-41c45119c970\") " pod="openstack/barbican-cae4-account-create-update-l5sfj" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.393098 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-5cea-account-create-update-mjzq9" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.423722 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsm2r\" (UniqueName: \"kubernetes.io/projected/7ccf575d-baf1-476f-bcba-41c45119c970-kube-api-access-hsm2r\") pod \"barbican-cae4-account-create-update-l5sfj\" (UID: \"7ccf575d-baf1-476f-bcba-41c45119c970\") " pod="openstack/barbican-cae4-account-create-update-l5sfj" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.485417 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqf5m\" (UniqueName: \"kubernetes.io/projected/3e2598e5-2410-4a5b-b8b5-f41239c131d1-kube-api-access-rqf5m\") pod \"keystone-db-sync-jm662\" (UID: \"3e2598e5-2410-4a5b-b8b5-f41239c131d1\") " pod="openstack/keystone-db-sync-jm662" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.485493 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e2598e5-2410-4a5b-b8b5-f41239c131d1-config-data\") pod \"keystone-db-sync-jm662\" (UID: \"3e2598e5-2410-4a5b-b8b5-f41239c131d1\") " pod="openstack/keystone-db-sync-jm662" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.485530 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e2598e5-2410-4a5b-b8b5-f41239c131d1-combined-ca-bundle\") pod \"keystone-db-sync-jm662\" (UID: \"3e2598e5-2410-4a5b-b8b5-f41239c131d1\") " pod="openstack/keystone-db-sync-jm662" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.485665 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e67ff48c-39fd-461b-8858-56bd0150cb46-operator-scripts\") pod \"neutron-db-create-fv4ks\" (UID: \"e67ff48c-39fd-461b-8858-56bd0150cb46\") " pod="openstack/neutron-db-create-fv4ks" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.485717 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpwpt\" (UniqueName: \"kubernetes.io/projected/e67ff48c-39fd-461b-8858-56bd0150cb46-kube-api-access-jpwpt\") pod \"neutron-db-create-fv4ks\" (UID: \"e67ff48c-39fd-461b-8858-56bd0150cb46\") " pod="openstack/neutron-db-create-fv4ks" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.485739 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a867bd18-48a7-462b-b4ed-6d103ccf80bf-operator-scripts\") pod \"neutron-890e-account-create-update-krpq7\" (UID: \"a867bd18-48a7-462b-b4ed-6d103ccf80bf\") " pod="openstack/neutron-890e-account-create-update-krpq7" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.485792 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6524\" (UniqueName: \"kubernetes.io/projected/a867bd18-48a7-462b-b4ed-6d103ccf80bf-kube-api-access-t6524\") pod \"neutron-890e-account-create-update-krpq7\" (UID: \"a867bd18-48a7-462b-b4ed-6d103ccf80bf\") " pod="openstack/neutron-890e-account-create-update-krpq7" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.490606 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3e2598e5-2410-4a5b-b8b5-f41239c131d1-config-data\") pod \"keystone-db-sync-jm662\" (UID: \"3e2598e5-2410-4a5b-b8b5-f41239c131d1\") " pod="openstack/keystone-db-sync-jm662" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.492214 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e2598e5-2410-4a5b-b8b5-f41239c131d1-combined-ca-bundle\") pod \"keystone-db-sync-jm662\" (UID: \"3e2598e5-2410-4a5b-b8b5-f41239c131d1\") " pod="openstack/keystone-db-sync-jm662" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.499691 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-3858-account-create-update-58hz9" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.500340 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-cae4-account-create-update-l5sfj" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.506512 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqf5m\" (UniqueName: \"kubernetes.io/projected/3e2598e5-2410-4a5b-b8b5-f41239c131d1-kube-api-access-rqf5m\") pod \"keystone-db-sync-jm662\" (UID: \"3e2598e5-2410-4a5b-b8b5-f41239c131d1\") " pod="openstack/keystone-db-sync-jm662" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.526309 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jm662" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.590470 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpwpt\" (UniqueName: \"kubernetes.io/projected/e67ff48c-39fd-461b-8858-56bd0150cb46-kube-api-access-jpwpt\") pod \"neutron-db-create-fv4ks\" (UID: \"e67ff48c-39fd-461b-8858-56bd0150cb46\") " pod="openstack/neutron-db-create-fv4ks" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.590927 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a867bd18-48a7-462b-b4ed-6d103ccf80bf-operator-scripts\") pod \"neutron-890e-account-create-update-krpq7\" (UID: \"a867bd18-48a7-462b-b4ed-6d103ccf80bf\") " pod="openstack/neutron-890e-account-create-update-krpq7" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.590997 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6524\" (UniqueName: \"kubernetes.io/projected/a867bd18-48a7-462b-b4ed-6d103ccf80bf-kube-api-access-t6524\") pod \"neutron-890e-account-create-update-krpq7\" (UID: \"a867bd18-48a7-462b-b4ed-6d103ccf80bf\") " pod="openstack/neutron-890e-account-create-update-krpq7" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.591234 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e67ff48c-39fd-461b-8858-56bd0150cb46-operator-scripts\") pod \"neutron-db-create-fv4ks\" (UID: \"e67ff48c-39fd-461b-8858-56bd0150cb46\") " pod="openstack/neutron-db-create-fv4ks" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.591656 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a867bd18-48a7-462b-b4ed-6d103ccf80bf-operator-scripts\") pod \"neutron-890e-account-create-update-krpq7\" (UID: \"a867bd18-48a7-462b-b4ed-6d103ccf80bf\") " pod="openstack/neutron-890e-account-create-update-krpq7" Dec 
03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.593924 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e67ff48c-39fd-461b-8858-56bd0150cb46-operator-scripts\") pod \"neutron-db-create-fv4ks\" (UID: \"e67ff48c-39fd-461b-8858-56bd0150cb46\") " pod="openstack/neutron-db-create-fv4ks" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.614403 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpwpt\" (UniqueName: \"kubernetes.io/projected/e67ff48c-39fd-461b-8858-56bd0150cb46-kube-api-access-jpwpt\") pod \"neutron-db-create-fv4ks\" (UID: \"e67ff48c-39fd-461b-8858-56bd0150cb46\") " pod="openstack/neutron-db-create-fv4ks" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.615471 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6524\" (UniqueName: \"kubernetes.io/projected/a867bd18-48a7-462b-b4ed-6d103ccf80bf-kube-api-access-t6524\") pod \"neutron-890e-account-create-update-krpq7\" (UID: \"a867bd18-48a7-462b-b4ed-6d103ccf80bf\") " pod="openstack/neutron-890e-account-create-update-krpq7" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.704062 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fv4ks" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.727165 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-890e-account-create-update-krpq7" Dec 03 11:28:30 crc kubenswrapper[4702]: I1203 11:28:30.947646 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78118123-98b7-46ee-bd4c-21d79041a179" path="/var/lib/kubelet/pods/78118123-98b7-46ee-bd4c-21d79041a179/volumes" Dec 03 11:28:31 crc kubenswrapper[4702]: I1203 11:28:31.622030 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:31 crc kubenswrapper[4702]: I1203 11:28:31.622083 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:31 crc kubenswrapper[4702]: I1203 11:28:31.628440 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:31 crc kubenswrapper[4702]: I1203 11:28:31.808649 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:35 crc kubenswrapper[4702]: I1203 11:28:35.104051 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 11:28:35 crc kubenswrapper[4702]: I1203 11:28:35.104941 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" containerName="config-reloader" containerID="cri-o://61826dc676438e99394cad546f4a8ecb4ae28b3a7e111ba1949826bc217d5a6c" gracePeriod=600 Dec 03 11:28:35 crc kubenswrapper[4702]: I1203 11:28:35.105016 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" containerName="thanos-sidecar" containerID="cri-o://88f3286e58a1dc879bcffc00fdf2a430841087a91991b7ba4a0d1818edda3fba" gracePeriod=600 Dec 03 11:28:35 crc kubenswrapper[4702]: I1203 11:28:35.105065 4702 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/prometheus-metric-storage-0" podUID="cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" containerName="prometheus" containerID="cri-o://b6ec0beee3d91f801d459872b27f814b8294b3c0226d5e80def62cd95e4b099b" gracePeriod=600 Dec 03 11:28:35 crc kubenswrapper[4702]: I1203 11:28:35.865200 4702 generic.go:334] "Generic (PLEG): container finished" podID="cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" containerID="b6ec0beee3d91f801d459872b27f814b8294b3c0226d5e80def62cd95e4b099b" exitCode=0 Dec 03 11:28:35 crc kubenswrapper[4702]: I1203 11:28:35.865255 4702 generic.go:334] "Generic (PLEG): container finished" podID="cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" containerID="88f3286e58a1dc879bcffc00fdf2a430841087a91991b7ba4a0d1818edda3fba" exitCode=0 Dec 03 11:28:35 crc kubenswrapper[4702]: I1203 11:28:35.865268 4702 generic.go:334] "Generic (PLEG): container finished" podID="cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" containerID="61826dc676438e99394cad546f4a8ecb4ae28b3a7e111ba1949826bc217d5a6c" exitCode=0 Dec 03 11:28:35 crc kubenswrapper[4702]: I1203 11:28:35.865279 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9","Type":"ContainerDied","Data":"b6ec0beee3d91f801d459872b27f814b8294b3c0226d5e80def62cd95e4b099b"} Dec 03 11:28:35 crc kubenswrapper[4702]: I1203 11:28:35.865350 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9","Type":"ContainerDied","Data":"88f3286e58a1dc879bcffc00fdf2a430841087a91991b7ba4a0d1818edda3fba"} Dec 03 11:28:35 crc kubenswrapper[4702]: I1203 11:28:35.865365 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9","Type":"ContainerDied","Data":"61826dc676438e99394cad546f4a8ecb4ae28b3a7e111ba1949826bc217d5a6c"} Dec 03 11:28:36 crc kubenswrapper[4702]: I1203 11:28:36.622624 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.135:9090/-/ready\": dial tcp 10.217.0.135:9090: connect: connection refused" Dec 03 11:28:41 crc kubenswrapper[4702]: E1203 11:28:41.361618 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Dec 03 11:28:41 crc kubenswrapper[4702]: E1203 11:28:41.362409 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lhj4j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-hq9rm_openstack(34d869ae-eae9-4bf6-b05e-cf40504ccdb6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:28:41 crc kubenswrapper[4702]: E1203 11:28:41.363911 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-hq9rm" podUID="34d869ae-eae9-4bf6-b05e-cf40504ccdb6" Dec 03 11:28:41 crc kubenswrapper[4702]: I1203 11:28:41.458316 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-b754-account-create-update-k7lzh" Dec 03 11:28:41 crc kubenswrapper[4702]: I1203 11:28:41.521885 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b687a4-b074-4490-bb5a-c218ad3f6c9c-operator-scripts\") pod \"e3b687a4-b074-4490-bb5a-c218ad3f6c9c\" (UID: \"e3b687a4-b074-4490-bb5a-c218ad3f6c9c\") " Dec 03 11:28:41 crc kubenswrapper[4702]: I1203 11:28:41.522004 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxp8h\" (UniqueName: \"kubernetes.io/projected/e3b687a4-b074-4490-bb5a-c218ad3f6c9c-kube-api-access-dxp8h\") pod \"e3b687a4-b074-4490-bb5a-c218ad3f6c9c\" (UID: \"e3b687a4-b074-4490-bb5a-c218ad3f6c9c\") " Dec 03 11:28:41 crc kubenswrapper[4702]: I1203 11:28:41.523351 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b687a4-b074-4490-bb5a-c218ad3f6c9c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e3b687a4-b074-4490-bb5a-c218ad3f6c9c" (UID: "e3b687a4-b074-4490-bb5a-c218ad3f6c9c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:41 crc kubenswrapper[4702]: I1203 11:28:41.529231 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b687a4-b074-4490-bb5a-c218ad3f6c9c-kube-api-access-dxp8h" (OuterVolumeSpecName: "kube-api-access-dxp8h") pod "e3b687a4-b074-4490-bb5a-c218ad3f6c9c" (UID: "e3b687a4-b074-4490-bb5a-c218ad3f6c9c"). InnerVolumeSpecName "kube-api-access-dxp8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:41 crc kubenswrapper[4702]: I1203 11:28:41.626160 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3b687a4-b074-4490-bb5a-c218ad3f6c9c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:41 crc kubenswrapper[4702]: I1203 11:28:41.626608 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxp8h\" (UniqueName: \"kubernetes.io/projected/e3b687a4-b074-4490-bb5a-c218ad3f6c9c-kube-api-access-dxp8h\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:41 crc kubenswrapper[4702]: I1203 11:28:41.956199 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-b754-account-create-update-k7lzh" event={"ID":"e3b687a4-b074-4490-bb5a-c218ad3f6c9c","Type":"ContainerDied","Data":"71ce4a04133dc62e558e7651f800385f73b33a793c897905442160e766a4ec0e"} Dec 03 11:28:41 crc kubenswrapper[4702]: I1203 11:28:41.956296 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71ce4a04133dc62e558e7651f800385f73b33a793c897905442160e766a4ec0e" Dec 03 11:28:41 crc kubenswrapper[4702]: I1203 11:28:41.956239 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-b754-account-create-update-k7lzh" Dec 03 11:28:41 crc kubenswrapper[4702]: E1203 11:28:41.975384 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-hq9rm" podUID="34d869ae-eae9-4bf6-b05e-cf40504ccdb6" Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.019738 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.173371 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.173521 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-config-out\") pod \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.173602 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prkgf\" (UniqueName: \"kubernetes.io/projected/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-kube-api-access-prkgf\") pod \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.173720 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-tls-assets\") pod \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.173822 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-config\") pod \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.173900 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-web-config\") pod \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.173943 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-thanos-prometheus-http-client-file\") pod \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.174102 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-prometheus-metric-storage-rulefiles-0\") pod \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\" (UID: \"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9\") " Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.176566 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" (UID: "cf7bd44e-b3b4-4812-b7ee-512fb948d8f9"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.183220 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" (UID: "cf7bd44e-b3b4-4812-b7ee-512fb948d8f9"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.183306 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" (UID: "cf7bd44e-b3b4-4812-b7ee-512fb948d8f9"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.197238 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" (UID: "cf7bd44e-b3b4-4812-b7ee-512fb948d8f9"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.197244 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-config-out" (OuterVolumeSpecName: "config-out") pod "cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" (UID: "cf7bd44e-b3b4-4812-b7ee-512fb948d8f9"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.208906 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-kube-api-access-prkgf" (OuterVolumeSpecName: "kube-api-access-prkgf") pod "cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" (UID: "cf7bd44e-b3b4-4812-b7ee-512fb948d8f9"). InnerVolumeSpecName "kube-api-access-prkgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.209268 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-config" (OuterVolumeSpecName: "config") pod "cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" (UID: "cf7bd44e-b3b4-4812-b7ee-512fb948d8f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.271933 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-web-config" (OuterVolumeSpecName: "web-config") pod "cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" (UID: "cf7bd44e-b3b4-4812-b7ee-512fb948d8f9"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.281642 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.281680 4702 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-web-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.281693 4702 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.281704 4702 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.281746 4702 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.281776 4702 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-config-out\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.281789 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prkgf\" (UniqueName: \"kubernetes.io/projected/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-kube-api-access-prkgf\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.281801 4702 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.308564 4702 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.383975 4702 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.451301 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-cae4-account-create-update-l5sfj"] Dec 03 11:28:42 crc kubenswrapper[4702]: W1203 11:28:42.465994 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ccf575d_baf1_476f_bcba_41c45119c970.slice/crio-f822bdef3b2d425d787ee8747ba04a6a91ec1ed3505497740d0cccf00738c516 WatchSource:0}: Error finding container f822bdef3b2d425d787ee8747ba04a6a91ec1ed3505497740d0cccf00738c516: Status 404 returned error can't find the container with id f822bdef3b2d425d787ee8747ba04a6a91ec1ed3505497740d0cccf00738c516 Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.477885 4702 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"barbican-db-secret" Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.700593 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.701558 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-5cea-account-create-update-mjzq9"] Dec 03 11:28:42 crc kubenswrapper[4702]: W1203 11:28:42.752482 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e2598e5_2410_4a5b_b8b5_f41239c131d1.slice/crio-10991fccf96e269f74fa4b211930e178af14dc6f29e842ce5053de28d217db9b WatchSource:0}: Error finding container 10991fccf96e269f74fa4b211930e178af14dc6f29e842ce5053de28d217db9b: Status 404 returned error can't find the container with id 10991fccf96e269f74fa4b211930e178af14dc6f29e842ce5053de28d217db9b Dec 03 11:28:42 crc kubenswrapper[4702]: I1203 11:28:42.779688 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jm662"] Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.114356 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5cea-account-create-update-mjzq9" event={"ID":"5d7de17e-a277-4ec1-b0ca-d45be9fd054c","Type":"ContainerStarted","Data":"e79f7ae35213dc7137e11082f837cb0bc1634ec10f3bd46cf66321b216d773eb"} Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.126482 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"cf7bd44e-b3b4-4812-b7ee-512fb948d8f9","Type":"ContainerDied","Data":"851d0ed2dccf3a3c4e53f4003c6ba4bbfc6d4a81b381ad8923629bff8358ed09"} Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.126509 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.126570 4702 scope.go:117] "RemoveContainer" containerID="b6ec0beee3d91f801d459872b27f814b8294b3c0226d5e80def62cd95e4b099b" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.128110 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-nccll"] Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.130385 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cae4-account-create-update-l5sfj" event={"ID":"7ccf575d-baf1-476f-bcba-41c45119c970","Type":"ContainerStarted","Data":"f822bdef3b2d425d787ee8747ba04a6a91ec1ed3505497740d0cccf00738c516"} Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.134140 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jm662" event={"ID":"3e2598e5-2410-4a5b-b8b5-f41239c131d1","Type":"ContainerStarted","Data":"10991fccf96e269f74fa4b211930e178af14dc6f29e842ce5053de28d217db9b"} Dec 03 11:28:43 crc kubenswrapper[4702]: W1203 11:28:43.149231 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb0e1ee0_9045_4d23_85fb_1a79b18242c4.slice/crio-72b8f33e8ad2df81cda07f861b7a1685408c00030a1a14d3858a59a3f846e7cf WatchSource:0}: Error finding container 72b8f33e8ad2df81cda07f861b7a1685408c00030a1a14d3858a59a3f846e7cf: Status 404 returned error can't find the container with id 72b8f33e8ad2df81cda07f861b7a1685408c00030a1a14d3858a59a3f846e7cf Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.167807 4702 scope.go:117] "RemoveContainer" containerID="88f3286e58a1dc879bcffc00fdf2a430841087a91991b7ba4a0d1818edda3fba" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.191698 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.215804 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.249300 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 11:28:43 crc kubenswrapper[4702]: E1203 11:28:43.250172 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" containerName="prometheus" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.250190 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" containerName="prometheus" Dec 03 11:28:43 crc kubenswrapper[4702]: E1203 11:28:43.250211 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b687a4-b074-4490-bb5a-c218ad3f6c9c" containerName="mariadb-account-create-update" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.250218 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b687a4-b074-4490-bb5a-c218ad3f6c9c" containerName="mariadb-account-create-update" Dec 03 11:28:43 crc kubenswrapper[4702]: E1203 11:28:43.250247 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" containerName="thanos-sidecar" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.250254 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" containerName="thanos-sidecar" Dec 03 11:28:43 crc kubenswrapper[4702]: E1203 11:28:43.250275 4702 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" containerName="init-config-reloader" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.250282 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" containerName="init-config-reloader" Dec 03 11:28:43 crc kubenswrapper[4702]: E1203 11:28:43.250306 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" containerName="config-reloader" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.250312 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" containerName="config-reloader" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.250557 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" containerName="config-reloader" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.250584 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" containerName="prometheus" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.250590 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b687a4-b074-4490-bb5a-c218ad3f6c9c" containerName="mariadb-account-create-update" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.250599 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" containerName="thanos-sidecar" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.255625 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.256254 4702 scope.go:117] "RemoveContainer" containerID="61826dc676438e99394cad546f4a8ecb4ae28b3a7e111ba1949826bc217d5a6c" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.261566 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-hlcn5" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.261992 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.262273 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.262446 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.262984 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.264135 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.274733 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.294454 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.344811 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/90e9786f-3e0d-4a23-b624-b49a3d386784-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.346015 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/90e9786f-3e0d-4a23-b624-b49a3d386784-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.346234 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/90e9786f-3e0d-4a23-b624-b49a3d386784-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.346671 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/90e9786f-3e0d-4a23-b624-b49a3d386784-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.346828 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/90e9786f-3e0d-4a23-b624-b49a3d386784-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.347077 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/90e9786f-3e0d-4a23-b624-b49a3d386784-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.347212 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/90e9786f-3e0d-4a23-b624-b49a3d386784-config\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.347802 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e9786f-3e0d-4a23-b624-b49a3d386784-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.348016 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " 
pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.348129 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/90e9786f-3e0d-4a23-b624-b49a3d386784-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.348232 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f4mb\" (UniqueName: \"kubernetes.io/projected/90e9786f-3e0d-4a23-b624-b49a3d386784-kube-api-access-9f4mb\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.396421 4702 scope.go:117] "RemoveContainer" containerID="0cdd14cd1c46fed0871c7e58f04bc827118bac9d8a9e97af12dfaff332296b46" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.450037 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.450117 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/90e9786f-3e0d-4a23-b624-b49a3d386784-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.450196 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f4mb\" (UniqueName: \"kubernetes.io/projected/90e9786f-3e0d-4a23-b624-b49a3d386784-kube-api-access-9f4mb\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.450266 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/90e9786f-3e0d-4a23-b624-b49a3d386784-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.450313 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/90e9786f-3e0d-4a23-b624-b49a3d386784-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.450336 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/90e9786f-3e0d-4a23-b624-b49a3d386784-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " 
pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.450374 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/90e9786f-3e0d-4a23-b624-b49a3d386784-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.450400 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/90e9786f-3e0d-4a23-b624-b49a3d386784-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.450486 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/90e9786f-3e0d-4a23-b624-b49a3d386784-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.450515 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/90e9786f-3e0d-4a23-b624-b49a3d386784-config\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.450667 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e9786f-3e0d-4a23-b624-b49a3d386784-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.453828 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.457289 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/90e9786f-3e0d-4a23-b624-b49a3d386784-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.463352 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e9786f-3e0d-4a23-b624-b49a3d386784-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.484918 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/90e9786f-3e0d-4a23-b624-b49a3d386784-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.485285 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/90e9786f-3e0d-4a23-b624-b49a3d386784-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.485686 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/90e9786f-3e0d-4a23-b624-b49a3d386784-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.486306 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/90e9786f-3e0d-4a23-b624-b49a3d386784-config\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.491816 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f4mb\" (UniqueName: \"kubernetes.io/projected/90e9786f-3e0d-4a23-b624-b49a3d386784-kube-api-access-9f4mb\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.492417 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/90e9786f-3e0d-4a23-b624-b49a3d386784-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.492553 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/90e9786f-3e0d-4a23-b624-b49a3d386784-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.501521 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/90e9786f-3e0d-4a23-b624-b49a3d386784-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.548274 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"prometheus-metric-storage-0\" (UID: \"90e9786f-3e0d-4a23-b624-b49a3d386784\") " pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:43 crc kubenswrapper[4702]: I1203 11:28:43.700878 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 11:28:44 crc kubenswrapper[4702]: I1203 11:28:44.047501 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-pdhc7"] Dec 03 11:28:44 crc kubenswrapper[4702]: I1203 11:28:44.068072 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-nd8sv"] Dec 03 11:28:44 crc kubenswrapper[4702]: I1203 11:28:44.086135 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fv4ks"] Dec 03 11:28:44 crc kubenswrapper[4702]: W1203 11:28:44.126277 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode67ff48c_39fd_461b_8858_56bd0150cb46.slice/crio-2e99cb7deeefa80e50516d44f97f92e44b29552e8b16f7db75d1990130c6618b WatchSource:0}: Error finding container 2e99cb7deeefa80e50516d44f97f92e44b29552e8b16f7db75d1990130c6618b: Status 404 returned error can't find the container with id 2e99cb7deeefa80e50516d44f97f92e44b29552e8b16f7db75d1990130c6618b Dec 03 11:28:44 crc kubenswrapper[4702]: W1203 11:28:44.128279 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae1887ec_19fc_43a3_ab93_481f83e4a190.slice/crio-86b5b8b6c3150fbba3bcec14d99812f5c464fc09455e020fb27686c637e14c26 WatchSource:0}: Error finding container 86b5b8b6c3150fbba3bcec14d99812f5c464fc09455e020fb27686c637e14c26: Status 404 returned error can't find the container with id 86b5b8b6c3150fbba3bcec14d99812f5c464fc09455e020fb27686c637e14c26 Dec 03 11:28:44 crc kubenswrapper[4702]: I1203 11:28:44.178905 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-890e-account-create-update-krpq7"] Dec 03 11:28:44 crc kubenswrapper[4702]: I1203 11:28:44.191043 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pdhc7" event={"ID":"ae1887ec-19fc-43a3-ab93-481f83e4a190","Type":"ContainerStarted","Data":"86b5b8b6c3150fbba3bcec14d99812f5c464fc09455e020fb27686c637e14c26"} Dec 03 11:28:44 crc kubenswrapper[4702]: I1203 11:28:44.198944 4702 generic.go:334] "Generic (PLEG): container finished" podID="7ccf575d-baf1-476f-bcba-41c45119c970" containerID="2ceca0a7b4c165f5de9d880d7f78c34a7795ffaac84447aaed12b758fbf597de" exitCode=0 Dec 03 11:28:44 crc kubenswrapper[4702]: I1203 11:28:44.199103 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cae4-account-create-update-l5sfj" event={"ID":"7ccf575d-baf1-476f-bcba-41c45119c970","Type":"ContainerDied","Data":"2ceca0a7b4c165f5de9d880d7f78c34a7795ffaac84447aaed12b758fbf597de"} Dec 03 11:28:44 crc kubenswrapper[4702]: I1203 11:28:44.204726 4702 generic.go:334] "Generic (PLEG): container finished" podID="5d7de17e-a277-4ec1-b0ca-d45be9fd054c" containerID="a39d34c4bd22f9121f011972d8b465ac5d6f4594708a6881637d54378c8fb867" exitCode=0 Dec 03 11:28:44 crc kubenswrapper[4702]: I1203 11:28:44.204985 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5cea-account-create-update-mjzq9" event={"ID":"5d7de17e-a277-4ec1-b0ca-d45be9fd054c","Type":"ContainerDied","Data":"a39d34c4bd22f9121f011972d8b465ac5d6f4594708a6881637d54378c8fb867"} Dec 03 11:28:44 crc kubenswrapper[4702]: I1203 11:28:44.214840 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 03 11:28:44 crc kubenswrapper[4702]: I1203 11:28:44.217301 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/heat-3858-account-create-update-58hz9"] Dec 03 11:28:44 crc kubenswrapper[4702]: I1203 11:28:44.224252 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-nccll" event={"ID":"eb0e1ee0-9045-4d23-85fb-1a79b18242c4","Type":"ContainerStarted","Data":"1f9b7194d4479966285d927fb3db55ba0cb1f08ea30d8b5744c05eca65449218"} Dec 03 11:28:44 crc kubenswrapper[4702]: I1203 11:28:44.224323 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-nccll" event={"ID":"eb0e1ee0-9045-4d23-85fb-1a79b18242c4","Type":"ContainerStarted","Data":"72b8f33e8ad2df81cda07f861b7a1685408c00030a1a14d3858a59a3f846e7cf"} Dec 03 11:28:44 crc kubenswrapper[4702]: I1203 11:28:44.232713 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fv4ks" event={"ID":"e67ff48c-39fd-461b-8858-56bd0150cb46","Type":"ContainerStarted","Data":"2e99cb7deeefa80e50516d44f97f92e44b29552e8b16f7db75d1990130c6618b"} Dec 03 11:28:44 crc kubenswrapper[4702]: I1203 11:28:44.234034 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nd8sv" event={"ID":"400f59d1-8b01-45e9-84b0-173a5fa761d5","Type":"ContainerStarted","Data":"cff937c30923dd2bd334c22b107fba533cd7f048d74543a6334344ff36e8df73"} Dec 03 11:28:44 crc kubenswrapper[4702]: I1203 11:28:44.277725 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5v7lw-config-bt5gr"] Dec 03 11:28:44 crc kubenswrapper[4702]: I1203 11:28:44.335660 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Dec 03 11:28:44 crc kubenswrapper[4702]: I1203 11:28:44.336171 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-create-nccll" podStartSLOduration=15.336141004 podStartE2EDuration="15.336141004s" podCreationTimestamp="2025-12-03 11:28:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:28:44.283557445 +0000 UTC m=+1508.119485959" watchObservedRunningTime="2025-12-03 11:28:44.336141004 +0000 UTC m=+1508.172069468" Dec 03 11:28:44 crc kubenswrapper[4702]: I1203 11:28:44.629031 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.135:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 11:28:44 crc kubenswrapper[4702]: I1203 11:28:44.679401 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 11:28:44 crc kubenswrapper[4702]: I1203 11:28:44.972933 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf7bd44e-b3b4-4812-b7ee-512fb948d8f9" path="/var/lib/kubelet/pods/cf7bd44e-b3b4-4812-b7ee-512fb948d8f9/volumes" Dec 03 11:28:45 crc kubenswrapper[4702]: I1203 11:28:45.254274 4702 generic.go:334] "Generic (PLEG): container finished" podID="ae1887ec-19fc-43a3-ab93-481f83e4a190" containerID="d98a7e18305ac71f1b26770879f29b0af16f063edbd6f7bd7d197691baaec754" exitCode=0 Dec 03 11:28:45 crc kubenswrapper[4702]: I1203 11:28:45.254488 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pdhc7" event={"ID":"ae1887ec-19fc-43a3-ab93-481f83e4a190","Type":"ContainerDied","Data":"d98a7e18305ac71f1b26770879f29b0af16f063edbd6f7bd7d197691baaec754"} Dec 03 
11:28:45 crc kubenswrapper[4702]: I1203 11:28:45.257796 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-890e-account-create-update-krpq7" event={"ID":"a867bd18-48a7-462b-b4ed-6d103ccf80bf","Type":"ContainerStarted","Data":"7aaa90fc8aa5bb535d369a89af9f3c4141012d2247ec20ec2a2986df0de06c93"} Dec 03 11:28:45 crc kubenswrapper[4702]: I1203 11:28:45.257833 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-890e-account-create-update-krpq7" event={"ID":"a867bd18-48a7-462b-b4ed-6d103ccf80bf","Type":"ContainerStarted","Data":"135d95be0403527d3205796f6e01209767b4371c398955fcda47f83887b96e11"} Dec 03 11:28:45 crc kubenswrapper[4702]: I1203 11:28:45.261075 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"90e9786f-3e0d-4a23-b624-b49a3d386784","Type":"ContainerStarted","Data":"76da2d2bb9f2568b03450c5588a471fd91bd6f6dc9346b2bf602b892f8efbd69"} Dec 03 11:28:45 crc kubenswrapper[4702]: I1203 11:28:45.262901 4702 generic.go:334] "Generic (PLEG): container finished" podID="eb0e1ee0-9045-4d23-85fb-1a79b18242c4" containerID="1f9b7194d4479966285d927fb3db55ba0cb1f08ea30d8b5744c05eca65449218" exitCode=0 Dec 03 11:28:45 crc kubenswrapper[4702]: I1203 11:28:45.263560 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-nccll" event={"ID":"eb0e1ee0-9045-4d23-85fb-1a79b18242c4","Type":"ContainerDied","Data":"1f9b7194d4479966285d927fb3db55ba0cb1f08ea30d8b5744c05eca65449218"} Dec 03 11:28:45 crc kubenswrapper[4702]: I1203 11:28:45.271374 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-3858-account-create-update-58hz9" event={"ID":"654aa8a0-852c-4aae-b72d-0ca4eb991a77","Type":"ContainerStarted","Data":"092848727cd1175a91b12c135ef3464d9b083cb15ee1605161dace487d55e8d8"} Dec 03 11:28:45 crc kubenswrapper[4702]: I1203 11:28:45.271460 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-3858-account-create-update-58hz9" event={"ID":"654aa8a0-852c-4aae-b72d-0ca4eb991a77","Type":"ContainerStarted","Data":"21548799a071548ecdfbeb3ef2e679bf06179064b11a9e9d72f13d8a010a00e3"} Dec 03 11:28:45 crc kubenswrapper[4702]: I1203 11:28:45.278125 4702 generic.go:334] "Generic (PLEG): container finished" podID="e67ff48c-39fd-461b-8858-56bd0150cb46" containerID="03736fd66d9315f20e39020018954b4db7f94210143895c30e278d4b86796337" exitCode=0 Dec 03 11:28:45 crc kubenswrapper[4702]: I1203 11:28:45.278283 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fv4ks" event={"ID":"e67ff48c-39fd-461b-8858-56bd0150cb46","Type":"ContainerDied","Data":"03736fd66d9315f20e39020018954b4db7f94210143895c30e278d4b86796337"} Dec 03 11:28:45 crc kubenswrapper[4702]: I1203 11:28:45.287994 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5v7lw-config-bt5gr" event={"ID":"583afa7e-d99c-4768-af3b-018884bde7a9","Type":"ContainerStarted","Data":"8a4ee33f948589467c0af31fc62697afcbea2ea853b560f7e022894353a8c1c2"} Dec 03 11:28:45 crc kubenswrapper[4702]: I1203 11:28:45.288071 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5v7lw-config-bt5gr" event={"ID":"583afa7e-d99c-4768-af3b-018884bde7a9","Type":"ContainerStarted","Data":"3ca58de02f7811c535849464902d405cb8bcd20c7aad9db9671ff4f9cdcc7348"} Dec 03 11:28:45 crc kubenswrapper[4702]: I1203 11:28:45.301358 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nd8sv" 
event={"ID":"400f59d1-8b01-45e9-84b0-173a5fa761d5","Type":"ContainerStarted","Data":"4889216c64cf85d1cc219dc1dd0cd7afeda3c3f919c0c9682402f61aa13bff66"} Dec 03 11:28:45 crc kubenswrapper[4702]: I1203 11:28:45.339426 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-3858-account-create-update-58hz9" podStartSLOduration=16.339395041 podStartE2EDuration="16.339395041s" podCreationTimestamp="2025-12-03 11:28:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:28:45.33724903 +0000 UTC m=+1509.173177504" watchObservedRunningTime="2025-12-03 11:28:45.339395041 +0000 UTC m=+1509.175323505" Dec 03 11:28:45 crc kubenswrapper[4702]: I1203 11:28:45.383383 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-890e-account-create-update-krpq7" podStartSLOduration=15.383345056 podStartE2EDuration="15.383345056s" podCreationTimestamp="2025-12-03 11:28:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:28:45.359186961 +0000 UTC m=+1509.195115435" watchObservedRunningTime="2025-12-03 11:28:45.383345056 +0000 UTC m=+1509.219273520" Dec 03 11:28:45 crc kubenswrapper[4702]: I1203 11:28:45.395421 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-5v7lw-config-bt5gr" podStartSLOduration=16.395389257 podStartE2EDuration="16.395389257s" podCreationTimestamp="2025-12-03 11:28:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:28:45.376965325 +0000 UTC m=+1509.212893789" watchObservedRunningTime="2025-12-03 11:28:45.395389257 +0000 UTC m=+1509.231317721" Dec 03 11:28:45 crc kubenswrapper[4702]: I1203 11:28:45.415935 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-nd8sv" podStartSLOduration=16.415905098 podStartE2EDuration="16.415905098s" podCreationTimestamp="2025-12-03 11:28:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:28:45.404261208 +0000 UTC m=+1509.240189702" watchObservedRunningTime="2025-12-03 11:28:45.415905098 +0000 UTC m=+1509.251833562" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.008111 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-5cea-account-create-update-mjzq9" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.091785 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjrkg\" (UniqueName: \"kubernetes.io/projected/5d7de17e-a277-4ec1-b0ca-d45be9fd054c-kube-api-access-fjrkg\") pod \"5d7de17e-a277-4ec1-b0ca-d45be9fd054c\" (UID: \"5d7de17e-a277-4ec1-b0ca-d45be9fd054c\") " Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.092007 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d7de17e-a277-4ec1-b0ca-d45be9fd054c-operator-scripts\") pod \"5d7de17e-a277-4ec1-b0ca-d45be9fd054c\" (UID: \"5d7de17e-a277-4ec1-b0ca-d45be9fd054c\") " Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.095363 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d7de17e-a277-4ec1-b0ca-d45be9fd054c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d7de17e-a277-4ec1-b0ca-d45be9fd054c" (UID: "5d7de17e-a277-4ec1-b0ca-d45be9fd054c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.108191 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d7de17e-a277-4ec1-b0ca-d45be9fd054c-kube-api-access-fjrkg" (OuterVolumeSpecName: "kube-api-access-fjrkg") pod "5d7de17e-a277-4ec1-b0ca-d45be9fd054c" (UID: "5d7de17e-a277-4ec1-b0ca-d45be9fd054c"). InnerVolumeSpecName "kube-api-access-fjrkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.191884 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Dec 03 11:28:46 crc kubenswrapper[4702]: E1203 11:28:46.192879 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d7de17e-a277-4ec1-b0ca-d45be9fd054c" containerName="mariadb-account-create-update" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.192910 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d7de17e-a277-4ec1-b0ca-d45be9fd054c" containerName="mariadb-account-create-update" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.193274 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d7de17e-a277-4ec1-b0ca-d45be9fd054c" containerName="mariadb-account-create-update" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.194841 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.196719 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjrkg\" (UniqueName: \"kubernetes.io/projected/5d7de17e-a277-4ec1-b0ca-d45be9fd054c-kube-api-access-fjrkg\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.196786 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d7de17e-a277-4ec1-b0ca-d45be9fd054c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.211332 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.247547 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-cae4-account-create-update-l5sfj" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.253892 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.298909 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6e2ad9-6424-4857-8dc6-a67f7758151d-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"7b6e2ad9-6424-4857-8dc6-a67f7758151d\") " pod="openstack/mysqld-exporter-0" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.299019 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6e2ad9-6424-4857-8dc6-a67f7758151d-config-data\") pod \"mysqld-exporter-0\" (UID: \"7b6e2ad9-6424-4857-8dc6-a67f7758151d\") " pod="openstack/mysqld-exporter-0" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.299065 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd7jp\" (UniqueName: \"kubernetes.io/projected/7b6e2ad9-6424-4857-8dc6-a67f7758151d-kube-api-access-wd7jp\") pod \"mysqld-exporter-0\" (UID: \"7b6e2ad9-6424-4857-8dc6-a67f7758151d\") " pod="openstack/mysqld-exporter-0" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.338296 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-5cea-account-create-update-mjzq9" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.340012 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-5cea-account-create-update-mjzq9" event={"ID":"5d7de17e-a277-4ec1-b0ca-d45be9fd054c","Type":"ContainerDied","Data":"e79f7ae35213dc7137e11082f837cb0bc1634ec10f3bd46cf66321b216d773eb"} Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.340074 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e79f7ae35213dc7137e11082f837cb0bc1634ec10f3bd46cf66321b216d773eb" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.355852 4702 generic.go:334] "Generic (PLEG): container finished" podID="654aa8a0-852c-4aae-b72d-0ca4eb991a77" containerID="092848727cd1175a91b12c135ef3464d9b083cb15ee1605161dace487d55e8d8" exitCode=0 Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.355981 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-3858-account-create-update-58hz9" event={"ID":"654aa8a0-852c-4aae-b72d-0ca4eb991a77","Type":"ContainerDied","Data":"092848727cd1175a91b12c135ef3464d9b083cb15ee1605161dace487d55e8d8"} Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.359156 4702 generic.go:334] "Generic (PLEG): container finished" podID="583afa7e-d99c-4768-af3b-018884bde7a9" containerID="8a4ee33f948589467c0af31fc62697afcbea2ea853b560f7e022894353a8c1c2" exitCode=0 Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.359211 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5v7lw-config-bt5gr" event={"ID":"583afa7e-d99c-4768-af3b-018884bde7a9","Type":"ContainerDied","Data":"8a4ee33f948589467c0af31fc62697afcbea2ea853b560f7e022894353a8c1c2"} Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.361846 4702 generic.go:334] "Generic (PLEG): container finished" podID="400f59d1-8b01-45e9-84b0-173a5fa761d5" containerID="4889216c64cf85d1cc219dc1dd0cd7afeda3c3f919c0c9682402f61aa13bff66" exitCode=0 
Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.361913 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nd8sv" event={"ID":"400f59d1-8b01-45e9-84b0-173a5fa761d5","Type":"ContainerDied","Data":"4889216c64cf85d1cc219dc1dd0cd7afeda3c3f919c0c9682402f61aa13bff66"} Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.363964 4702 generic.go:334] "Generic (PLEG): container finished" podID="a867bd18-48a7-462b-b4ed-6d103ccf80bf" containerID="7aaa90fc8aa5bb535d369a89af9f3c4141012d2247ec20ec2a2986df0de06c93" exitCode=0 Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.364182 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-890e-account-create-update-krpq7" event={"ID":"a867bd18-48a7-462b-b4ed-6d103ccf80bf","Type":"ContainerDied","Data":"7aaa90fc8aa5bb535d369a89af9f3c4141012d2247ec20ec2a2986df0de06c93"} Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.369642 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-cae4-account-create-update-l5sfj" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.370418 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-cae4-account-create-update-l5sfj" event={"ID":"7ccf575d-baf1-476f-bcba-41c45119c970","Type":"ContainerDied","Data":"f822bdef3b2d425d787ee8747ba04a6a91ec1ed3505497740d0cccf00738c516"} Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.370504 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f822bdef3b2d425d787ee8747ba04a6a91ec1ed3505497740d0cccf00738c516" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.408642 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsm2r\" (UniqueName: \"kubernetes.io/projected/7ccf575d-baf1-476f-bcba-41c45119c970-kube-api-access-hsm2r\") pod \"7ccf575d-baf1-476f-bcba-41c45119c970\" (UID: \"7ccf575d-baf1-476f-bcba-41c45119c970\") " Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.408869 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ccf575d-baf1-476f-bcba-41c45119c970-operator-scripts\") pod \"7ccf575d-baf1-476f-bcba-41c45119c970\" (UID: \"7ccf575d-baf1-476f-bcba-41c45119c970\") " Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.410702 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd7jp\" (UniqueName: \"kubernetes.io/projected/7b6e2ad9-6424-4857-8dc6-a67f7758151d-kube-api-access-wd7jp\") pod \"mysqld-exporter-0\" (UID: \"7b6e2ad9-6424-4857-8dc6-a67f7758151d\") " pod="openstack/mysqld-exporter-0" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.413334 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ccf575d-baf1-476f-bcba-41c45119c970-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ccf575d-baf1-476f-bcba-41c45119c970" (UID: "7ccf575d-baf1-476f-bcba-41c45119c970"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.422279 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6e2ad9-6424-4857-8dc6-a67f7758151d-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"7b6e2ad9-6424-4857-8dc6-a67f7758151d\") " pod="openstack/mysqld-exporter-0" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.422596 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6e2ad9-6424-4857-8dc6-a67f7758151d-config-data\") pod \"mysqld-exporter-0\" (UID: \"7b6e2ad9-6424-4857-8dc6-a67f7758151d\") " pod="openstack/mysqld-exporter-0" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.425406 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ccf575d-baf1-476f-bcba-41c45119c970-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.435207 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ccf575d-baf1-476f-bcba-41c45119c970-kube-api-access-hsm2r" (OuterVolumeSpecName: "kube-api-access-hsm2r") pod "7ccf575d-baf1-476f-bcba-41c45119c970" (UID: "7ccf575d-baf1-476f-bcba-41c45119c970"). InnerVolumeSpecName "kube-api-access-hsm2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.437329 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6e2ad9-6424-4857-8dc6-a67f7758151d-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"7b6e2ad9-6424-4857-8dc6-a67f7758151d\") " pod="openstack/mysqld-exporter-0" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.437555 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6e2ad9-6424-4857-8dc6-a67f7758151d-config-data\") pod \"mysqld-exporter-0\" (UID: \"7b6e2ad9-6424-4857-8dc6-a67f7758151d\") " pod="openstack/mysqld-exporter-0" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.457652 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd7jp\" (UniqueName: \"kubernetes.io/projected/7b6e2ad9-6424-4857-8dc6-a67f7758151d-kube-api-access-wd7jp\") pod \"mysqld-exporter-0\" (UID: \"7b6e2ad9-6424-4857-8dc6-a67f7758151d\") " pod="openstack/mysqld-exporter-0" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.536498 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsm2r\" (UniqueName: \"kubernetes.io/projected/7ccf575d-baf1-476f-bcba-41c45119c970-kube-api-access-hsm2r\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:46 crc kubenswrapper[4702]: I1203 11:28:46.567716 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.079037 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-nccll" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.260281 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb0e1ee0-9045-4d23-85fb-1a79b18242c4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eb0e1ee0-9045-4d23-85fb-1a79b18242c4" (UID: "eb0e1ee0-9045-4d23-85fb-1a79b18242c4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.259475 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb0e1ee0-9045-4d23-85fb-1a79b18242c4-operator-scripts\") pod \"eb0e1ee0-9045-4d23-85fb-1a79b18242c4\" (UID: \"eb0e1ee0-9045-4d23-85fb-1a79b18242c4\") " Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.260475 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kls48\" (UniqueName: \"kubernetes.io/projected/eb0e1ee0-9045-4d23-85fb-1a79b18242c4-kube-api-access-kls48\") pod \"eb0e1ee0-9045-4d23-85fb-1a79b18242c4\" (UID: \"eb0e1ee0-9045-4d23-85fb-1a79b18242c4\") " Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.261412 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb0e1ee0-9045-4d23-85fb-1a79b18242c4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.337197 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb0e1ee0-9045-4d23-85fb-1a79b18242c4-kube-api-access-kls48" (OuterVolumeSpecName: "kube-api-access-kls48") pod "eb0e1ee0-9045-4d23-85fb-1a79b18242c4" (UID: "eb0e1ee0-9045-4d23-85fb-1a79b18242c4"). InnerVolumeSpecName "kube-api-access-kls48". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.370237 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kls48\" (UniqueName: \"kubernetes.io/projected/eb0e1ee0-9045-4d23-85fb-1a79b18242c4-kube-api-access-kls48\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.421843 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-pdhc7" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.428068 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fv4ks" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.455073 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-nccll" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.456291 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-nccll" event={"ID":"eb0e1ee0-9045-4d23-85fb-1a79b18242c4","Type":"ContainerDied","Data":"72b8f33e8ad2df81cda07f861b7a1685408c00030a1a14d3858a59a3f846e7cf"} Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.456369 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72b8f33e8ad2df81cda07f861b7a1685408c00030a1a14d3858a59a3f846e7cf" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.489996 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-pdhc7" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.490552 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-pdhc7" event={"ID":"ae1887ec-19fc-43a3-ab93-481f83e4a190","Type":"ContainerDied","Data":"86b5b8b6c3150fbba3bcec14d99812f5c464fc09455e020fb27686c637e14c26"} Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.490660 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86b5b8b6c3150fbba3bcec14d99812f5c464fc09455e020fb27686c637e14c26" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.580746 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae1887ec-19fc-43a3-ab93-481f83e4a190-operator-scripts\") pod \"ae1887ec-19fc-43a3-ab93-481f83e4a190\" (UID: \"ae1887ec-19fc-43a3-ab93-481f83e4a190\") " Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.580922 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e67ff48c-39fd-461b-8858-56bd0150cb46-operator-scripts\") pod \"e67ff48c-39fd-461b-8858-56bd0150cb46\" (UID: \"e67ff48c-39fd-461b-8858-56bd0150cb46\") " Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.580976 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slbbp\" (UniqueName: \"kubernetes.io/projected/ae1887ec-19fc-43a3-ab93-481f83e4a190-kube-api-access-slbbp\") pod \"ae1887ec-19fc-43a3-ab93-481f83e4a190\" (UID: \"ae1887ec-19fc-43a3-ab93-481f83e4a190\") " Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.581254 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpwpt\" (UniqueName: \"kubernetes.io/projected/e67ff48c-39fd-461b-8858-56bd0150cb46-kube-api-access-jpwpt\") pod \"e67ff48c-39fd-461b-8858-56bd0150cb46\" (UID: \"e67ff48c-39fd-461b-8858-56bd0150cb46\") " Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.583769 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e67ff48c-39fd-461b-8858-56bd0150cb46-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e67ff48c-39fd-461b-8858-56bd0150cb46" (UID: "e67ff48c-39fd-461b-8858-56bd0150cb46"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.585217 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae1887ec-19fc-43a3-ab93-481f83e4a190-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ae1887ec-19fc-43a3-ab93-481f83e4a190" (UID: "ae1887ec-19fc-43a3-ab93-481f83e4a190"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.604567 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.631034 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e67ff48c-39fd-461b-8858-56bd0150cb46-kube-api-access-jpwpt" (OuterVolumeSpecName: "kube-api-access-jpwpt") pod "e67ff48c-39fd-461b-8858-56bd0150cb46" (UID: "e67ff48c-39fd-461b-8858-56bd0150cb46"). InnerVolumeSpecName "kube-api-access-jpwpt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.637926 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae1887ec-19fc-43a3-ab93-481f83e4a190-kube-api-access-slbbp" (OuterVolumeSpecName: "kube-api-access-slbbp") pod "ae1887ec-19fc-43a3-ab93-481f83e4a190" (UID: "ae1887ec-19fc-43a3-ab93-481f83e4a190"). InnerVolumeSpecName "kube-api-access-slbbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.677386 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v4l92"] Dec 03 11:28:47 crc kubenswrapper[4702]: E1203 11:28:47.678262 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ccf575d-baf1-476f-bcba-41c45119c970" containerName="mariadb-account-create-update" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.678282 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ccf575d-baf1-476f-bcba-41c45119c970" containerName="mariadb-account-create-update" Dec 03 11:28:47 crc kubenswrapper[4702]: E1203 11:28:47.678326 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e67ff48c-39fd-461b-8858-56bd0150cb46" containerName="mariadb-database-create" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.678336 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="e67ff48c-39fd-461b-8858-56bd0150cb46" containerName="mariadb-database-create" Dec 03 11:28:47 crc kubenswrapper[4702]: E1203 11:28:47.678364 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae1887ec-19fc-43a3-ab93-481f83e4a190" containerName="mariadb-database-create" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.678372 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae1887ec-19fc-43a3-ab93-481f83e4a190" containerName="mariadb-database-create" Dec 03 11:28:47 crc kubenswrapper[4702]: E1203 11:28:47.678413 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb0e1ee0-9045-4d23-85fb-1a79b18242c4" containerName="mariadb-database-create" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.678422 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb0e1ee0-9045-4d23-85fb-1a79b18242c4" containerName="mariadb-database-create" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.678877 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb0e1ee0-9045-4d23-85fb-1a79b18242c4" containerName="mariadb-database-create" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.678908 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae1887ec-19fc-43a3-ab93-481f83e4a190" containerName="mariadb-database-create" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.678939 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="e67ff48c-39fd-461b-8858-56bd0150cb46" containerName="mariadb-database-create" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.678962 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ccf575d-baf1-476f-bcba-41c45119c970" containerName="mariadb-account-create-update" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.681854 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v4l92" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.684876 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpwpt\" (UniqueName: \"kubernetes.io/projected/e67ff48c-39fd-461b-8858-56bd0150cb46-kube-api-access-jpwpt\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.684935 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae1887ec-19fc-43a3-ab93-481f83e4a190-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.684951 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e67ff48c-39fd-461b-8858-56bd0150cb46-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.684962 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slbbp\" (UniqueName: \"kubernetes.io/projected/ae1887ec-19fc-43a3-ab93-481f83e4a190-kube-api-access-slbbp\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.715856 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v4l92"] Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.786880 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b14ee971-d6bc-478a-982a-998bce6d15a1-utilities\") pod \"community-operators-v4l92\" (UID: \"b14ee971-d6bc-478a-982a-998bce6d15a1\") " pod="openshift-marketplace/community-operators-v4l92" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.787061 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b14ee971-d6bc-478a-982a-998bce6d15a1-catalog-content\") pod \"community-operators-v4l92\" (UID: \"b14ee971-d6bc-478a-982a-998bce6d15a1\") " pod="openshift-marketplace/community-operators-v4l92" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.787116 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6brr\" (UniqueName: \"kubernetes.io/projected/b14ee971-d6bc-478a-982a-998bce6d15a1-kube-api-access-q6brr\") pod \"community-operators-v4l92\" (UID: \"b14ee971-d6bc-478a-982a-998bce6d15a1\") " pod="openshift-marketplace/community-operators-v4l92" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.890435 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b14ee971-d6bc-478a-982a-998bce6d15a1-utilities\") pod \"community-operators-v4l92\" (UID: \"b14ee971-d6bc-478a-982a-998bce6d15a1\") " pod="openshift-marketplace/community-operators-v4l92" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.890610 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b14ee971-d6bc-478a-982a-998bce6d15a1-catalog-content\") pod \"community-operators-v4l92\" (UID: \"b14ee971-d6bc-478a-982a-998bce6d15a1\") " pod="openshift-marketplace/community-operators-v4l92" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.890654 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-q6brr\" (UniqueName: \"kubernetes.io/projected/b14ee971-d6bc-478a-982a-998bce6d15a1-kube-api-access-q6brr\") pod \"community-operators-v4l92\" (UID: \"b14ee971-d6bc-478a-982a-998bce6d15a1\") " pod="openshift-marketplace/community-operators-v4l92" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.891656 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b14ee971-d6bc-478a-982a-998bce6d15a1-catalog-content\") pod \"community-operators-v4l92\" (UID: \"b14ee971-d6bc-478a-982a-998bce6d15a1\") " pod="openshift-marketplace/community-operators-v4l92" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.891692 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b14ee971-d6bc-478a-982a-998bce6d15a1-utilities\") pod \"community-operators-v4l92\" (UID: \"b14ee971-d6bc-478a-982a-998bce6d15a1\") " pod="openshift-marketplace/community-operators-v4l92" Dec 03 11:28:47 crc kubenswrapper[4702]: I1203 11:28:47.927909 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6brr\" (UniqueName: \"kubernetes.io/projected/b14ee971-d6bc-478a-982a-998bce6d15a1-kube-api-access-q6brr\") pod \"community-operators-v4l92\" (UID: \"b14ee971-d6bc-478a-982a-998bce6d15a1\") " pod="openshift-marketplace/community-operators-v4l92" Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.113494 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v4l92" Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.231163 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nd8sv" Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.313360 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxplh\" (UniqueName: \"kubernetes.io/projected/400f59d1-8b01-45e9-84b0-173a5fa761d5-kube-api-access-mxplh\") pod \"400f59d1-8b01-45e9-84b0-173a5fa761d5\" (UID: \"400f59d1-8b01-45e9-84b0-173a5fa761d5\") " Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.314297 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/400f59d1-8b01-45e9-84b0-173a5fa761d5-operator-scripts\") pod \"400f59d1-8b01-45e9-84b0-173a5fa761d5\" (UID: \"400f59d1-8b01-45e9-84b0-173a5fa761d5\") " Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.318719 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/400f59d1-8b01-45e9-84b0-173a5fa761d5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "400f59d1-8b01-45e9-84b0-173a5fa761d5" (UID: "400f59d1-8b01-45e9-84b0-173a5fa761d5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.356488 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/400f59d1-8b01-45e9-84b0-173a5fa761d5-kube-api-access-mxplh" (OuterVolumeSpecName: "kube-api-access-mxplh") pod "400f59d1-8b01-45e9-84b0-173a5fa761d5" (UID: "400f59d1-8b01-45e9-84b0-173a5fa761d5"). InnerVolumeSpecName "kube-api-access-mxplh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.440386 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxplh\" (UniqueName: \"kubernetes.io/projected/400f59d1-8b01-45e9-84b0-173a5fa761d5-kube-api-access-mxplh\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.440433 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/400f59d1-8b01-45e9-84b0-173a5fa761d5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.636992 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fv4ks" event={"ID":"e67ff48c-39fd-461b-8858-56bd0150cb46","Type":"ContainerDied","Data":"2e99cb7deeefa80e50516d44f97f92e44b29552e8b16f7db75d1990130c6618b"} Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.637443 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e99cb7deeefa80e50516d44f97f92e44b29552e8b16f7db75d1990130c6618b" Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.637558 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fv4ks" Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.650698 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nd8sv" event={"ID":"400f59d1-8b01-45e9-84b0-173a5fa761d5","Type":"ContainerDied","Data":"cff937c30923dd2bd334c22b107fba533cd7f048d74543a6334344ff36e8df73"} Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.650775 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cff937c30923dd2bd334c22b107fba533cd7f048d74543a6334344ff36e8df73" Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.650965 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nd8sv" Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.689342 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"7b6e2ad9-6424-4857-8dc6-a67f7758151d","Type":"ContainerStarted","Data":"9fbe706efb965354fb4447398a77d679fdc2f5b4ae35dccea5ea498823352030"} Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.916831 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-890e-account-create-update-krpq7" Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.936494 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5v7lw-config-bt5gr" Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.984019 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-3858-account-create-update-58hz9" Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.994611 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a867bd18-48a7-462b-b4ed-6d103ccf80bf-operator-scripts\") pod \"a867bd18-48a7-462b-b4ed-6d103ccf80bf\" (UID: \"a867bd18-48a7-462b-b4ed-6d103ccf80bf\") " Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.994654 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/583afa7e-d99c-4768-af3b-018884bde7a9-additional-scripts\") pod \"583afa7e-d99c-4768-af3b-018884bde7a9\" (UID: \"583afa7e-d99c-4768-af3b-018884bde7a9\") " Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.994782 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6524\" (UniqueName: \"kubernetes.io/projected/a867bd18-48a7-462b-b4ed-6d103ccf80bf-kube-api-access-t6524\") pod \"a867bd18-48a7-462b-b4ed-6d103ccf80bf\" (UID: \"a867bd18-48a7-462b-b4ed-6d103ccf80bf\") " Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.994820 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/583afa7e-d99c-4768-af3b-018884bde7a9-var-log-ovn\") pod \"583afa7e-d99c-4768-af3b-018884bde7a9\" (UID: \"583afa7e-d99c-4768-af3b-018884bde7a9\") " Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.994886 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5qbn\" (UniqueName: \"kubernetes.io/projected/583afa7e-d99c-4768-af3b-018884bde7a9-kube-api-access-p5qbn\") pod \"583afa7e-d99c-4768-af3b-018884bde7a9\" (UID: \"583afa7e-d99c-4768-af3b-018884bde7a9\") " Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.994954 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/583afa7e-d99c-4768-af3b-018884bde7a9-scripts\") pod \"583afa7e-d99c-4768-af3b-018884bde7a9\" (UID: \"583afa7e-d99c-4768-af3b-018884bde7a9\") " Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.995002 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/583afa7e-d99c-4768-af3b-018884bde7a9-var-run\") pod \"583afa7e-d99c-4768-af3b-018884bde7a9\" (UID: \"583afa7e-d99c-4768-af3b-018884bde7a9\") " Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.995044 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/583afa7e-d99c-4768-af3b-018884bde7a9-var-run-ovn\") pod \"583afa7e-d99c-4768-af3b-018884bde7a9\" (UID: \"583afa7e-d99c-4768-af3b-018884bde7a9\") " Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.996926 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a867bd18-48a7-462b-b4ed-6d103ccf80bf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a867bd18-48a7-462b-b4ed-6d103ccf80bf" (UID: "a867bd18-48a7-462b-b4ed-6d103ccf80bf"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.997433 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/583afa7e-d99c-4768-af3b-018884bde7a9-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "583afa7e-d99c-4768-af3b-018884bde7a9" (UID: "583afa7e-d99c-4768-af3b-018884bde7a9"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.997967 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/583afa7e-d99c-4768-af3b-018884bde7a9-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "583afa7e-d99c-4768-af3b-018884bde7a9" (UID: "583afa7e-d99c-4768-af3b-018884bde7a9"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:28:48 crc kubenswrapper[4702]: I1203 11:28:48.998046 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/583afa7e-d99c-4768-af3b-018884bde7a9-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "583afa7e-d99c-4768-af3b-018884bde7a9" (UID: "583afa7e-d99c-4768-af3b-018884bde7a9"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.002469 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/583afa7e-d99c-4768-af3b-018884bde7a9-var-run" (OuterVolumeSpecName: "var-run") pod "583afa7e-d99c-4768-af3b-018884bde7a9" (UID: "583afa7e-d99c-4768-af3b-018884bde7a9"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.004131 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/583afa7e-d99c-4768-af3b-018884bde7a9-scripts" (OuterVolumeSpecName: "scripts") pod "583afa7e-d99c-4768-af3b-018884bde7a9" (UID: "583afa7e-d99c-4768-af3b-018884bde7a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.017101 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a867bd18-48a7-462b-b4ed-6d103ccf80bf-kube-api-access-t6524" (OuterVolumeSpecName: "kube-api-access-t6524") pod "a867bd18-48a7-462b-b4ed-6d103ccf80bf" (UID: "a867bd18-48a7-462b-b4ed-6d103ccf80bf"). InnerVolumeSpecName "kube-api-access-t6524". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.042208 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/583afa7e-d99c-4768-af3b-018884bde7a9-kube-api-access-p5qbn" (OuterVolumeSpecName: "kube-api-access-p5qbn") pod "583afa7e-d99c-4768-af3b-018884bde7a9" (UID: "583afa7e-d99c-4768-af3b-018884bde7a9"). InnerVolumeSpecName "kube-api-access-p5qbn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.097473 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/654aa8a0-852c-4aae-b72d-0ca4eb991a77-operator-scripts\") pod \"654aa8a0-852c-4aae-b72d-0ca4eb991a77\" (UID: \"654aa8a0-852c-4aae-b72d-0ca4eb991a77\") " Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.097628 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88hdl\" (UniqueName: \"kubernetes.io/projected/654aa8a0-852c-4aae-b72d-0ca4eb991a77-kube-api-access-88hdl\") pod \"654aa8a0-852c-4aae-b72d-0ca4eb991a77\" (UID: \"654aa8a0-852c-4aae-b72d-0ca4eb991a77\") " Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.098266 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5qbn\" (UniqueName: \"kubernetes.io/projected/583afa7e-d99c-4768-af3b-018884bde7a9-kube-api-access-p5qbn\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.098287 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/583afa7e-d99c-4768-af3b-018884bde7a9-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.098299 4702 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/583afa7e-d99c-4768-af3b-018884bde7a9-var-run\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.098311 4702 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/583afa7e-d99c-4768-af3b-018884bde7a9-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.098326 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a867bd18-48a7-462b-b4ed-6d103ccf80bf-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.098337 4702 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/583afa7e-d99c-4768-af3b-018884bde7a9-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.098349 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6524\" (UniqueName: \"kubernetes.io/projected/a867bd18-48a7-462b-b4ed-6d103ccf80bf-kube-api-access-t6524\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.098361 4702 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/583afa7e-d99c-4768-af3b-018884bde7a9-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.099729 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/654aa8a0-852c-4aae-b72d-0ca4eb991a77-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "654aa8a0-852c-4aae-b72d-0ca4eb991a77" (UID: "654aa8a0-852c-4aae-b72d-0ca4eb991a77"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.115219 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/654aa8a0-852c-4aae-b72d-0ca4eb991a77-kube-api-access-88hdl" (OuterVolumeSpecName: "kube-api-access-88hdl") pod "654aa8a0-852c-4aae-b72d-0ca4eb991a77" (UID: "654aa8a0-852c-4aae-b72d-0ca4eb991a77"). InnerVolumeSpecName "kube-api-access-88hdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.185493 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v4l92"] Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.201509 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88hdl\" (UniqueName: \"kubernetes.io/projected/654aa8a0-852c-4aae-b72d-0ca4eb991a77-kube-api-access-88hdl\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.201560 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/654aa8a0-852c-4aae-b72d-0ca4eb991a77-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:49 crc kubenswrapper[4702]: W1203 11:28:49.334507 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb14ee971_d6bc_478a_982a_998bce6d15a1.slice/crio-bdd6afc006ffa33ee84de530bef86fe73c0f957f322990c72207ad2c7347a578 WatchSource:0}: Error finding container bdd6afc006ffa33ee84de530bef86fe73c0f957f322990c72207ad2c7347a578: Status 404 returned error can't find the container with id bdd6afc006ffa33ee84de530bef86fe73c0f957f322990c72207ad2c7347a578 Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.820118 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4l92" event={"ID":"b14ee971-d6bc-478a-982a-998bce6d15a1","Type":"ContainerStarted","Data":"bdd6afc006ffa33ee84de530bef86fe73c0f957f322990c72207ad2c7347a578"} Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.824085 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-3858-account-create-update-58hz9" Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.826441 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-3858-account-create-update-58hz9" event={"ID":"654aa8a0-852c-4aae-b72d-0ca4eb991a77","Type":"ContainerDied","Data":"21548799a071548ecdfbeb3ef2e679bf06179064b11a9e9d72f13d8a010a00e3"} Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.826530 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21548799a071548ecdfbeb3ef2e679bf06179064b11a9e9d72f13d8a010a00e3" Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.829824 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5v7lw-config-bt5gr" event={"ID":"583afa7e-d99c-4768-af3b-018884bde7a9","Type":"ContainerDied","Data":"3ca58de02f7811c535849464902d405cb8bcd20c7aad9db9671ff4f9cdcc7348"} Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.829876 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ca58de02f7811c535849464902d405cb8bcd20c7aad9db9671ff4f9cdcc7348" Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.831308 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5v7lw-config-bt5gr" Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.831411 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-890e-account-create-update-krpq7" event={"ID":"a867bd18-48a7-462b-b4ed-6d103ccf80bf","Type":"ContainerDied","Data":"135d95be0403527d3205796f6e01209767b4371c398955fcda47f83887b96e11"} Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.831466 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="135d95be0403527d3205796f6e01209767b4371c398955fcda47f83887b96e11" Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.831978 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-890e-account-create-update-krpq7" Dec 03 11:28:49 crc kubenswrapper[4702]: I1203 11:28:49.833898 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"90e9786f-3e0d-4a23-b624-b49a3d386784","Type":"ContainerStarted","Data":"22e4f4007ed632f0fe64347e59a4a025f3dbcd6bd9009a5c5e5e342e00b04022"} Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.049866 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-5v7lw-config-bt5gr"] Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.071673 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-5v7lw-config-bt5gr"] Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.087858 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-5v7lw-config-km7bx"] Dec 03 11:28:50 crc kubenswrapper[4702]: E1203 11:28:50.088482 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a867bd18-48a7-462b-b4ed-6d103ccf80bf" containerName="mariadb-account-create-update" Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.088506 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="a867bd18-48a7-462b-b4ed-6d103ccf80bf" containerName="mariadb-account-create-update" Dec 03 11:28:50 crc kubenswrapper[4702]: E1203 11:28:50.088559 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="583afa7e-d99c-4768-af3b-018884bde7a9" containerName="ovn-config" Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.088566 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="583afa7e-d99c-4768-af3b-018884bde7a9" containerName="ovn-config" Dec 03 11:28:50 crc kubenswrapper[4702]: E1203 11:28:50.091936 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="654aa8a0-852c-4aae-b72d-0ca4eb991a77" containerName="mariadb-account-create-update" Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.091964 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="654aa8a0-852c-4aae-b72d-0ca4eb991a77" containerName="mariadb-account-create-update" Dec 03 11:28:50 crc kubenswrapper[4702]: E1203 11:28:50.091986 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="400f59d1-8b01-45e9-84b0-173a5fa761d5" containerName="mariadb-database-create" Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.091996 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="400f59d1-8b01-45e9-84b0-173a5fa761d5" containerName="mariadb-database-create" Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.092352 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="400f59d1-8b01-45e9-84b0-173a5fa761d5" containerName="mariadb-database-create" Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 
11:28:50.092391 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="a867bd18-48a7-462b-b4ed-6d103ccf80bf" containerName="mariadb-account-create-update" Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.092413 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="654aa8a0-852c-4aae-b72d-0ca4eb991a77" containerName="mariadb-account-create-update" Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.092431 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="583afa7e-d99c-4768-af3b-018884bde7a9" containerName="ovn-config" Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.093600 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5v7lw-config-km7bx" Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.098492 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.105497 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift\") pod \"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") " pod="openstack/swift-storage-0" Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.118095 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5v7lw-config-km7bx"] Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.123787 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/3892571c-86ff-4259-beaa-6033dcfda204-etc-swift\") pod \"swift-storage-0\" (UID: \"3892571c-86ff-4259-beaa-6033dcfda204\") " pod="openstack/swift-storage-0" Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.208315 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-var-log-ovn\") pod \"ovn-controller-5v7lw-config-km7bx\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " pod="openstack/ovn-controller-5v7lw-config-km7bx" Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.209228 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-var-run-ovn\") pod \"ovn-controller-5v7lw-config-km7bx\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " pod="openstack/ovn-controller-5v7lw-config-km7bx" Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.209483 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2m8f\" (UniqueName: \"kubernetes.io/projected/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-kube-api-access-p2m8f\") pod \"ovn-controller-5v7lw-config-km7bx\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " pod="openstack/ovn-controller-5v7lw-config-km7bx" Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.209736 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-additional-scripts\") pod \"ovn-controller-5v7lw-config-km7bx\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " pod="openstack/ovn-controller-5v7lw-config-km7bx" Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 
Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.210383 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-scripts\") pod \"ovn-controller-5v7lw-config-km7bx\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " pod="openstack/ovn-controller-5v7lw-config-km7bx"
Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.217630 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.319451 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-scripts\") pod \"ovn-controller-5v7lw-config-km7bx\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " pod="openstack/ovn-controller-5v7lw-config-km7bx"
Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.319665 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-var-log-ovn\") pod \"ovn-controller-5v7lw-config-km7bx\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " pod="openstack/ovn-controller-5v7lw-config-km7bx"
Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.319718 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-var-run-ovn\") pod \"ovn-controller-5v7lw-config-km7bx\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " pod="openstack/ovn-controller-5v7lw-config-km7bx"
Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.319826 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2m8f\" (UniqueName: \"kubernetes.io/projected/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-kube-api-access-p2m8f\") pod \"ovn-controller-5v7lw-config-km7bx\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " pod="openstack/ovn-controller-5v7lw-config-km7bx"
Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.319897 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-additional-scripts\") pod \"ovn-controller-5v7lw-config-km7bx\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " pod="openstack/ovn-controller-5v7lw-config-km7bx"
Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.319943 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-var-run\") pod \"ovn-controller-5v7lw-config-km7bx\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " pod="openstack/ovn-controller-5v7lw-config-km7bx"
Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.320346 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-var-run-ovn\") pod \"ovn-controller-5v7lw-config-km7bx\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " pod="openstack/ovn-controller-5v7lw-config-km7bx"
Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.320345 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-var-run\") pod \"ovn-controller-5v7lw-config-km7bx\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " pod="openstack/ovn-controller-5v7lw-config-km7bx"
Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.320442 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-var-log-ovn\") pod \"ovn-controller-5v7lw-config-km7bx\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " pod="openstack/ovn-controller-5v7lw-config-km7bx"
Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.322263 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-additional-scripts\") pod \"ovn-controller-5v7lw-config-km7bx\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " pod="openstack/ovn-controller-5v7lw-config-km7bx"
Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.336409 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-scripts\") pod \"ovn-controller-5v7lw-config-km7bx\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " pod="openstack/ovn-controller-5v7lw-config-km7bx"
Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.341982 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2m8f\" (UniqueName: \"kubernetes.io/projected/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-kube-api-access-p2m8f\") pod \"ovn-controller-5v7lw-config-km7bx\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " pod="openstack/ovn-controller-5v7lw-config-km7bx"
Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.492950 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5v7lw-config-km7bx"
Need to start a new one" pod="openstack/ovn-controller-5v7lw-config-km7bx" Dec 03 11:28:50 crc kubenswrapper[4702]: I1203 11:28:50.950159 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="583afa7e-d99c-4768-af3b-018884bde7a9" path="/var/lib/kubelet/pods/583afa7e-d99c-4768-af3b-018884bde7a9/volumes" Dec 03 11:28:55 crc kubenswrapper[4702]: I1203 11:28:55.907957 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:28:55 crc kubenswrapper[4702]: I1203 11:28:55.908668 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:28:55 crc kubenswrapper[4702]: I1203 11:28:55.908722 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:28:55 crc kubenswrapper[4702]: I1203 11:28:55.910264 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"51258198d84bd78a94c0b0549a633061f0278b4c24ceaa27b8c81a77f3277a36"} pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 11:28:55 crc kubenswrapper[4702]: I1203 11:28:55.910343 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" containerID="cri-o://51258198d84bd78a94c0b0549a633061f0278b4c24ceaa27b8c81a77f3277a36" gracePeriod=600 Dec 03 11:28:56 crc kubenswrapper[4702]: I1203 11:28:56.120629 4702 generic.go:334] "Generic (PLEG): container finished" podID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerID="51258198d84bd78a94c0b0549a633061f0278b4c24ceaa27b8c81a77f3277a36" exitCode=0 Dec 03 11:28:56 crc kubenswrapper[4702]: I1203 11:28:56.120893 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerDied","Data":"51258198d84bd78a94c0b0549a633061f0278b4c24ceaa27b8c81a77f3277a36"} Dec 03 11:28:56 crc kubenswrapper[4702]: I1203 11:28:56.121035 4702 scope.go:117] "RemoveContainer" containerID="3ceadc6ee9df8857f4d601383951a406af62e876eb34d70ea87811eb3741ae2d" Dec 03 11:28:56 crc kubenswrapper[4702]: I1203 11:28:56.317510 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5v7lw-config-km7bx"] Dec 03 11:28:56 crc kubenswrapper[4702]: I1203 11:28:56.536903 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 11:28:57.134456 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jm662" event={"ID":"3e2598e5-2410-4a5b-b8b5-f41239c131d1","Type":"ContainerStarted","Data":"87d477f984f9300b594e99220e09af0f886c163b126f4c536f666d04ad1a8659"} Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 
11:28:57.136605 4702 generic.go:334] "Generic (PLEG): container finished" podID="b14ee971-d6bc-478a-982a-998bce6d15a1" containerID="c686f29a77a538f6b1cf5e7794e3b873eee6109ca2704c0f01f176cfe26474b3" exitCode=0 Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 11:28:57.136696 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4l92" event={"ID":"b14ee971-d6bc-478a-982a-998bce6d15a1","Type":"ContainerDied","Data":"c686f29a77a538f6b1cf5e7794e3b873eee6109ca2704c0f01f176cfe26474b3"} Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 11:28:57.141198 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03"} Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 11:28:57.143119 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3892571c-86ff-4259-beaa-6033dcfda204","Type":"ContainerStarted","Data":"0ac6a172b25f8544740535938cc2b8c6e0c2ec4f44a5cbad9a8dda102fcc0b9f"} Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 11:28:57.148068 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"7b6e2ad9-6424-4857-8dc6-a67f7758151d","Type":"ContainerStarted","Data":"65e88a36ec60dd090abb29138f8855e6623630aecc65d8e73f2d3c3d6e0e0e09"} Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 11:28:57.150427 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5v7lw-config-km7bx" event={"ID":"768ad21b-ecc7-4ede-a09b-d87b376d9bcd","Type":"ContainerStarted","Data":"38f6eadf1b9144c46cc2393ac1c4364175bd42eb25b7113aedbb600488c1ea50"} Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 11:28:57.150480 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5v7lw-config-km7bx" event={"ID":"768ad21b-ecc7-4ede-a09b-d87b376d9bcd","Type":"ContainerStarted","Data":"28aef438416a482154426f300469213360dfd4f1554c25b9dbd75fc472ce63a7"} Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 11:28:57.164549 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-jm662" podStartSLOduration=15.337290608 podStartE2EDuration="28.164522488s" podCreationTimestamp="2025-12-03 11:28:29 +0000 UTC" firstStartedPulling="2025-12-03 11:28:42.791920806 +0000 UTC m=+1506.627849270" lastFinishedPulling="2025-12-03 11:28:55.619152686 +0000 UTC m=+1519.455081150" observedRunningTime="2025-12-03 11:28:57.155872093 +0000 UTC m=+1520.991800567" watchObservedRunningTime="2025-12-03 11:28:57.164522488 +0000 UTC m=+1521.000450942" Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 11:28:57.195820 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=3.128614797 podStartE2EDuration="11.195788994s" podCreationTimestamp="2025-12-03 11:28:46 +0000 UTC" firstStartedPulling="2025-12-03 11:28:47.666698039 +0000 UTC m=+1511.502626503" lastFinishedPulling="2025-12-03 11:28:55.733872236 +0000 UTC m=+1519.569800700" observedRunningTime="2025-12-03 11:28:57.179099211 +0000 UTC m=+1521.015027675" watchObservedRunningTime="2025-12-03 11:28:57.195788994 +0000 UTC m=+1521.031717458" Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 11:28:57.306905 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9k8cb"] Dec 03 11:28:57 crc 
Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 11:28:57.316375 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9k8cb"
Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 11:28:57.401626 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9k8cb"]
Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 11:28:57.484404 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3a34cc-85f8-4aa9-b88e-b582a2558b9c-utilities\") pod \"redhat-marketplace-9k8cb\" (UID: \"ba3a34cc-85f8-4aa9-b88e-b582a2558b9c\") " pod="openshift-marketplace/redhat-marketplace-9k8cb"
Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 11:28:57.484475 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5bbw\" (UniqueName: \"kubernetes.io/projected/ba3a34cc-85f8-4aa9-b88e-b582a2558b9c-kube-api-access-g5bbw\") pod \"redhat-marketplace-9k8cb\" (UID: \"ba3a34cc-85f8-4aa9-b88e-b582a2558b9c\") " pod="openshift-marketplace/redhat-marketplace-9k8cb"
Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 11:28:57.485122 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3a34cc-85f8-4aa9-b88e-b582a2558b9c-catalog-content\") pod \"redhat-marketplace-9k8cb\" (UID: \"ba3a34cc-85f8-4aa9-b88e-b582a2558b9c\") " pod="openshift-marketplace/redhat-marketplace-9k8cb"
Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 11:28:57.587956 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3a34cc-85f8-4aa9-b88e-b582a2558b9c-utilities\") pod \"redhat-marketplace-9k8cb\" (UID: \"ba3a34cc-85f8-4aa9-b88e-b582a2558b9c\") " pod="openshift-marketplace/redhat-marketplace-9k8cb"
Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 11:28:57.588037 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5bbw\" (UniqueName: \"kubernetes.io/projected/ba3a34cc-85f8-4aa9-b88e-b582a2558b9c-kube-api-access-g5bbw\") pod \"redhat-marketplace-9k8cb\" (UID: \"ba3a34cc-85f8-4aa9-b88e-b582a2558b9c\") " pod="openshift-marketplace/redhat-marketplace-9k8cb"
Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 11:28:57.588169 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3a34cc-85f8-4aa9-b88e-b582a2558b9c-catalog-content\") pod \"redhat-marketplace-9k8cb\" (UID: \"ba3a34cc-85f8-4aa9-b88e-b582a2558b9c\") " pod="openshift-marketplace/redhat-marketplace-9k8cb"
Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 11:28:57.588642 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3a34cc-85f8-4aa9-b88e-b582a2558b9c-utilities\") pod \"redhat-marketplace-9k8cb\" (UID: \"ba3a34cc-85f8-4aa9-b88e-b582a2558b9c\") " pod="openshift-marketplace/redhat-marketplace-9k8cb"
Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 11:28:57.588699 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3a34cc-85f8-4aa9-b88e-b582a2558b9c-catalog-content\") pod \"redhat-marketplace-9k8cb\" (UID: \"ba3a34cc-85f8-4aa9-b88e-b582a2558b9c\") " pod="openshift-marketplace/redhat-marketplace-9k8cb"
Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 11:28:57.620741 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5bbw\" (UniqueName: \"kubernetes.io/projected/ba3a34cc-85f8-4aa9-b88e-b582a2558b9c-kube-api-access-g5bbw\") pod \"redhat-marketplace-9k8cb\" (UID: \"ba3a34cc-85f8-4aa9-b88e-b582a2558b9c\") " pod="openshift-marketplace/redhat-marketplace-9k8cb"
Dec 03 11:28:57 crc kubenswrapper[4702]: I1203 11:28:57.655639 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9k8cb"
Dec 03 11:28:58 crc kubenswrapper[4702]: I1203 11:28:58.183412 4702 generic.go:334] "Generic (PLEG): container finished" podID="768ad21b-ecc7-4ede-a09b-d87b376d9bcd" containerID="38f6eadf1b9144c46cc2393ac1c4364175bd42eb25b7113aedbb600488c1ea50" exitCode=0
Dec 03 11:28:58 crc kubenswrapper[4702]: I1203 11:28:58.183522 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5v7lw-config-km7bx" event={"ID":"768ad21b-ecc7-4ede-a09b-d87b376d9bcd","Type":"ContainerDied","Data":"38f6eadf1b9144c46cc2393ac1c4364175bd42eb25b7113aedbb600488c1ea50"}
Dec 03 11:28:58 crc kubenswrapper[4702]: I1203 11:28:58.398166 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9k8cb"]
Dec 03 11:28:58 crc kubenswrapper[4702]: W1203 11:28:58.800113 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba3a34cc_85f8_4aa9_b88e_b582a2558b9c.slice/crio-1e46cbe54a60909661821049ed73f103e1e987b3a65f9057591f688425d5d1f9 WatchSource:0}: Error finding container 1e46cbe54a60909661821049ed73f103e1e987b3a65f9057591f688425d5d1f9: Status 404 returned error can't find the container with id 1e46cbe54a60909661821049ed73f103e1e987b3a65f9057591f688425d5d1f9
Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.213837 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hq9rm" event={"ID":"34d869ae-eae9-4bf6-b05e-cf40504ccdb6","Type":"ContainerStarted","Data":"e0cb0c5f34421fd7d055849726b234e30a1295b20d66bf6adc32dd351bc8adbe"}
Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.217329 4702 generic.go:334] "Generic (PLEG): container finished" podID="ba3a34cc-85f8-4aa9-b88e-b582a2558b9c" containerID="c492d152ed82870fbf4615e4506b014917f34cfcba2b1349c800dd9242e7cb30" exitCode=0
Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.218845 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9k8cb" event={"ID":"ba3a34cc-85f8-4aa9-b88e-b582a2558b9c","Type":"ContainerDied","Data":"c492d152ed82870fbf4615e4506b014917f34cfcba2b1349c800dd9242e7cb30"}
Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.218960 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9k8cb" event={"ID":"ba3a34cc-85f8-4aa9-b88e-b582a2558b9c","Type":"ContainerStarted","Data":"1e46cbe54a60909661821049ed73f103e1e987b3a65f9057591f688425d5d1f9"}
event={"ID":"ba3a34cc-85f8-4aa9-b88e-b582a2558b9c","Type":"ContainerStarted","Data":"1e46cbe54a60909661821049ed73f103e1e987b3a65f9057591f688425d5d1f9"} Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.239325 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-hq9rm" podStartSLOduration=7.6182495249999995 podStartE2EDuration="42.239291834s" podCreationTimestamp="2025-12-03 11:28:17 +0000 UTC" firstStartedPulling="2025-12-03 11:28:22.958462612 +0000 UTC m=+1486.794391076" lastFinishedPulling="2025-12-03 11:28:57.579504921 +0000 UTC m=+1521.415433385" observedRunningTime="2025-12-03 11:28:59.235073404 +0000 UTC m=+1523.071001878" watchObservedRunningTime="2025-12-03 11:28:59.239291834 +0000 UTC m=+1523.075220298" Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.731044 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5v7lw-config-km7bx" Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.851964 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-additional-scripts\") pod \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.852359 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2m8f\" (UniqueName: \"kubernetes.io/projected/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-kube-api-access-p2m8f\") pod \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.852471 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-var-run-ovn\") pod \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.852536 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-var-run\") pod \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.852671 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-var-log-ovn\") pod \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.852801 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-scripts\") pod \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\" (UID: \"768ad21b-ecc7-4ede-a09b-d87b376d9bcd\") " Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.852895 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "768ad21b-ecc7-4ede-a09b-d87b376d9bcd" (UID: "768ad21b-ecc7-4ede-a09b-d87b376d9bcd"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.853091 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-var-run" (OuterVolumeSpecName: "var-run") pod "768ad21b-ecc7-4ede-a09b-d87b376d9bcd" (UID: "768ad21b-ecc7-4ede-a09b-d87b376d9bcd"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.853148 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "768ad21b-ecc7-4ede-a09b-d87b376d9bcd" (UID: "768ad21b-ecc7-4ede-a09b-d87b376d9bcd"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.855166 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-scripts" (OuterVolumeSpecName: "scripts") pod "768ad21b-ecc7-4ede-a09b-d87b376d9bcd" (UID: "768ad21b-ecc7-4ede-a09b-d87b376d9bcd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.855374 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "768ad21b-ecc7-4ede-a09b-d87b376d9bcd" (UID: "768ad21b-ecc7-4ede-a09b-d87b376d9bcd"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.855539 4702 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.855557 4702 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.855569 4702 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-var-run\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.855578 4702 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.855589 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.861334 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-kube-api-access-p2m8f" (OuterVolumeSpecName: "kube-api-access-p2m8f") pod "768ad21b-ecc7-4ede-a09b-d87b376d9bcd" (UID: "768ad21b-ecc7-4ede-a09b-d87b376d9bcd"). InnerVolumeSpecName "kube-api-access-p2m8f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:28:59 crc kubenswrapper[4702]: I1203 11:28:59.958195 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2m8f\" (UniqueName: \"kubernetes.io/projected/768ad21b-ecc7-4ede-a09b-d87b376d9bcd-kube-api-access-p2m8f\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:00 crc kubenswrapper[4702]: I1203 11:29:00.232480 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5v7lw-config-km7bx" event={"ID":"768ad21b-ecc7-4ede-a09b-d87b376d9bcd","Type":"ContainerDied","Data":"28aef438416a482154426f300469213360dfd4f1554c25b9dbd75fc472ce63a7"} Dec 03 11:29:00 crc kubenswrapper[4702]: I1203 11:29:00.232866 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28aef438416a482154426f300469213360dfd4f1554c25b9dbd75fc472ce63a7" Dec 03 11:29:00 crc kubenswrapper[4702]: I1203 11:29:00.232556 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5v7lw-config-km7bx" Dec 03 11:29:00 crc kubenswrapper[4702]: I1203 11:29:00.236358 4702 generic.go:334] "Generic (PLEG): container finished" podID="b14ee971-d6bc-478a-982a-998bce6d15a1" containerID="7ab9702d1eba58bdcbd42289ca5f8483dd23bc3af2faf4d8218b3f9ea4135064" exitCode=0 Dec 03 11:29:00 crc kubenswrapper[4702]: I1203 11:29:00.236421 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4l92" event={"ID":"b14ee971-d6bc-478a-982a-998bce6d15a1","Type":"ContainerDied","Data":"7ab9702d1eba58bdcbd42289ca5f8483dd23bc3af2faf4d8218b3f9ea4135064"} Dec 03 11:29:00 crc kubenswrapper[4702]: I1203 11:29:00.244922 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3892571c-86ff-4259-beaa-6033dcfda204","Type":"ContainerStarted","Data":"43b09f44f41dd88f9f18d1e9566cbdc84c4825e2dc3d2f84d2c1e19e3ab44330"} Dec 03 11:29:00 crc kubenswrapper[4702]: I1203 11:29:00.244989 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3892571c-86ff-4259-beaa-6033dcfda204","Type":"ContainerStarted","Data":"f30772212a976fb2498258f56d5ae87a6b2175640f2c5ecb06dd2afd7b244ad0"} Dec 03 11:29:00 crc kubenswrapper[4702]: I1203 11:29:00.855539 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-5v7lw-config-km7bx"] Dec 03 11:29:00 crc kubenswrapper[4702]: I1203 11:29:00.890586 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-5v7lw-config-km7bx"] Dec 03 11:29:00 crc kubenswrapper[4702]: I1203 11:29:00.942124 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="768ad21b-ecc7-4ede-a09b-d87b376d9bcd" path="/var/lib/kubelet/pods/768ad21b-ecc7-4ede-a09b-d87b376d9bcd/volumes" Dec 03 11:29:04 crc kubenswrapper[4702]: I1203 11:29:04.290510 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3892571c-86ff-4259-beaa-6033dcfda204","Type":"ContainerStarted","Data":"3a57f04a0c0c5263e247792ade4a8b1f3222712993221daa0c354b6d6ef8cd99"} Dec 03 11:29:04 crc kubenswrapper[4702]: I1203 11:29:04.291370 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3892571c-86ff-4259-beaa-6033dcfda204","Type":"ContainerStarted","Data":"051ae7dc660a9c2902615b1052d580688c52f7a067e9fbeea42315915c50580d"} Dec 03 11:29:05 crc kubenswrapper[4702]: I1203 11:29:05.307966 4702 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-v4l92" event={"ID":"b14ee971-d6bc-478a-982a-998bce6d15a1","Type":"ContainerStarted","Data":"2bf7d47d8214510a9644b94fba3cd63a427233d94e0cb83dad93d7620cba1ae5"} Dec 03 11:29:05 crc kubenswrapper[4702]: I1203 11:29:05.310951 4702 generic.go:334] "Generic (PLEG): container finished" podID="ba3a34cc-85f8-4aa9-b88e-b582a2558b9c" containerID="a0841d7b4a957c4c389834c52e4259e2bc2af79cce72e3fd7de21a0f5d1743a3" exitCode=0 Dec 03 11:29:05 crc kubenswrapper[4702]: I1203 11:29:05.311031 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9k8cb" event={"ID":"ba3a34cc-85f8-4aa9-b88e-b582a2558b9c","Type":"ContainerDied","Data":"a0841d7b4a957c4c389834c52e4259e2bc2af79cce72e3fd7de21a0f5d1743a3"} Dec 03 11:29:05 crc kubenswrapper[4702]: I1203 11:29:05.336292 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v4l92" podStartSLOduration=11.359691399 podStartE2EDuration="18.336267785s" podCreationTimestamp="2025-12-03 11:28:47 +0000 UTC" firstStartedPulling="2025-12-03 11:28:57.139068997 +0000 UTC m=+1520.974997461" lastFinishedPulling="2025-12-03 11:29:04.115645383 +0000 UTC m=+1527.951573847" observedRunningTime="2025-12-03 11:29:05.328267209 +0000 UTC m=+1529.164195683" watchObservedRunningTime="2025-12-03 11:29:05.336267785 +0000 UTC m=+1529.172196249" Dec 03 11:29:06 crc kubenswrapper[4702]: I1203 11:29:06.327668 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3892571c-86ff-4259-beaa-6033dcfda204","Type":"ContainerStarted","Data":"c6d18c8da1aa23e1210cdfd05f5203d410c4f089358922adbb1207eb41bfe324"} Dec 03 11:29:07 crc kubenswrapper[4702]: I1203 11:29:07.348373 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3892571c-86ff-4259-beaa-6033dcfda204","Type":"ContainerStarted","Data":"6f058a33e49ddefbd67493cc4614ab46979ed8c5db5b1fe75eb60cb85e6d9176"} Dec 03 11:29:07 crc kubenswrapper[4702]: I1203 11:29:07.348713 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3892571c-86ff-4259-beaa-6033dcfda204","Type":"ContainerStarted","Data":"a687d39168ea480a3c5afddba8f1c673642f5fb3577a51f99147c29aeaf3196c"} Dec 03 11:29:07 crc kubenswrapper[4702]: I1203 11:29:07.378159 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9k8cb" event={"ID":"ba3a34cc-85f8-4aa9-b88e-b582a2558b9c","Type":"ContainerStarted","Data":"99b55e8a69f5a88d7c475162f87cbbfdfe13d61a4768ca8ab7d83e899936e6a4"} Dec 03 11:29:07 crc kubenswrapper[4702]: I1203 11:29:07.409015 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9k8cb" podStartSLOduration=3.053163097 podStartE2EDuration="10.408723857s" podCreationTimestamp="2025-12-03 11:28:57 +0000 UTC" firstStartedPulling="2025-12-03 11:28:59.22008895 +0000 UTC m=+1523.056017414" lastFinishedPulling="2025-12-03 11:29:06.57564971 +0000 UTC m=+1530.411578174" observedRunningTime="2025-12-03 11:29:07.398914949 +0000 UTC m=+1531.234843413" watchObservedRunningTime="2025-12-03 11:29:07.408723857 +0000 UTC m=+1531.244652321" Dec 03 11:29:07 crc kubenswrapper[4702]: I1203 11:29:07.656375 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9k8cb" Dec 03 11:29:07 crc kubenswrapper[4702]: I1203 11:29:07.656445 4702 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9k8cb" Dec 03 11:29:08 crc kubenswrapper[4702]: I1203 11:29:08.113699 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v4l92" Dec 03 11:29:08 crc kubenswrapper[4702]: I1203 11:29:08.114116 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v4l92" Dec 03 11:29:08 crc kubenswrapper[4702]: I1203 11:29:08.399087 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3892571c-86ff-4259-beaa-6033dcfda204","Type":"ContainerStarted","Data":"f0c1ce4c2586fb3d4769e9a05532068bce11aef0d3ad9b190f2edf61b177b212"} Dec 03 11:29:08 crc kubenswrapper[4702]: I1203 11:29:08.715222 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-9k8cb" podUID="ba3a34cc-85f8-4aa9-b88e-b582a2558b9c" containerName="registry-server" probeResult="failure" output=< Dec 03 11:29:08 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 11:29:08 crc kubenswrapper[4702]: > Dec 03 11:29:09 crc kubenswrapper[4702]: I1203 11:29:09.164118 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-v4l92" podUID="b14ee971-d6bc-478a-982a-998bce6d15a1" containerName="registry-server" probeResult="failure" output=< Dec 03 11:29:09 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 11:29:09 crc kubenswrapper[4702]: > Dec 03 11:29:11 crc kubenswrapper[4702]: I1203 11:29:11.438658 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3892571c-86ff-4259-beaa-6033dcfda204","Type":"ContainerStarted","Data":"7116113ce2d7fffe0e8eae998142328f31e5d1d692f37800bf56cc69b3cf649a"} Dec 03 11:29:11 crc kubenswrapper[4702]: I1203 11:29:11.439031 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3892571c-86ff-4259-beaa-6033dcfda204","Type":"ContainerStarted","Data":"1e188cfc3621cbdbe9a560c64b7e284cd1c63e265378d7514b56093b2299cd53"} Dec 03 11:29:12 crc kubenswrapper[4702]: I1203 11:29:12.476986 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3892571c-86ff-4259-beaa-6033dcfda204","Type":"ContainerStarted","Data":"1b1fbe722611f236c6916676412d6da65154eed019c656110adb4c11c2e0e57f"} Dec 03 11:29:12 crc kubenswrapper[4702]: I1203 11:29:12.477360 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3892571c-86ff-4259-beaa-6033dcfda204","Type":"ContainerStarted","Data":"e8f7f0b0b7a6669a67663adc914e44724d0ad65d2af43ddc613580c243324310"} Dec 03 11:29:13 crc kubenswrapper[4702]: I1203 11:29:13.491321 4702 generic.go:334] "Generic (PLEG): container finished" podID="3e2598e5-2410-4a5b-b8b5-f41239c131d1" containerID="87d477f984f9300b594e99220e09af0f886c163b126f4c536f666d04ad1a8659" exitCode=0 Dec 03 11:29:13 crc kubenswrapper[4702]: I1203 11:29:13.491402 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jm662" event={"ID":"3e2598e5-2410-4a5b-b8b5-f41239c131d1","Type":"ContainerDied","Data":"87d477f984f9300b594e99220e09af0f886c163b126f4c536f666d04ad1a8659"} Dec 03 11:29:13 crc kubenswrapper[4702]: I1203 11:29:13.502057 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"3892571c-86ff-4259-beaa-6033dcfda204","Type":"ContainerStarted","Data":"4212e842d860df9046f35bb2897333ed131df4805cda77350f4f4e2fb578a049"} Dec 03 11:29:13 crc kubenswrapper[4702]: I1203 11:29:13.502116 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3892571c-86ff-4259-beaa-6033dcfda204","Type":"ContainerStarted","Data":"d8c167ec069195421c62e4c960abb19d227f132e529f26864e0a21a2326454a3"} Dec 03 11:29:13 crc kubenswrapper[4702]: I1203 11:29:13.502134 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"3892571c-86ff-4259-beaa-6033dcfda204","Type":"ContainerStarted","Data":"84d614bad64fc0621cc8d676ba2dd0a16a0091c1f40e561630bca7f4f342570b"} Dec 03 11:29:13 crc kubenswrapper[4702]: I1203 11:29:13.554833 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=74.285683648 podStartE2EDuration="1m28.554803989s" podCreationTimestamp="2025-12-03 11:27:45 +0000 UTC" firstStartedPulling="2025-12-03 11:28:56.579386335 +0000 UTC m=+1520.415314799" lastFinishedPulling="2025-12-03 11:29:10.848506686 +0000 UTC m=+1534.684435140" observedRunningTime="2025-12-03 11:29:13.550997832 +0000 UTC m=+1537.386926296" watchObservedRunningTime="2025-12-03 11:29:13.554803989 +0000 UTC m=+1537.390732453" Dec 03 11:29:13 crc kubenswrapper[4702]: I1203 11:29:13.859283 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-f8jw9"] Dec 03 11:29:13 crc kubenswrapper[4702]: E1203 11:29:13.862066 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="768ad21b-ecc7-4ede-a09b-d87b376d9bcd" containerName="ovn-config" Dec 03 11:29:13 crc kubenswrapper[4702]: I1203 11:29:13.862114 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="768ad21b-ecc7-4ede-a09b-d87b376d9bcd" containerName="ovn-config" Dec 03 11:29:13 crc kubenswrapper[4702]: I1203 11:29:13.862598 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="768ad21b-ecc7-4ede-a09b-d87b376d9bcd" containerName="ovn-config" Dec 03 11:29:13 crc kubenswrapper[4702]: I1203 11:29:13.864090 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" Dec 03 11:29:13 crc kubenswrapper[4702]: I1203 11:29:13.870388 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-f8jw9"] Dec 03 11:29:13 crc kubenswrapper[4702]: I1203 11:29:13.895258 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 03 11:29:13 crc kubenswrapper[4702]: I1203 11:29:13.982101 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-f8jw9\" (UID: \"6a7d4216-b31e-478d-9bc0-0656a1480178\") " pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" Dec 03 11:29:13 crc kubenswrapper[4702]: I1203 11:29:13.982171 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-config\") pod \"dnsmasq-dns-764c5664d7-f8jw9\" (UID: \"6a7d4216-b31e-478d-9bc0-0656a1480178\") " pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" Dec 03 11:29:13 crc kubenswrapper[4702]: I1203 11:29:13.982249 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-dns-svc\") pod \"dnsmasq-dns-764c5664d7-f8jw9\" (UID: \"6a7d4216-b31e-478d-9bc0-0656a1480178\") " pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" Dec 03 11:29:13 crc kubenswrapper[4702]: I1203 11:29:13.982352 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlrrw\" (UniqueName: \"kubernetes.io/projected/6a7d4216-b31e-478d-9bc0-0656a1480178-kube-api-access-nlrrw\") pod \"dnsmasq-dns-764c5664d7-f8jw9\" (UID: \"6a7d4216-b31e-478d-9bc0-0656a1480178\") " pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" Dec 03 11:29:13 crc kubenswrapper[4702]: I1203 11:29:13.982434 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-f8jw9\" (UID: \"6a7d4216-b31e-478d-9bc0-0656a1480178\") " pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" Dec 03 11:29:13 crc kubenswrapper[4702]: I1203 11:29:13.982496 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-f8jw9\" (UID: \"6a7d4216-b31e-478d-9bc0-0656a1480178\") " pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.084618 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-f8jw9\" (UID: \"6a7d4216-b31e-478d-9bc0-0656a1480178\") " pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.084735 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-f8jw9\" (UID: 
\"6a7d4216-b31e-478d-9bc0-0656a1480178\") " pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.084939 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-f8jw9\" (UID: \"6a7d4216-b31e-478d-9bc0-0656a1480178\") " pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.084974 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-config\") pod \"dnsmasq-dns-764c5664d7-f8jw9\" (UID: \"6a7d4216-b31e-478d-9bc0-0656a1480178\") " pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.085023 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-dns-svc\") pod \"dnsmasq-dns-764c5664d7-f8jw9\" (UID: \"6a7d4216-b31e-478d-9bc0-0656a1480178\") " pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.085108 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlrrw\" (UniqueName: \"kubernetes.io/projected/6a7d4216-b31e-478d-9bc0-0656a1480178-kube-api-access-nlrrw\") pod \"dnsmasq-dns-764c5664d7-f8jw9\" (UID: \"6a7d4216-b31e-478d-9bc0-0656a1480178\") " pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.086189 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-f8jw9\" (UID: \"6a7d4216-b31e-478d-9bc0-0656a1480178\") " pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.086699 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-dns-svc\") pod \"dnsmasq-dns-764c5664d7-f8jw9\" (UID: \"6a7d4216-b31e-478d-9bc0-0656a1480178\") " pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.086903 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-f8jw9\" (UID: \"6a7d4216-b31e-478d-9bc0-0656a1480178\") " pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.087835 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-f8jw9\" (UID: \"6a7d4216-b31e-478d-9bc0-0656a1480178\") " pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.087941 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-config\") pod \"dnsmasq-dns-764c5664d7-f8jw9\" (UID: \"6a7d4216-b31e-478d-9bc0-0656a1480178\") " pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" Dec 03 11:29:14 crc kubenswrapper[4702]: 
Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.189764 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-85ks2"]
Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.192233 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-85ks2"
Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.201483 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-85ks2"]
Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.209783 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-f8jw9"
Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.290034 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88ad8517-bb0c-42f2-9416-375284ef0db7-utilities\") pod \"certified-operators-85ks2\" (UID: \"88ad8517-bb0c-42f2-9416-375284ef0db7\") " pod="openshift-marketplace/certified-operators-85ks2"
Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.290146 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88ad8517-bb0c-42f2-9416-375284ef0db7-catalog-content\") pod \"certified-operators-85ks2\" (UID: \"88ad8517-bb0c-42f2-9416-375284ef0db7\") " pod="openshift-marketplace/certified-operators-85ks2"
Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.290248 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4rkk\" (UniqueName: \"kubernetes.io/projected/88ad8517-bb0c-42f2-9416-375284ef0db7-kube-api-access-t4rkk\") pod \"certified-operators-85ks2\" (UID: \"88ad8517-bb0c-42f2-9416-375284ef0db7\") " pod="openshift-marketplace/certified-operators-85ks2"
Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.395303 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4rkk\" (UniqueName: \"kubernetes.io/projected/88ad8517-bb0c-42f2-9416-375284ef0db7-kube-api-access-t4rkk\") pod \"certified-operators-85ks2\" (UID: \"88ad8517-bb0c-42f2-9416-375284ef0db7\") " pod="openshift-marketplace/certified-operators-85ks2"
Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.395457 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88ad8517-bb0c-42f2-9416-375284ef0db7-utilities\") pod \"certified-operators-85ks2\" (UID: \"88ad8517-bb0c-42f2-9416-375284ef0db7\") " pod="openshift-marketplace/certified-operators-85ks2"
Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.395547 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88ad8517-bb0c-42f2-9416-375284ef0db7-catalog-content\") pod \"certified-operators-85ks2\" (UID: \"88ad8517-bb0c-42f2-9416-375284ef0db7\") " pod="openshift-marketplace/certified-operators-85ks2"
Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.396659 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88ad8517-bb0c-42f2-9416-375284ef0db7-catalog-content\") pod \"certified-operators-85ks2\" (UID: \"88ad8517-bb0c-42f2-9416-375284ef0db7\") " pod="openshift-marketplace/certified-operators-85ks2"
Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.396802 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88ad8517-bb0c-42f2-9416-375284ef0db7-utilities\") pod \"certified-operators-85ks2\" (UID: \"88ad8517-bb0c-42f2-9416-375284ef0db7\") " pod="openshift-marketplace/certified-operators-85ks2"
Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.422572 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4rkk\" (UniqueName: \"kubernetes.io/projected/88ad8517-bb0c-42f2-9416-375284ef0db7-kube-api-access-t4rkk\") pod \"certified-operators-85ks2\" (UID: \"88ad8517-bb0c-42f2-9416-375284ef0db7\") " pod="openshift-marketplace/certified-operators-85ks2"
Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.530370 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-85ks2"
Dec 03 11:29:14 crc kubenswrapper[4702]: I1203 11:29:14.854804 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-f8jw9"]
Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.103318 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jm662"
Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.140456 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e2598e5-2410-4a5b-b8b5-f41239c131d1-config-data\") pod \"3e2598e5-2410-4a5b-b8b5-f41239c131d1\" (UID: \"3e2598e5-2410-4a5b-b8b5-f41239c131d1\") "
Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.140718 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e2598e5-2410-4a5b-b8b5-f41239c131d1-combined-ca-bundle\") pod \"3e2598e5-2410-4a5b-b8b5-f41239c131d1\" (UID: \"3e2598e5-2410-4a5b-b8b5-f41239c131d1\") "
Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.151006 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqf5m\" (UniqueName: \"kubernetes.io/projected/3e2598e5-2410-4a5b-b8b5-f41239c131d1-kube-api-access-rqf5m\") pod \"3e2598e5-2410-4a5b-b8b5-f41239c131d1\" (UID: \"3e2598e5-2410-4a5b-b8b5-f41239c131d1\") "
Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.160799 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e2598e5-2410-4a5b-b8b5-f41239c131d1-kube-api-access-rqf5m" (OuterVolumeSpecName: "kube-api-access-rqf5m") pod "3e2598e5-2410-4a5b-b8b5-f41239c131d1" (UID: "3e2598e5-2410-4a5b-b8b5-f41239c131d1"). InnerVolumeSpecName "kube-api-access-rqf5m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.227383 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e2598e5-2410-4a5b-b8b5-f41239c131d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e2598e5-2410-4a5b-b8b5-f41239c131d1" (UID: "3e2598e5-2410-4a5b-b8b5-f41239c131d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.255061 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e2598e5-2410-4a5b-b8b5-f41239c131d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.255100 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqf5m\" (UniqueName: \"kubernetes.io/projected/3e2598e5-2410-4a5b-b8b5-f41239c131d1-kube-api-access-rqf5m\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.286254 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e2598e5-2410-4a5b-b8b5-f41239c131d1-config-data" (OuterVolumeSpecName: "config-data") pod "3e2598e5-2410-4a5b-b8b5-f41239c131d1" (UID: "3e2598e5-2410-4a5b-b8b5-f41239c131d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.357070 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e2598e5-2410-4a5b-b8b5-f41239c131d1-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.529577 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jm662" Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.529601 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jm662" event={"ID":"3e2598e5-2410-4a5b-b8b5-f41239c131d1","Type":"ContainerDied","Data":"10991fccf96e269f74fa4b211930e178af14dc6f29e842ce5053de28d217db9b"} Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.529692 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10991fccf96e269f74fa4b211930e178af14dc6f29e842ce5053de28d217db9b" Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.532329 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" event={"ID":"6a7d4216-b31e-478d-9bc0-0656a1480178","Type":"ContainerStarted","Data":"c764b081da553d44063df3fbc3e0ed635d5a9e14bab2eba99a467cbfd3aff8dd"} Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.546080 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-85ks2"] Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.867340 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-f8jw9"] Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.957093 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-vljtz"] Dec 03 11:29:15 crc kubenswrapper[4702]: E1203 11:29:15.957680 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e2598e5-2410-4a5b-b8b5-f41239c131d1" containerName="keystone-db-sync" Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.957698 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2598e5-2410-4a5b-b8b5-f41239c131d1" containerName="keystone-db-sync" Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.958022 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e2598e5-2410-4a5b-b8b5-f41239c131d1" containerName="keystone-db-sync" Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.959529 4702 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-vljtz" Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.993625 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-vljtz"] Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.994908 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-vljtz\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " pod="openstack/dnsmasq-dns-5959f8865f-vljtz" Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.995010 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-vljtz\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " pod="openstack/dnsmasq-dns-5959f8865f-vljtz" Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.995152 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s85x5\" (UniqueName: \"kubernetes.io/projected/a8812c45-da81-47ae-ba43-e181f2545cef-kube-api-access-s85x5\") pod \"dnsmasq-dns-5959f8865f-vljtz\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " pod="openstack/dnsmasq-dns-5959f8865f-vljtz" Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.995175 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-vljtz\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " pod="openstack/dnsmasq-dns-5959f8865f-vljtz" Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.995207 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-dns-svc\") pod \"dnsmasq-dns-5959f8865f-vljtz\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " pod="openstack/dnsmasq-dns-5959f8865f-vljtz" Dec 03 11:29:15 crc kubenswrapper[4702]: I1203 11:29:15.995237 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-config\") pod \"dnsmasq-dns-5959f8865f-vljtz\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " pod="openstack/dnsmasq-dns-5959f8865f-vljtz" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.076391 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-6r87r"] Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.078099 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6r87r" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.084203 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.085256 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.095088 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jnt45" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.095569 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.095665 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.097414 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s85x5\" (UniqueName: \"kubernetes.io/projected/a8812c45-da81-47ae-ba43-e181f2545cef-kube-api-access-s85x5\") pod \"dnsmasq-dns-5959f8865f-vljtz\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " pod="openstack/dnsmasq-dns-5959f8865f-vljtz" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.097449 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-vljtz\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " pod="openstack/dnsmasq-dns-5959f8865f-vljtz" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.097485 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-credential-keys\") pod \"keystone-bootstrap-6r87r\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " pod="openstack/keystone-bootstrap-6r87r" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.097531 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-dns-svc\") pod \"dnsmasq-dns-5959f8865f-vljtz\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " pod="openstack/dnsmasq-dns-5959f8865f-vljtz" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.097564 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x54r5\" (UniqueName: \"kubernetes.io/projected/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-kube-api-access-x54r5\") pod \"keystone-bootstrap-6r87r\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " pod="openstack/keystone-bootstrap-6r87r" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.097594 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-config\") pod \"dnsmasq-dns-5959f8865f-vljtz\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " pod="openstack/dnsmasq-dns-5959f8865f-vljtz" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.097682 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-vljtz\" (UID: 
\"a8812c45-da81-47ae-ba43-e181f2545cef\") " pod="openstack/dnsmasq-dns-5959f8865f-vljtz" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.097725 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-combined-ca-bundle\") pod \"keystone-bootstrap-6r87r\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " pod="openstack/keystone-bootstrap-6r87r" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.097806 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-vljtz\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " pod="openstack/dnsmasq-dns-5959f8865f-vljtz" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.097857 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-scripts\") pod \"keystone-bootstrap-6r87r\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " pod="openstack/keystone-bootstrap-6r87r" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.097892 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-config-data\") pod \"keystone-bootstrap-6r87r\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " pod="openstack/keystone-bootstrap-6r87r" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.097974 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-fernet-keys\") pod \"keystone-bootstrap-6r87r\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " pod="openstack/keystone-bootstrap-6r87r" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.099293 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-vljtz\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " pod="openstack/dnsmasq-dns-5959f8865f-vljtz" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.099993 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-dns-svc\") pod \"dnsmasq-dns-5959f8865f-vljtz\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " pod="openstack/dnsmasq-dns-5959f8865f-vljtz" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.100738 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-vljtz\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " pod="openstack/dnsmasq-dns-5959f8865f-vljtz" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.100830 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-config\") pod \"dnsmasq-dns-5959f8865f-vljtz\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " pod="openstack/dnsmasq-dns-5959f8865f-vljtz" 
Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.101543 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-vljtz\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " pod="openstack/dnsmasq-dns-5959f8865f-vljtz" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.136838 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6r87r"] Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.218049 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s85x5\" (UniqueName: \"kubernetes.io/projected/a8812c45-da81-47ae-ba43-e181f2545cef-kube-api-access-s85x5\") pod \"dnsmasq-dns-5959f8865f-vljtz\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " pod="openstack/dnsmasq-dns-5959f8865f-vljtz" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.219235 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-scripts\") pod \"keystone-bootstrap-6r87r\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " pod="openstack/keystone-bootstrap-6r87r" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.219297 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-config-data\") pod \"keystone-bootstrap-6r87r\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " pod="openstack/keystone-bootstrap-6r87r" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.219392 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-fernet-keys\") pod \"keystone-bootstrap-6r87r\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " pod="openstack/keystone-bootstrap-6r87r" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.219456 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-credential-keys\") pod \"keystone-bootstrap-6r87r\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " pod="openstack/keystone-bootstrap-6r87r" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.219491 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x54r5\" (UniqueName: \"kubernetes.io/projected/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-kube-api-access-x54r5\") pod \"keystone-bootstrap-6r87r\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " pod="openstack/keystone-bootstrap-6r87r" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.219607 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-combined-ca-bundle\") pod \"keystone-bootstrap-6r87r\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " pod="openstack/keystone-bootstrap-6r87r" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.229139 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-scripts\") pod \"keystone-bootstrap-6r87r\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " 
pod="openstack/keystone-bootstrap-6r87r" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.233898 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-combined-ca-bundle\") pod \"keystone-bootstrap-6r87r\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " pod="openstack/keystone-bootstrap-6r87r" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.238589 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-config-data\") pod \"keystone-bootstrap-6r87r\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " pod="openstack/keystone-bootstrap-6r87r" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.239891 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-wjjt9"] Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.241895 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-wjjt9" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.245190 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-credential-keys\") pod \"keystone-bootstrap-6r87r\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " pod="openstack/keystone-bootstrap-6r87r" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.246239 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-fernet-keys\") pod \"keystone-bootstrap-6r87r\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " pod="openstack/keystone-bootstrap-6r87r" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.260133 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.260400 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-9s85m" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.268604 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x54r5\" (UniqueName: \"kubernetes.io/projected/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-kube-api-access-x54r5\") pod \"keystone-bootstrap-6r87r\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " pod="openstack/keystone-bootstrap-6r87r" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.270665 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-wjjt9"] Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.317264 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-vljtz" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.321283 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5qhx\" (UniqueName: \"kubernetes.io/projected/6c60c306-2c56-44e4-8482-e5a72eccd765-kube-api-access-n5qhx\") pod \"heat-db-sync-wjjt9\" (UID: \"6c60c306-2c56-44e4-8482-e5a72eccd765\") " pod="openstack/heat-db-sync-wjjt9" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.321370 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c60c306-2c56-44e4-8482-e5a72eccd765-config-data\") pod \"heat-db-sync-wjjt9\" (UID: \"6c60c306-2c56-44e4-8482-e5a72eccd765\") " pod="openstack/heat-db-sync-wjjt9" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.321424 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c60c306-2c56-44e4-8482-e5a72eccd765-combined-ca-bundle\") pod \"heat-db-sync-wjjt9\" (UID: \"6c60c306-2c56-44e4-8482-e5a72eccd765\") " pod="openstack/heat-db-sync-wjjt9" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.424858 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5qhx\" (UniqueName: \"kubernetes.io/projected/6c60c306-2c56-44e4-8482-e5a72eccd765-kube-api-access-n5qhx\") pod \"heat-db-sync-wjjt9\" (UID: \"6c60c306-2c56-44e4-8482-e5a72eccd765\") " pod="openstack/heat-db-sync-wjjt9" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.425269 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c60c306-2c56-44e4-8482-e5a72eccd765-config-data\") pod \"heat-db-sync-wjjt9\" (UID: \"6c60c306-2c56-44e4-8482-e5a72eccd765\") " pod="openstack/heat-db-sync-wjjt9" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.425345 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c60c306-2c56-44e4-8482-e5a72eccd765-combined-ca-bundle\") pod \"heat-db-sync-wjjt9\" (UID: \"6c60c306-2c56-44e4-8482-e5a72eccd765\") " pod="openstack/heat-db-sync-wjjt9" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.488277 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6r87r" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.517136 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5qhx\" (UniqueName: \"kubernetes.io/projected/6c60c306-2c56-44e4-8482-e5a72eccd765-kube-api-access-n5qhx\") pod \"heat-db-sync-wjjt9\" (UID: \"6c60c306-2c56-44e4-8482-e5a72eccd765\") " pod="openstack/heat-db-sync-wjjt9" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.533971 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c60c306-2c56-44e4-8482-e5a72eccd765-combined-ca-bundle\") pod \"heat-db-sync-wjjt9\" (UID: \"6c60c306-2c56-44e4-8482-e5a72eccd765\") " pod="openstack/heat-db-sync-wjjt9" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.536888 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c60c306-2c56-44e4-8482-e5a72eccd765-config-data\") pod \"heat-db-sync-wjjt9\" (UID: \"6c60c306-2c56-44e4-8482-e5a72eccd765\") " pod="openstack/heat-db-sync-wjjt9" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.866099 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-w8wgv"] Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.870890 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-wjjt9" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.920358 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-w8wgv"] Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.920742 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-w8wgv" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.952970 4702 generic.go:334] "Generic (PLEG): container finished" podID="88ad8517-bb0c-42f2-9416-375284ef0db7" containerID="b6cb2c393db156cfe6ef69e7d47ef8d29efb37ead84f5c7662dae50e7c797d69" exitCode=0 Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.953098 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-72dgz" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.953348 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.953529 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 11:29:16 crc kubenswrapper[4702]: I1203 11:29:16.977507 4702 generic.go:334] "Generic (PLEG): container finished" podID="6a7d4216-b31e-478d-9bc0-0656a1480178" containerID="f7fe88f3164215b82abcb2c6aea6d3963967b6a09a8e72f4bede41278701dd62" exitCode=0 Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.024610 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85ks2" event={"ID":"88ad8517-bb0c-42f2-9416-375284ef0db7","Type":"ContainerDied","Data":"b6cb2c393db156cfe6ef69e7d47ef8d29efb37ead84f5c7662dae50e7c797d69"} Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.024686 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-ds27t"] Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.037721 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85ks2" 
event={"ID":"88ad8517-bb0c-42f2-9416-375284ef0db7","Type":"ContainerStarted","Data":"47df90550d3fb59666d91120d11a49449c43ddc4583c7636051c0abcbf23a884"} Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.037798 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" event={"ID":"6a7d4216-b31e-478d-9bc0-0656a1480178","Type":"ContainerDied","Data":"f7fe88f3164215b82abcb2c6aea6d3963967b6a09a8e72f4bede41278701dd62"} Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.037909 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ds27t" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.041331 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-l7bgx" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.061603 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-vljtz"] Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.084167 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32433fde-8c4c-43af-94f5-ab732096cd9d-combined-ca-bundle\") pod \"neutron-db-sync-w8wgv\" (UID: \"32433fde-8c4c-43af-94f5-ab732096cd9d\") " pod="openstack/neutron-db-sync-w8wgv" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.084384 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/32433fde-8c4c-43af-94f5-ab732096cd9d-config\") pod \"neutron-db-sync-w8wgv\" (UID: \"32433fde-8c4c-43af-94f5-ab732096cd9d\") " pod="openstack/neutron-db-sync-w8wgv" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.084489 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5n4w\" (UniqueName: \"kubernetes.io/projected/32433fde-8c4c-43af-94f5-ab732096cd9d-kube-api-access-h5n4w\") pod \"neutron-db-sync-w8wgv\" (UID: \"32433fde-8c4c-43af-94f5-ab732096cd9d\") " pod="openstack/neutron-db-sync-w8wgv" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.085980 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ds27t"] Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.087153 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.100814 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-tqs2b"] Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.102983 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tqs2b" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.127742 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-gt7kh" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.128177 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.131301 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-jtqd2"] Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.133497 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jtqd2" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.138432 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.138847 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-p7m48" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.139064 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.139210 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.183479 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jtqd2"] Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.199580 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c2d4cceb-3a17-486b-8718-897e52ea39cc-db-sync-config-data\") pod \"barbican-db-sync-ds27t\" (UID: \"c2d4cceb-3a17-486b-8718-897e52ea39cc\") " pod="openstack/barbican-db-sync-ds27t" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.199847 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/32433fde-8c4c-43af-94f5-ab732096cd9d-config\") pod \"neutron-db-sync-w8wgv\" (UID: \"32433fde-8c4c-43af-94f5-ab732096cd9d\") " pod="openstack/neutron-db-sync-w8wgv" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.200060 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x6vj\" (UniqueName: \"kubernetes.io/projected/c2d4cceb-3a17-486b-8718-897e52ea39cc-kube-api-access-4x6vj\") pod \"barbican-db-sync-ds27t\" (UID: \"c2d4cceb-3a17-486b-8718-897e52ea39cc\") " pod="openstack/barbican-db-sync-ds27t" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.200148 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5n4w\" (UniqueName: \"kubernetes.io/projected/32433fde-8c4c-43af-94f5-ab732096cd9d-kube-api-access-h5n4w\") pod \"neutron-db-sync-w8wgv\" (UID: \"32433fde-8c4c-43af-94f5-ab732096cd9d\") " pod="openstack/neutron-db-sync-w8wgv" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.200235 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d4cceb-3a17-486b-8718-897e52ea39cc-combined-ca-bundle\") pod \"barbican-db-sync-ds27t\" (UID: \"c2d4cceb-3a17-486b-8718-897e52ea39cc\") " pod="openstack/barbican-db-sync-ds27t" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.200358 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32433fde-8c4c-43af-94f5-ab732096cd9d-combined-ca-bundle\") pod \"neutron-db-sync-w8wgv\" (UID: \"32433fde-8c4c-43af-94f5-ab732096cd9d\") " pod="openstack/neutron-db-sync-w8wgv" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.220139 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32433fde-8c4c-43af-94f5-ab732096cd9d-combined-ca-bundle\") pod \"neutron-db-sync-w8wgv\" (UID: 
\"32433fde-8c4c-43af-94f5-ab732096cd9d\") " pod="openstack/neutron-db-sync-w8wgv" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.229042 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/32433fde-8c4c-43af-94f5-ab732096cd9d-config\") pod \"neutron-db-sync-w8wgv\" (UID: \"32433fde-8c4c-43af-94f5-ab732096cd9d\") " pod="openstack/neutron-db-sync-w8wgv" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.237742 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-tqs2b"] Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.259226 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5n4w\" (UniqueName: \"kubernetes.io/projected/32433fde-8c4c-43af-94f5-ab732096cd9d-kube-api-access-h5n4w\") pod \"neutron-db-sync-w8wgv\" (UID: \"32433fde-8c4c-43af-94f5-ab732096cd9d\") " pod="openstack/neutron-db-sync-w8wgv" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.293876 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-gdv8p"] Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.307503 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.308901 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-scripts\") pod \"cinder-db-sync-tqs2b\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") " pod="openstack/cinder-db-sync-tqs2b" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.308987 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c2d4cceb-3a17-486b-8718-897e52ea39cc-db-sync-config-data\") pod \"barbican-db-sync-ds27t\" (UID: \"c2d4cceb-3a17-486b-8718-897e52ea39cc\") " pod="openstack/barbican-db-sync-ds27t" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.309030 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ca96186-ab1f-41b3-a9f4-c89220b757da-config-data\") pod \"placement-db-sync-jtqd2\" (UID: \"0ca96186-ab1f-41b3-a9f4-c89220b757da\") " pod="openstack/placement-db-sync-jtqd2" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.309074 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p58fb\" (UniqueName: \"kubernetes.io/projected/20bf2147-401b-457b-ad27-3c893be5fa2c-kube-api-access-p58fb\") pod \"cinder-db-sync-tqs2b\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") " pod="openstack/cinder-db-sync-tqs2b" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.309246 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-244qk\" (UniqueName: \"kubernetes.io/projected/0ca96186-ab1f-41b3-a9f4-c89220b757da-kube-api-access-244qk\") pod \"placement-db-sync-jtqd2\" (UID: \"0ca96186-ab1f-41b3-a9f4-c89220b757da\") " pod="openstack/placement-db-sync-jtqd2" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.309297 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0ca96186-ab1f-41b3-a9f4-c89220b757da-combined-ca-bundle\") pod \"placement-db-sync-jtqd2\" (UID: \"0ca96186-ab1f-41b3-a9f4-c89220b757da\") " pod="openstack/placement-db-sync-jtqd2" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.309361 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-combined-ca-bundle\") pod \"cinder-db-sync-tqs2b\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") " pod="openstack/cinder-db-sync-tqs2b" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.309407 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x6vj\" (UniqueName: \"kubernetes.io/projected/c2d4cceb-3a17-486b-8718-897e52ea39cc-kube-api-access-4x6vj\") pod \"barbican-db-sync-ds27t\" (UID: \"c2d4cceb-3a17-486b-8718-897e52ea39cc\") " pod="openstack/barbican-db-sync-ds27t" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.309460 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ca96186-ab1f-41b3-a9f4-c89220b757da-scripts\") pod \"placement-db-sync-jtqd2\" (UID: \"0ca96186-ab1f-41b3-a9f4-c89220b757da\") " pod="openstack/placement-db-sync-jtqd2" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.309490 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-config-data\") pod \"cinder-db-sync-tqs2b\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") " pod="openstack/cinder-db-sync-tqs2b" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.309516 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20bf2147-401b-457b-ad27-3c893be5fa2c-etc-machine-id\") pod \"cinder-db-sync-tqs2b\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") " pod="openstack/cinder-db-sync-tqs2b" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.309603 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d4cceb-3a17-486b-8718-897e52ea39cc-combined-ca-bundle\") pod \"barbican-db-sync-ds27t\" (UID: \"c2d4cceb-3a17-486b-8718-897e52ea39cc\") " pod="openstack/barbican-db-sync-ds27t" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.309692 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-db-sync-config-data\") pod \"cinder-db-sync-tqs2b\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") " pod="openstack/cinder-db-sync-tqs2b" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.309750 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ca96186-ab1f-41b3-a9f4-c89220b757da-logs\") pod \"placement-db-sync-jtqd2\" (UID: \"0ca96186-ab1f-41b3-a9f4-c89220b757da\") " pod="openstack/placement-db-sync-jtqd2" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.313170 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-gdv8p"] Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.318587 4702 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-w8wgv" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.335796 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c2d4cceb-3a17-486b-8718-897e52ea39cc-db-sync-config-data\") pod \"barbican-db-sync-ds27t\" (UID: \"c2d4cceb-3a17-486b-8718-897e52ea39cc\") " pod="openstack/barbican-db-sync-ds27t" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.337352 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.339827 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d4cceb-3a17-486b-8718-897e52ea39cc-combined-ca-bundle\") pod \"barbican-db-sync-ds27t\" (UID: \"c2d4cceb-3a17-486b-8718-897e52ea39cc\") " pod="openstack/barbican-db-sync-ds27t" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.344309 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.350508 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.355323 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.361483 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x6vj\" (UniqueName: \"kubernetes.io/projected/c2d4cceb-3a17-486b-8718-897e52ea39cc-kube-api-access-4x6vj\") pod \"barbican-db-sync-ds27t\" (UID: \"c2d4cceb-3a17-486b-8718-897e52ea39cc\") " pod="openstack/barbican-db-sync-ds27t" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.406443 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.413114 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ca96186-ab1f-41b3-a9f4-c89220b757da-logs\") pod \"placement-db-sync-jtqd2\" (UID: \"0ca96186-ab1f-41b3-a9f4-c89220b757da\") " pod="openstack/placement-db-sync-jtqd2" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.413207 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b53ad8d-06e2-4511-b33a-2ff6c6209861-run-httpd\") pod \"ceilometer-0\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.416742 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ca96186-ab1f-41b3-a9f4-c89220b757da-logs\") pod \"placement-db-sync-jtqd2\" (UID: \"0ca96186-ab1f-41b3-a9f4-c89220b757da\") " pod="openstack/placement-db-sync-jtqd2" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.418374 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.418507 4702 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.418579 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5m9x\" (UniqueName: \"kubernetes.io/projected/8d966268-b412-4449-beb2-56619ab95323-kube-api-access-m5m9x\") pod \"dnsmasq-dns-58dd9ff6bc-gdv8p\" (UID: \"8d966268-b412-4449-beb2-56619ab95323\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.418622 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-scripts\") pod \"cinder-db-sync-tqs2b\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") " pod="openstack/cinder-db-sync-tqs2b" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.418656 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-gdv8p\" (UID: \"8d966268-b412-4449-beb2-56619ab95323\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.418729 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ca96186-ab1f-41b3-a9f4-c89220b757da-config-data\") pod \"placement-db-sync-jtqd2\" (UID: \"0ca96186-ab1f-41b3-a9f4-c89220b757da\") " pod="openstack/placement-db-sync-jtqd2" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.418863 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p58fb\" (UniqueName: \"kubernetes.io/projected/20bf2147-401b-457b-ad27-3c893be5fa2c-kube-api-access-p58fb\") pod \"cinder-db-sync-tqs2b\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") " pod="openstack/cinder-db-sync-tqs2b" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.418966 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-gdv8p\" (UID: \"8d966268-b412-4449-beb2-56619ab95323\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.426070 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ca96186-ab1f-41b3-a9f4-c89220b757da-config-data\") pod \"placement-db-sync-jtqd2\" (UID: \"0ca96186-ab1f-41b3-a9f4-c89220b757da\") " pod="openstack/placement-db-sync-jtqd2" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.426779 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-scripts\") pod \"cinder-db-sync-tqs2b\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") " pod="openstack/cinder-db-sync-tqs2b" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.438556 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-gdv8p\" (UID: \"8d966268-b412-4449-beb2-56619ab95323\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.438705 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-244qk\" (UniqueName: \"kubernetes.io/projected/0ca96186-ab1f-41b3-a9f4-c89220b757da-kube-api-access-244qk\") pod \"placement-db-sync-jtqd2\" (UID: \"0ca96186-ab1f-41b3-a9f4-c89220b757da\") " pod="openstack/placement-db-sync-jtqd2" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.438835 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b53ad8d-06e2-4511-b33a-2ff6c6209861-log-httpd\") pod \"ceilometer-0\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.438960 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-config-data\") pod \"ceilometer-0\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.439056 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca96186-ab1f-41b3-a9f4-c89220b757da-combined-ca-bundle\") pod \"placement-db-sync-jtqd2\" (UID: \"0ca96186-ab1f-41b3-a9f4-c89220b757da\") " pod="openstack/placement-db-sync-jtqd2" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.439091 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msk6c\" (UniqueName: \"kubernetes.io/projected/4b53ad8d-06e2-4511-b33a-2ff6c6209861-kube-api-access-msk6c\") pod \"ceilometer-0\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.439182 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-combined-ca-bundle\") pod \"cinder-db-sync-tqs2b\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") " pod="openstack/cinder-db-sync-tqs2b" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.439210 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-config\") pod \"dnsmasq-dns-58dd9ff6bc-gdv8p\" (UID: \"8d966268-b412-4449-beb2-56619ab95323\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.439243 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-gdv8p\" (UID: \"8d966268-b412-4449-beb2-56619ab95323\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.439337 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ca96186-ab1f-41b3-a9f4-c89220b757da-scripts\") pod 
\"placement-db-sync-jtqd2\" (UID: \"0ca96186-ab1f-41b3-a9f4-c89220b757da\") " pod="openstack/placement-db-sync-jtqd2" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.439374 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-config-data\") pod \"cinder-db-sync-tqs2b\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") " pod="openstack/cinder-db-sync-tqs2b" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.439406 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20bf2147-401b-457b-ad27-3c893be5fa2c-etc-machine-id\") pod \"cinder-db-sync-tqs2b\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") " pod="openstack/cinder-db-sync-tqs2b" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.439587 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-db-sync-config-data\") pod \"cinder-db-sync-tqs2b\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") " pod="openstack/cinder-db-sync-tqs2b" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.439613 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-scripts\") pod \"ceilometer-0\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.440513 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20bf2147-401b-457b-ad27-3c893be5fa2c-etc-machine-id\") pod \"cinder-db-sync-tqs2b\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") " pod="openstack/cinder-db-sync-tqs2b" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.445637 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-db-sync-config-data\") pod \"cinder-db-sync-tqs2b\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") " pod="openstack/cinder-db-sync-tqs2b" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.446508 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p58fb\" (UniqueName: \"kubernetes.io/projected/20bf2147-401b-457b-ad27-3c893be5fa2c-kube-api-access-p58fb\") pod \"cinder-db-sync-tqs2b\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") " pod="openstack/cinder-db-sync-tqs2b" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.453707 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-config-data\") pod \"cinder-db-sync-tqs2b\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") " pod="openstack/cinder-db-sync-tqs2b" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.459261 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca96186-ab1f-41b3-a9f4-c89220b757da-combined-ca-bundle\") pod \"placement-db-sync-jtqd2\" (UID: \"0ca96186-ab1f-41b3-a9f4-c89220b757da\") " pod="openstack/placement-db-sync-jtqd2" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.462550 4702 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ca96186-ab1f-41b3-a9f4-c89220b757da-scripts\") pod \"placement-db-sync-jtqd2\" (UID: \"0ca96186-ab1f-41b3-a9f4-c89220b757da\") " pod="openstack/placement-db-sync-jtqd2" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.469059 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-244qk\" (UniqueName: \"kubernetes.io/projected/0ca96186-ab1f-41b3-a9f4-c89220b757da-kube-api-access-244qk\") pod \"placement-db-sync-jtqd2\" (UID: \"0ca96186-ab1f-41b3-a9f4-c89220b757da\") " pod="openstack/placement-db-sync-jtqd2" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.472976 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-ds27t" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.485338 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-combined-ca-bundle\") pod \"cinder-db-sync-tqs2b\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") " pod="openstack/cinder-db-sync-tqs2b" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.516704 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tqs2b" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.542904 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b53ad8d-06e2-4511-b33a-2ff6c6209861-run-httpd\") pod \"ceilometer-0\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.543107 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.543162 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.543202 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5m9x\" (UniqueName: \"kubernetes.io/projected/8d966268-b412-4449-beb2-56619ab95323-kube-api-access-m5m9x\") pod \"dnsmasq-dns-58dd9ff6bc-gdv8p\" (UID: \"8d966268-b412-4449-beb2-56619ab95323\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.543230 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-gdv8p\" (UID: \"8d966268-b412-4449-beb2-56619ab95323\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.543382 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-gdv8p\" (UID: 
\"8d966268-b412-4449-beb2-56619ab95323\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.543462 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-gdv8p\" (UID: \"8d966268-b412-4449-beb2-56619ab95323\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.543540 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b53ad8d-06e2-4511-b33a-2ff6c6209861-log-httpd\") pod \"ceilometer-0\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.543567 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-config-data\") pod \"ceilometer-0\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.543611 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msk6c\" (UniqueName: \"kubernetes.io/projected/4b53ad8d-06e2-4511-b33a-2ff6c6209861-kube-api-access-msk6c\") pod \"ceilometer-0\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.543895 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-config\") pod \"dnsmasq-dns-58dd9ff6bc-gdv8p\" (UID: \"8d966268-b412-4449-beb2-56619ab95323\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.543954 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-gdv8p\" (UID: \"8d966268-b412-4449-beb2-56619ab95323\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.544169 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-scripts\") pod \"ceilometer-0\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.545909 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-gdv8p\" (UID: \"8d966268-b412-4449-beb2-56619ab95323\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.549065 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-config\") pod \"dnsmasq-dns-58dd9ff6bc-gdv8p\" (UID: \"8d966268-b412-4449-beb2-56619ab95323\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.549397 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-gdv8p\" (UID: \"8d966268-b412-4449-beb2-56619ab95323\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.551503 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jtqd2" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.552444 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b53ad8d-06e2-4511-b33a-2ff6c6209861-log-httpd\") pod \"ceilometer-0\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.556121 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-gdv8p\" (UID: \"8d966268-b412-4449-beb2-56619ab95323\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.558501 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-gdv8p\" (UID: \"8d966268-b412-4449-beb2-56619ab95323\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.562456 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b53ad8d-06e2-4511-b33a-2ff6c6209861-run-httpd\") pod \"ceilometer-0\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.570925 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-scripts\") pod \"ceilometer-0\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.573619 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.578273 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-config-data\") pod \"ceilometer-0\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.587841 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.591064 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5m9x\" (UniqueName: \"kubernetes.io/projected/8d966268-b412-4449-beb2-56619ab95323-kube-api-access-m5m9x\") pod \"dnsmasq-dns-58dd9ff6bc-gdv8p\" (UID: 
\"8d966268-b412-4449-beb2-56619ab95323\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.596582 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msk6c\" (UniqueName: \"kubernetes.io/projected/4b53ad8d-06e2-4511-b33a-2ff6c6209861-kube-api-access-msk6c\") pod \"ceilometer-0\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.616492 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:29:17 crc kubenswrapper[4702]: E1203 11:29:17.712951 4702 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 03 11:29:17 crc kubenswrapper[4702]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/6a7d4216-b31e-478d-9bc0-0656a1480178/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 03 11:29:17 crc kubenswrapper[4702]: > podSandboxID="c764b081da553d44063df3fbc3e0ed635d5a9e14bab2eba99a467cbfd3aff8dd" Dec 03 11:29:17 crc kubenswrapper[4702]: E1203 11:29:17.713232 4702 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 03 11:29:17 crc kubenswrapper[4702]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n66chbbh56dh7fhfh68chf9hfdhbdh587h5b9h568h68fh77h5b5h559h577h687h574h5d5h584h8chd9hb4h66h566h545h699h564h568h66fhc9q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nlrrw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-764c5664d7-f8jw9_openstack(6a7d4216-b31e-478d-9bc0-0656a1480178): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/6a7d4216-b31e-478d-9bc0-0656a1480178/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 03 11:29:17 crc kubenswrapper[4702]: > logger="UnhandledError" Dec 03 11:29:17 crc kubenswrapper[4702]: E1203 11:29:17.714894 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/6a7d4216-b31e-478d-9bc0-0656a1480178/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" podUID="6a7d4216-b31e-478d-9bc0-0656a1480178" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.854806 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9k8cb" Dec 03 11:29:17 crc kubenswrapper[4702]: W1203 11:29:17.872422 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8812c45_da81_47ae_ba43_e181f2545cef.slice/crio-97b046d0a9a47f0a2966ccac8f71c43a9c05163d3bf515d1dc680bc3428ff9eb WatchSource:0}: Error finding container 97b046d0a9a47f0a2966ccac8f71c43a9c05163d3bf515d1dc680bc3428ff9eb: Status 404 returned error can't find the container with id 97b046d0a9a47f0a2966ccac8f71c43a9c05163d3bf515d1dc680bc3428ff9eb Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.892204 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:29:17 crc kubenswrapper[4702]: I1203 11:29:17.942238 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-vljtz"] Dec 03 11:29:18 crc kubenswrapper[4702]: I1203 11:29:18.038992 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-wjjt9"] Dec 03 11:29:18 crc kubenswrapper[4702]: I1203 11:29:18.050253 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-vljtz" event={"ID":"a8812c45-da81-47ae-ba43-e181f2545cef","Type":"ContainerStarted","Data":"97b046d0a9a47f0a2966ccac8f71c43a9c05163d3bf515d1dc680bc3428ff9eb"} Dec 03 11:29:18 crc kubenswrapper[4702]: I1203 11:29:18.080443 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9k8cb" Dec 03 11:29:18 crc kubenswrapper[4702]: I1203 11:29:18.477508 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v4l92" Dec 03 11:29:18 crc kubenswrapper[4702]: I1203 11:29:18.638193 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6r87r"] Dec 03 11:29:18 crc kubenswrapper[4702]: I1203 11:29:18.638809 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v4l92" Dec 03 11:29:18 crc kubenswrapper[4702]: I1203 11:29:18.816050 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-w8wgv"] Dec 03 11:29:18 crc kubenswrapper[4702]: W1203 11:29:18.818122 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32433fde_8c4c_43af_94f5_ab732096cd9d.slice/crio-ade55c30b236ea1604802a84c6aacecb6c988cd8ce4d073ac306e188a924e494 WatchSource:0}: Error finding container ade55c30b236ea1604802a84c6aacecb6c988cd8ce4d073ac306e188a924e494: Status 404 returned error can't find the container with id ade55c30b236ea1604802a84c6aacecb6c988cd8ce4d073ac306e188a924e494 Dec 03 11:29:19 crc kubenswrapper[4702]: I1203 11:29:19.090065 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85ks2" event={"ID":"88ad8517-bb0c-42f2-9416-375284ef0db7","Type":"ContainerStarted","Data":"d1fb32ddc434944fdbba7cf06527f6e968de20e318c1a12e550b9ffa0cafe39b"} Dec 03 11:29:19 crc kubenswrapper[4702]: I1203 11:29:19.100490 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w8wgv" event={"ID":"32433fde-8c4c-43af-94f5-ab732096cd9d","Type":"ContainerStarted","Data":"ade55c30b236ea1604802a84c6aacecb6c988cd8ce4d073ac306e188a924e494"} Dec 03 11:29:19 crc kubenswrapper[4702]: I1203 11:29:19.103069 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wjjt9" event={"ID":"6c60c306-2c56-44e4-8482-e5a72eccd765","Type":"ContainerStarted","Data":"472dee51e6accbe376233fc11179ab4ff3110e08a36312906073772b1904a0bc"} Dec 03 11:29:19 crc kubenswrapper[4702]: I1203 11:29:19.111084 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6r87r" event={"ID":"4dcfb963-e8b9-4053-8de9-b1d127e6abfa","Type":"ContainerStarted","Data":"0eb02e16f1ab1ecc0d36180ead20ce29af9cc40f62cb4f36a753b67e07f996a3"} Dec 03 11:29:19 crc kubenswrapper[4702]: I1203 11:29:19.137817 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-ds27t"] Dec 03 11:29:19 
crc kubenswrapper[4702]: W1203 11:29:19.283418 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2d4cceb_3a17_486b_8718_897e52ea39cc.slice/crio-4cf0934be41d2a5d8bd9efca76dda97c0614e1877ecc9a99c33bfccf2362222d WatchSource:0}: Error finding container 4cf0934be41d2a5d8bd9efca76dda97c0614e1877ecc9a99c33bfccf2362222d: Status 404 returned error can't find the container with id 4cf0934be41d2a5d8bd9efca76dda97c0614e1877ecc9a99c33bfccf2362222d Dec 03 11:29:19 crc kubenswrapper[4702]: I1203 11:29:19.417650 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-tqs2b"] Dec 03 11:29:19 crc kubenswrapper[4702]: W1203 11:29:19.440026 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20bf2147_401b_457b_ad27_3c893be5fa2c.slice/crio-1959755cbf3e2aa23fb6f744b6a0b0a7d3c175ab7c17d5fa38f5c8beb96ccf63 WatchSource:0}: Error finding container 1959755cbf3e2aa23fb6f744b6a0b0a7d3c175ab7c17d5fa38f5c8beb96ccf63: Status 404 returned error can't find the container with id 1959755cbf3e2aa23fb6f744b6a0b0a7d3c175ab7c17d5fa38f5c8beb96ccf63 Dec 03 11:29:19 crc kubenswrapper[4702]: I1203 11:29:19.543170 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" Dec 03 11:29:19 crc kubenswrapper[4702]: I1203 11:29:19.637901 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-ovsdbserver-sb\") pod \"6a7d4216-b31e-478d-9bc0-0656a1480178\" (UID: \"6a7d4216-b31e-478d-9bc0-0656a1480178\") " Dec 03 11:29:19 crc kubenswrapper[4702]: I1203 11:29:19.638714 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-ovsdbserver-nb\") pod \"6a7d4216-b31e-478d-9bc0-0656a1480178\" (UID: \"6a7d4216-b31e-478d-9bc0-0656a1480178\") " Dec 03 11:29:19 crc kubenswrapper[4702]: I1203 11:29:19.639859 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-dns-swift-storage-0\") pod \"6a7d4216-b31e-478d-9bc0-0656a1480178\" (UID: \"6a7d4216-b31e-478d-9bc0-0656a1480178\") " Dec 03 11:29:19 crc kubenswrapper[4702]: I1203 11:29:19.640102 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-config\") pod \"6a7d4216-b31e-478d-9bc0-0656a1480178\" (UID: \"6a7d4216-b31e-478d-9bc0-0656a1480178\") " Dec 03 11:29:19 crc kubenswrapper[4702]: I1203 11:29:19.640340 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-dns-svc\") pod \"6a7d4216-b31e-478d-9bc0-0656a1480178\" (UID: \"6a7d4216-b31e-478d-9bc0-0656a1480178\") " Dec 03 11:29:19 crc kubenswrapper[4702]: I1203 11:29:19.640495 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlrrw\" (UniqueName: \"kubernetes.io/projected/6a7d4216-b31e-478d-9bc0-0656a1480178-kube-api-access-nlrrw\") pod \"6a7d4216-b31e-478d-9bc0-0656a1480178\" (UID: \"6a7d4216-b31e-478d-9bc0-0656a1480178\") " Dec 03 11:29:19 crc 
kubenswrapper[4702]: I1203 11:29:19.668101 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a7d4216-b31e-478d-9bc0-0656a1480178-kube-api-access-nlrrw" (OuterVolumeSpecName: "kube-api-access-nlrrw") pod "6a7d4216-b31e-478d-9bc0-0656a1480178" (UID: "6a7d4216-b31e-478d-9bc0-0656a1480178"). InnerVolumeSpecName "kube-api-access-nlrrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:29:19 crc kubenswrapper[4702]: I1203 11:29:19.801120 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlrrw\" (UniqueName: \"kubernetes.io/projected/6a7d4216-b31e-478d-9bc0-0656a1480178-kube-api-access-nlrrw\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:19 crc kubenswrapper[4702]: I1203 11:29:19.893342 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6a7d4216-b31e-478d-9bc0-0656a1480178" (UID: "6a7d4216-b31e-478d-9bc0-0656a1480178"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:29:19 crc kubenswrapper[4702]: I1203 11:29:19.915211 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6a7d4216-b31e-478d-9bc0-0656a1480178" (UID: "6a7d4216-b31e-478d-9bc0-0656a1480178"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:29:19 crc kubenswrapper[4702]: I1203 11:29:19.919277 4702 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:19 crc kubenswrapper[4702]: I1203 11:29:19.919429 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:19 crc kubenswrapper[4702]: I1203 11:29:19.933629 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6a7d4216-b31e-478d-9bc0-0656a1480178" (UID: "6a7d4216-b31e-478d-9bc0-0656a1480178"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:29:19 crc kubenswrapper[4702]: I1203 11:29:19.952069 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6a7d4216-b31e-478d-9bc0-0656a1480178" (UID: "6a7d4216-b31e-478d-9bc0-0656a1480178"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:29:19 crc kubenswrapper[4702]: I1203 11:29:19.975855 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v4l92"] Dec 03 11:29:19 crc kubenswrapper[4702]: I1203 11:29:19.979832 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-config" (OuterVolumeSpecName: "config") pod "6a7d4216-b31e-478d-9bc0-0656a1480178" (UID: "6a7d4216-b31e-478d-9bc0-0656a1480178"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.022024 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.022085 4702 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.022100 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a7d4216-b31e-478d-9bc0-0656a1480178-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.077699 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jtqd2"] Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.099220 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.157880 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" event={"ID":"6a7d4216-b31e-478d-9bc0-0656a1480178","Type":"ContainerDied","Data":"c764b081da553d44063df3fbc3e0ed635d5a9e14bab2eba99a467cbfd3aff8dd"} Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.157958 4702 scope.go:117] "RemoveContainer" containerID="f7fe88f3164215b82abcb2c6aea6d3963967b6a09a8e72f4bede41278701dd62" Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.158272 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-f8jw9" Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.173087 4702 generic.go:334] "Generic (PLEG): container finished" podID="88ad8517-bb0c-42f2-9416-375284ef0db7" containerID="d1fb32ddc434944fdbba7cf06527f6e968de20e318c1a12e550b9ffa0cafe39b" exitCode=0 Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.173237 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85ks2" event={"ID":"88ad8517-bb0c-42f2-9416-375284ef0db7","Type":"ContainerDied","Data":"d1fb32ddc434944fdbba7cf06527f6e968de20e318c1a12e550b9ffa0cafe39b"} Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.202372 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w8wgv" event={"ID":"32433fde-8c4c-43af-94f5-ab732096cd9d","Type":"ContainerStarted","Data":"cc7bab6c4bcdb67bd28ac08cc403f78ca5b1748a0cc502b406e0ff94a7278db1"} Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.209183 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-gdv8p"] Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.248435 4702 generic.go:334] "Generic (PLEG): container finished" podID="a8812c45-da81-47ae-ba43-e181f2545cef" containerID="a8744a292b5863c272917f2f96874077ab9ca430440504bea9e8e3360bf266ba" exitCode=0 Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.248547 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-vljtz" event={"ID":"a8812c45-da81-47ae-ba43-e181f2545cef","Type":"ContainerDied","Data":"a8744a292b5863c272917f2f96874077ab9ca430440504bea9e8e3360bf266ba"} Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.281385 4702 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/keystone-bootstrap-6r87r" event={"ID":"4dcfb963-e8b9-4053-8de9-b1d127e6abfa","Type":"ContainerStarted","Data":"6d0ecece3d77561615396eefaec02567322e8c7aff8df432d1ee6b49f4aaa42b"} Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.311671 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jtqd2" event={"ID":"0ca96186-ab1f-41b3-a9f4-c89220b757da","Type":"ContainerStarted","Data":"62443cf09cb8c391a135e49ef9eb775cfe490225fe5c5263328f7583b8420f66"} Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.349094 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-w8wgv" podStartSLOduration=4.347722482 podStartE2EDuration="4.347722482s" podCreationTimestamp="2025-12-03 11:29:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:29:20.254034218 +0000 UTC m=+1544.089962682" watchObservedRunningTime="2025-12-03 11:29:20.347722482 +0000 UTC m=+1544.183650946" Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.354470 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ds27t" event={"ID":"c2d4cceb-3a17-486b-8718-897e52ea39cc","Type":"ContainerStarted","Data":"4cf0934be41d2a5d8bd9efca76dda97c0614e1877ecc9a99c33bfccf2362222d"} Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.379006 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tqs2b" event={"ID":"20bf2147-401b-457b-ad27-3c893be5fa2c","Type":"ContainerStarted","Data":"1959755cbf3e2aa23fb6f744b6a0b0a7d3c175ab7c17d5fa38f5c8beb96ccf63"} Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.395030 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v4l92" podUID="b14ee971-d6bc-478a-982a-998bce6d15a1" containerName="registry-server" containerID="cri-o://2bf7d47d8214510a9644b94fba3cd63a427233d94e0cb83dad93d7620cba1ae5" gracePeriod=2 Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.395514 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b53ad8d-06e2-4511-b33a-2ff6c6209861","Type":"ContainerStarted","Data":"f61a102841bf38515995e2f6cffcca58978e03987771085bba5cc6a28dbbe5cb"} Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.407719 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-6r87r" podStartSLOduration=5.40769312 podStartE2EDuration="5.40769312s" podCreationTimestamp="2025-12-03 11:29:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:29:20.360503224 +0000 UTC m=+1544.196431688" watchObservedRunningTime="2025-12-03 11:29:20.40769312 +0000 UTC m=+1544.243621584" Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.466401 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-f8jw9"] Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.479503 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-f8jw9"] Dec 03 11:29:20 crc kubenswrapper[4702]: I1203 11:29:20.978062 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a7d4216-b31e-478d-9bc0-0656a1480178" path="/var/lib/kubelet/pods/6a7d4216-b31e-478d-9bc0-0656a1480178/volumes" Dec 03 11:29:20 crc 
kubenswrapper[4702]: I1203 11:29:20.979879 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.440154 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9k8cb"] Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.441347 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9k8cb" podUID="ba3a34cc-85f8-4aa9-b88e-b582a2558b9c" containerName="registry-server" containerID="cri-o://99b55e8a69f5a88d7c475162f87cbbfdfe13d61a4768ca8ab7d83e899936e6a4" gracePeriod=2 Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.500976 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-vljtz" Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.510058 4702 generic.go:334] "Generic (PLEG): container finished" podID="90e9786f-3e0d-4a23-b624-b49a3d386784" containerID="22e4f4007ed632f0fe64347e59a4a025f3dbcd6bd9009a5c5e5e342e00b04022" exitCode=0 Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.510245 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"90e9786f-3e0d-4a23-b624-b49a3d386784","Type":"ContainerDied","Data":"22e4f4007ed632f0fe64347e59a4a025f3dbcd6bd9009a5c5e5e342e00b04022"} Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.522803 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4l92" event={"ID":"b14ee971-d6bc-478a-982a-998bce6d15a1","Type":"ContainerDied","Data":"2bf7d47d8214510a9644b94fba3cd63a427233d94e0cb83dad93d7620cba1ae5"} Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.523716 4702 generic.go:334] "Generic (PLEG): container finished" podID="b14ee971-d6bc-478a-982a-998bce6d15a1" containerID="2bf7d47d8214510a9644b94fba3cd63a427233d94e0cb83dad93d7620cba1ae5" exitCode=0 Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.590819 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" event={"ID":"8d966268-b412-4449-beb2-56619ab95323","Type":"ContainerStarted","Data":"9b9781a1a444d17c459911255bbf537ac846ffe781dbb9a9784dd36cdab1fd7a"} Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.590872 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" event={"ID":"8d966268-b412-4449-beb2-56619ab95323","Type":"ContainerStarted","Data":"36201b6c0968fa32262f109e52005a1794c3a6e0a7b68773cb58db94a1b64e18"} Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.597072 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-vljtz" event={"ID":"a8812c45-da81-47ae-ba43-e181f2545cef","Type":"ContainerDied","Data":"97b046d0a9a47f0a2966ccac8f71c43a9c05163d3bf515d1dc680bc3428ff9eb"} Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.597329 4702 scope.go:117] "RemoveContainer" containerID="a8744a292b5863c272917f2f96874077ab9ca430440504bea9e8e3360bf266ba" Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.597633 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-vljtz" Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.642406 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-ovsdbserver-nb\") pod \"a8812c45-da81-47ae-ba43-e181f2545cef\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.642863 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-dns-svc\") pod \"a8812c45-da81-47ae-ba43-e181f2545cef\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.643167 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s85x5\" (UniqueName: \"kubernetes.io/projected/a8812c45-da81-47ae-ba43-e181f2545cef-kube-api-access-s85x5\") pod \"a8812c45-da81-47ae-ba43-e181f2545cef\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.643334 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-dns-swift-storage-0\") pod \"a8812c45-da81-47ae-ba43-e181f2545cef\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.643538 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-ovsdbserver-sb\") pod \"a8812c45-da81-47ae-ba43-e181f2545cef\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.643738 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-config\") pod \"a8812c45-da81-47ae-ba43-e181f2545cef\" (UID: \"a8812c45-da81-47ae-ba43-e181f2545cef\") " Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.657063 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8812c45-da81-47ae-ba43-e181f2545cef-kube-api-access-s85x5" (OuterVolumeSpecName: "kube-api-access-s85x5") pod "a8812c45-da81-47ae-ba43-e181f2545cef" (UID: "a8812c45-da81-47ae-ba43-e181f2545cef"). InnerVolumeSpecName "kube-api-access-s85x5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.717378 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a8812c45-da81-47ae-ba43-e181f2545cef" (UID: "a8812c45-da81-47ae-ba43-e181f2545cef"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.726909 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a8812c45-da81-47ae-ba43-e181f2545cef" (UID: "a8812c45-da81-47ae-ba43-e181f2545cef"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.746371 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a8812c45-da81-47ae-ba43-e181f2545cef" (UID: "a8812c45-da81-47ae-ba43-e181f2545cef"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.754855 4702 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.754899 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s85x5\" (UniqueName: \"kubernetes.io/projected/a8812c45-da81-47ae-ba43-e181f2545cef-kube-api-access-s85x5\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.754913 4702 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.754924 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.755379 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-config" (OuterVolumeSpecName: "config") pod "a8812c45-da81-47ae-ba43-e181f2545cef" (UID: "a8812c45-da81-47ae-ba43-e181f2545cef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.768871 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a8812c45-da81-47ae-ba43-e181f2545cef" (UID: "a8812c45-da81-47ae-ba43-e181f2545cef"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.857625 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:21 crc kubenswrapper[4702]: I1203 11:29:21.858002 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8812c45-da81-47ae-ba43-e181f2545cef-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:21.992825 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v4l92" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.084937 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b14ee971-d6bc-478a-982a-998bce6d15a1-utilities\") pod \"b14ee971-d6bc-478a-982a-998bce6d15a1\" (UID: \"b14ee971-d6bc-478a-982a-998bce6d15a1\") " Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.085503 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b14ee971-d6bc-478a-982a-998bce6d15a1-catalog-content\") pod \"b14ee971-d6bc-478a-982a-998bce6d15a1\" (UID: \"b14ee971-d6bc-478a-982a-998bce6d15a1\") " Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.085532 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6brr\" (UniqueName: \"kubernetes.io/projected/b14ee971-d6bc-478a-982a-998bce6d15a1-kube-api-access-q6brr\") pod \"b14ee971-d6bc-478a-982a-998bce6d15a1\" (UID: \"b14ee971-d6bc-478a-982a-998bce6d15a1\") " Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.103312 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b14ee971-d6bc-478a-982a-998bce6d15a1-utilities" (OuterVolumeSpecName: "utilities") pod "b14ee971-d6bc-478a-982a-998bce6d15a1" (UID: "b14ee971-d6bc-478a-982a-998bce6d15a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.116045 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b14ee971-d6bc-478a-982a-998bce6d15a1-kube-api-access-q6brr" (OuterVolumeSpecName: "kube-api-access-q6brr") pod "b14ee971-d6bc-478a-982a-998bce6d15a1" (UID: "b14ee971-d6bc-478a-982a-998bce6d15a1"). InnerVolumeSpecName "kube-api-access-q6brr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.189213 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-vljtz"] Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.200950 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b14ee971-d6bc-478a-982a-998bce6d15a1-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.200991 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6brr\" (UniqueName: \"kubernetes.io/projected/b14ee971-d6bc-478a-982a-998bce6d15a1-kube-api-access-q6brr\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.237608 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b14ee971-d6bc-478a-982a-998bce6d15a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b14ee971-d6bc-478a-982a-998bce6d15a1" (UID: "b14ee971-d6bc-478a-982a-998bce6d15a1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.250635 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-vljtz"] Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.306979 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b14ee971-d6bc-478a-982a-998bce6d15a1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.571746 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9k8cb" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.620487 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3a34cc-85f8-4aa9-b88e-b582a2558b9c-utilities\") pod \"ba3a34cc-85f8-4aa9-b88e-b582a2558b9c\" (UID: \"ba3a34cc-85f8-4aa9-b88e-b582a2558b9c\") " Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.620706 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5bbw\" (UniqueName: \"kubernetes.io/projected/ba3a34cc-85f8-4aa9-b88e-b582a2558b9c-kube-api-access-g5bbw\") pod \"ba3a34cc-85f8-4aa9-b88e-b582a2558b9c\" (UID: \"ba3a34cc-85f8-4aa9-b88e-b582a2558b9c\") " Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.622013 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba3a34cc-85f8-4aa9-b88e-b582a2558b9c-utilities" (OuterVolumeSpecName: "utilities") pod "ba3a34cc-85f8-4aa9-b88e-b582a2558b9c" (UID: "ba3a34cc-85f8-4aa9-b88e-b582a2558b9c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.622083 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3a34cc-85f8-4aa9-b88e-b582a2558b9c-catalog-content\") pod \"ba3a34cc-85f8-4aa9-b88e-b582a2558b9c\" (UID: \"ba3a34cc-85f8-4aa9-b88e-b582a2558b9c\") " Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.626342 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ba3a34cc-85f8-4aa9-b88e-b582a2558b9c-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.631735 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba3a34cc-85f8-4aa9-b88e-b582a2558b9c-kube-api-access-g5bbw" (OuterVolumeSpecName: "kube-api-access-g5bbw") pod "ba3a34cc-85f8-4aa9-b88e-b582a2558b9c" (UID: "ba3a34cc-85f8-4aa9-b88e-b582a2558b9c"). InnerVolumeSpecName "kube-api-access-g5bbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.651713 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"90e9786f-3e0d-4a23-b624-b49a3d386784","Type":"ContainerStarted","Data":"c8a5edbe076c7fae36e6d71c1ba0fa39cc004e86bc1bc5274c67c0d804da550d"} Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.671775 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v4l92" event={"ID":"b14ee971-d6bc-478a-982a-998bce6d15a1","Type":"ContainerDied","Data":"bdd6afc006ffa33ee84de530bef86fe73c0f957f322990c72207ad2c7347a578"} Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.671845 4702 scope.go:117] "RemoveContainer" containerID="2bf7d47d8214510a9644b94fba3cd63a427233d94e0cb83dad93d7620cba1ae5" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.672014 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v4l92" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.682248 4702 generic.go:334] "Generic (PLEG): container finished" podID="8d966268-b412-4449-beb2-56619ab95323" containerID="9b9781a1a444d17c459911255bbf537ac846ffe781dbb9a9784dd36cdab1fd7a" exitCode=0 Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.682327 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" event={"ID":"8d966268-b412-4449-beb2-56619ab95323","Type":"ContainerDied","Data":"9b9781a1a444d17c459911255bbf537ac846ffe781dbb9a9784dd36cdab1fd7a"} Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.682356 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" event={"ID":"8d966268-b412-4449-beb2-56619ab95323","Type":"ContainerStarted","Data":"67d4be5006dd688938944229d6adc80a94eb5dcd01def0f846cf3fce0f269974"} Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.683888 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.690004 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba3a34cc-85f8-4aa9-b88e-b582a2558b9c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ba3a34cc-85f8-4aa9-b88e-b582a2558b9c" (UID: "ba3a34cc-85f8-4aa9-b88e-b582a2558b9c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.690905 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85ks2" event={"ID":"88ad8517-bb0c-42f2-9416-375284ef0db7","Type":"ContainerStarted","Data":"4119317f0f4e1bc510413181932e06afd7b2aa098c00562dff23b9a8225d1c9a"} Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.708995 4702 generic.go:334] "Generic (PLEG): container finished" podID="ba3a34cc-85f8-4aa9-b88e-b582a2558b9c" containerID="99b55e8a69f5a88d7c475162f87cbbfdfe13d61a4768ca8ab7d83e899936e6a4" exitCode=0 Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.709065 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9k8cb" event={"ID":"ba3a34cc-85f8-4aa9-b88e-b582a2558b9c","Type":"ContainerDied","Data":"99b55e8a69f5a88d7c475162f87cbbfdfe13d61a4768ca8ab7d83e899936e6a4"} Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.709105 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9k8cb" event={"ID":"ba3a34cc-85f8-4aa9-b88e-b582a2558b9c","Type":"ContainerDied","Data":"1e46cbe54a60909661821049ed73f103e1e987b3a65f9057591f688425d5d1f9"} Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.709195 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9k8cb" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.716398 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" podStartSLOduration=6.716368161 podStartE2EDuration="6.716368161s" podCreationTimestamp="2025-12-03 11:29:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:29:22.702536159 +0000 UTC m=+1546.538464623" watchObservedRunningTime="2025-12-03 11:29:22.716368161 +0000 UTC m=+1546.552296625" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.722185 4702 scope.go:117] "RemoveContainer" containerID="7ab9702d1eba58bdcbd42289ca5f8483dd23bc3af2faf4d8218b3f9ea4135064" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.734844 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ba3a34cc-85f8-4aa9-b88e-b582a2558b9c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.734883 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5bbw\" (UniqueName: \"kubernetes.io/projected/ba3a34cc-85f8-4aa9-b88e-b582a2558b9c-kube-api-access-g5bbw\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.737661 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-85ks2" podStartSLOduration=4.871369314 podStartE2EDuration="8.737639723s" podCreationTimestamp="2025-12-03 11:29:14 +0000 UTC" firstStartedPulling="2025-12-03 11:29:17.028201469 +0000 UTC m=+1540.864129933" lastFinishedPulling="2025-12-03 11:29:20.894471878 +0000 UTC m=+1544.730400342" observedRunningTime="2025-12-03 11:29:22.733428014 +0000 UTC m=+1546.569356478" watchObservedRunningTime="2025-12-03 11:29:22.737639723 +0000 UTC m=+1546.573568187" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.788803 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-v4l92"] Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.806596 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v4l92"] Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.823806 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9k8cb"] Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.840741 4702 scope.go:117] "RemoveContainer" containerID="c686f29a77a538f6b1cf5e7794e3b873eee6109ca2704c0f01f176cfe26474b3" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.843289 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9k8cb"] Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.900974 4702 scope.go:117] "RemoveContainer" containerID="99b55e8a69f5a88d7c475162f87cbbfdfe13d61a4768ca8ab7d83e899936e6a4" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.962924 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8812c45-da81-47ae-ba43-e181f2545cef" path="/var/lib/kubelet/pods/a8812c45-da81-47ae-ba43-e181f2545cef/volumes" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.964434 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b14ee971-d6bc-478a-982a-998bce6d15a1" path="/var/lib/kubelet/pods/b14ee971-d6bc-478a-982a-998bce6d15a1/volumes" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.965714 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba3a34cc-85f8-4aa9-b88e-b582a2558b9c" path="/var/lib/kubelet/pods/ba3a34cc-85f8-4aa9-b88e-b582a2558b9c/volumes" Dec 03 11:29:22 crc kubenswrapper[4702]: I1203 11:29:22.985882 4702 scope.go:117] "RemoveContainer" containerID="a0841d7b4a957c4c389834c52e4259e2bc2af79cce72e3fd7de21a0f5d1743a3" Dec 03 11:29:23 crc kubenswrapper[4702]: I1203 11:29:23.051698 4702 scope.go:117] "RemoveContainer" containerID="c492d152ed82870fbf4615e4506b014917f34cfcba2b1349c800dd9242e7cb30" Dec 03 11:29:23 crc kubenswrapper[4702]: I1203 11:29:23.118479 4702 scope.go:117] "RemoveContainer" containerID="99b55e8a69f5a88d7c475162f87cbbfdfe13d61a4768ca8ab7d83e899936e6a4" Dec 03 11:29:23 crc kubenswrapper[4702]: E1203 11:29:23.118982 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99b55e8a69f5a88d7c475162f87cbbfdfe13d61a4768ca8ab7d83e899936e6a4\": container with ID starting with 99b55e8a69f5a88d7c475162f87cbbfdfe13d61a4768ca8ab7d83e899936e6a4 not found: ID does not exist" containerID="99b55e8a69f5a88d7c475162f87cbbfdfe13d61a4768ca8ab7d83e899936e6a4" Dec 03 11:29:23 crc kubenswrapper[4702]: I1203 11:29:23.119024 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99b55e8a69f5a88d7c475162f87cbbfdfe13d61a4768ca8ab7d83e899936e6a4"} err="failed to get container status \"99b55e8a69f5a88d7c475162f87cbbfdfe13d61a4768ca8ab7d83e899936e6a4\": rpc error: code = NotFound desc = could not find container \"99b55e8a69f5a88d7c475162f87cbbfdfe13d61a4768ca8ab7d83e899936e6a4\": container with ID starting with 99b55e8a69f5a88d7c475162f87cbbfdfe13d61a4768ca8ab7d83e899936e6a4 not found: ID does not exist" Dec 03 11:29:23 crc kubenswrapper[4702]: I1203 11:29:23.119059 4702 scope.go:117] "RemoveContainer" containerID="a0841d7b4a957c4c389834c52e4259e2bc2af79cce72e3fd7de21a0f5d1743a3" Dec 03 11:29:23 crc kubenswrapper[4702]: E1203 11:29:23.121212 4702 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0841d7b4a957c4c389834c52e4259e2bc2af79cce72e3fd7de21a0f5d1743a3\": container with ID starting with a0841d7b4a957c4c389834c52e4259e2bc2af79cce72e3fd7de21a0f5d1743a3 not found: ID does not exist" containerID="a0841d7b4a957c4c389834c52e4259e2bc2af79cce72e3fd7de21a0f5d1743a3" Dec 03 11:29:23 crc kubenswrapper[4702]: I1203 11:29:23.121260 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0841d7b4a957c4c389834c52e4259e2bc2af79cce72e3fd7de21a0f5d1743a3"} err="failed to get container status \"a0841d7b4a957c4c389834c52e4259e2bc2af79cce72e3fd7de21a0f5d1743a3\": rpc error: code = NotFound desc = could not find container \"a0841d7b4a957c4c389834c52e4259e2bc2af79cce72e3fd7de21a0f5d1743a3\": container with ID starting with a0841d7b4a957c4c389834c52e4259e2bc2af79cce72e3fd7de21a0f5d1743a3 not found: ID does not exist" Dec 03 11:29:23 crc kubenswrapper[4702]: I1203 11:29:23.121301 4702 scope.go:117] "RemoveContainer" containerID="c492d152ed82870fbf4615e4506b014917f34cfcba2b1349c800dd9242e7cb30" Dec 03 11:29:23 crc kubenswrapper[4702]: E1203 11:29:23.122389 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c492d152ed82870fbf4615e4506b014917f34cfcba2b1349c800dd9242e7cb30\": container with ID starting with c492d152ed82870fbf4615e4506b014917f34cfcba2b1349c800dd9242e7cb30 not found: ID does not exist" containerID="c492d152ed82870fbf4615e4506b014917f34cfcba2b1349c800dd9242e7cb30" Dec 03 11:29:23 crc kubenswrapper[4702]: I1203 11:29:23.122430 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c492d152ed82870fbf4615e4506b014917f34cfcba2b1349c800dd9242e7cb30"} err="failed to get container status \"c492d152ed82870fbf4615e4506b014917f34cfcba2b1349c800dd9242e7cb30\": rpc error: code = NotFound desc = could not find container \"c492d152ed82870fbf4615e4506b014917f34cfcba2b1349c800dd9242e7cb30\": container with ID starting with c492d152ed82870fbf4615e4506b014917f34cfcba2b1349c800dd9242e7cb30 not found: ID does not exist" Dec 03 11:29:24 crc kubenswrapper[4702]: I1203 11:29:24.548411 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-85ks2" Dec 03 11:29:24 crc kubenswrapper[4702]: I1203 11:29:24.548838 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-85ks2" Dec 03 11:29:24 crc kubenswrapper[4702]: I1203 11:29:24.676334 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-85ks2" Dec 03 11:29:26 crc kubenswrapper[4702]: I1203 11:29:26.869913 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"90e9786f-3e0d-4a23-b624-b49a3d386784","Type":"ContainerStarted","Data":"156566ea32d1e2a537fc887baef13406af689ba5b1c54d16092e72a0770460ea"} Dec 03 11:29:27 crc kubenswrapper[4702]: I1203 11:29:27.886446 4702 generic.go:334] "Generic (PLEG): container finished" podID="4dcfb963-e8b9-4053-8de9-b1d127e6abfa" containerID="6d0ecece3d77561615396eefaec02567322e8c7aff8df432d1ee6b49f4aaa42b" exitCode=0 Dec 03 11:29:27 crc kubenswrapper[4702]: I1203 11:29:27.886676 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6r87r" 
event={"ID":"4dcfb963-e8b9-4053-8de9-b1d127e6abfa","Type":"ContainerDied","Data":"6d0ecece3d77561615396eefaec02567322e8c7aff8df432d1ee6b49f4aaa42b"} Dec 03 11:29:27 crc kubenswrapper[4702]: I1203 11:29:27.897039 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:29:27 crc kubenswrapper[4702]: I1203 11:29:27.994528 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-vsswz"] Dec 03 11:29:27 crc kubenswrapper[4702]: I1203 11:29:27.994910 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-vsswz" podUID="07456a7d-9f55-45e0-a920-d6cf1d092c0c" containerName="dnsmasq-dns" containerID="cri-o://ad7cabe93a2e69ec27f0611b2889856e663aa31a9cd3777405c044cfaf455192" gracePeriod=10 Dec 03 11:29:28 crc kubenswrapper[4702]: I1203 11:29:28.914911 4702 generic.go:334] "Generic (PLEG): container finished" podID="07456a7d-9f55-45e0-a920-d6cf1d092c0c" containerID="ad7cabe93a2e69ec27f0611b2889856e663aa31a9cd3777405c044cfaf455192" exitCode=0 Dec 03 11:29:28 crc kubenswrapper[4702]: I1203 11:29:28.915063 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-vsswz" event={"ID":"07456a7d-9f55-45e0-a920-d6cf1d092c0c","Type":"ContainerDied","Data":"ad7cabe93a2e69ec27f0611b2889856e663aa31a9cd3777405c044cfaf455192"} Dec 03 11:29:28 crc kubenswrapper[4702]: I1203 11:29:28.923006 4702 generic.go:334] "Generic (PLEG): container finished" podID="34d869ae-eae9-4bf6-b05e-cf40504ccdb6" containerID="e0cb0c5f34421fd7d055849726b234e30a1295b20d66bf6adc32dd351bc8adbe" exitCode=0 Dec 03 11:29:28 crc kubenswrapper[4702]: I1203 11:29:28.923304 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hq9rm" event={"ID":"34d869ae-eae9-4bf6-b05e-cf40504ccdb6","Type":"ContainerDied","Data":"e0cb0c5f34421fd7d055849726b234e30a1295b20d66bf6adc32dd351bc8adbe"} Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.642651 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-vsswz" Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.657066 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hq9rm" Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.768475 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-ovsdbserver-nb\") pod \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\" (UID: \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\") " Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.769769 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhj4j\" (UniqueName: \"kubernetes.io/projected/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-kube-api-access-lhj4j\") pod \"34d869ae-eae9-4bf6-b05e-cf40504ccdb6\" (UID: \"34d869ae-eae9-4bf6-b05e-cf40504ccdb6\") " Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.769870 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-combined-ca-bundle\") pod \"34d869ae-eae9-4bf6-b05e-cf40504ccdb6\" (UID: \"34d869ae-eae9-4bf6-b05e-cf40504ccdb6\") " Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.769952 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-config\") pod \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\" (UID: \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\") " Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.769983 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-config-data\") pod \"34d869ae-eae9-4bf6-b05e-cf40504ccdb6\" (UID: \"34d869ae-eae9-4bf6-b05e-cf40504ccdb6\") " Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.770058 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-dns-svc\") pod \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\" (UID: \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\") " Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.770109 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-db-sync-config-data\") pod \"34d869ae-eae9-4bf6-b05e-cf40504ccdb6\" (UID: \"34d869ae-eae9-4bf6-b05e-cf40504ccdb6\") " Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.770158 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-ovsdbserver-sb\") pod \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\" (UID: \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\") " Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.770196 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-749hj\" (UniqueName: \"kubernetes.io/projected/07456a7d-9f55-45e0-a920-d6cf1d092c0c-kube-api-access-749hj\") pod \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\" (UID: \"07456a7d-9f55-45e0-a920-d6cf1d092c0c\") " Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.777285 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod 
"34d869ae-eae9-4bf6-b05e-cf40504ccdb6" (UID: "34d869ae-eae9-4bf6-b05e-cf40504ccdb6"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.778389 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-kube-api-access-lhj4j" (OuterVolumeSpecName: "kube-api-access-lhj4j") pod "34d869ae-eae9-4bf6-b05e-cf40504ccdb6" (UID: "34d869ae-eae9-4bf6-b05e-cf40504ccdb6"). InnerVolumeSpecName "kube-api-access-lhj4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.780493 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07456a7d-9f55-45e0-a920-d6cf1d092c0c-kube-api-access-749hj" (OuterVolumeSpecName: "kube-api-access-749hj") pod "07456a7d-9f55-45e0-a920-d6cf1d092c0c" (UID: "07456a7d-9f55-45e0-a920-d6cf1d092c0c"). InnerVolumeSpecName "kube-api-access-749hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.807951 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34d869ae-eae9-4bf6-b05e-cf40504ccdb6" (UID: "34d869ae-eae9-4bf6-b05e-cf40504ccdb6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.834171 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "07456a7d-9f55-45e0-a920-d6cf1d092c0c" (UID: "07456a7d-9f55-45e0-a920-d6cf1d092c0c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.834693 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-config-data" (OuterVolumeSpecName: "config-data") pod "34d869ae-eae9-4bf6-b05e-cf40504ccdb6" (UID: "34d869ae-eae9-4bf6-b05e-cf40504ccdb6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.835408 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "07456a7d-9f55-45e0-a920-d6cf1d092c0c" (UID: "07456a7d-9f55-45e0-a920-d6cf1d092c0c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.843273 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "07456a7d-9f55-45e0-a920-d6cf1d092c0c" (UID: "07456a7d-9f55-45e0-a920-d6cf1d092c0c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.853689 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-config" (OuterVolumeSpecName: "config") pod "07456a7d-9f55-45e0-a920-d6cf1d092c0c" (UID: "07456a7d-9f55-45e0-a920-d6cf1d092c0c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.875181 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.875257 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.875276 4702 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.875289 4702 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.875314 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.875329 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-749hj\" (UniqueName: \"kubernetes.io/projected/07456a7d-9f55-45e0-a920-d6cf1d092c0c-kube-api-access-749hj\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.875345 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07456a7d-9f55-45e0-a920-d6cf1d092c0c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.875364 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhj4j\" (UniqueName: \"kubernetes.io/projected/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-kube-api-access-lhj4j\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:33 crc kubenswrapper[4702]: I1203 11:29:33.875375 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34d869ae-eae9-4bf6-b05e-cf40504ccdb6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:34 crc kubenswrapper[4702]: I1203 11:29:34.001473 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-vsswz" event={"ID":"07456a7d-9f55-45e0-a920-d6cf1d092c0c","Type":"ContainerDied","Data":"d2cf77579eaaddc260551a4b925c52c83b25943033b74f2b37267d84fb3bb26c"} Dec 03 11:29:34 crc kubenswrapper[4702]: I1203 11:29:34.001512 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-vsswz" Dec 03 11:29:34 crc kubenswrapper[4702]: I1203 11:29:34.001550 4702 scope.go:117] "RemoveContainer" containerID="ad7cabe93a2e69ec27f0611b2889856e663aa31a9cd3777405c044cfaf455192" Dec 03 11:29:34 crc kubenswrapper[4702]: I1203 11:29:34.007120 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hq9rm" event={"ID":"34d869ae-eae9-4bf6-b05e-cf40504ccdb6","Type":"ContainerDied","Data":"8d478e150cefee053fd0fab217dda46aeb338f521b07fcbe667c939d0bd9d2f3"} Dec 03 11:29:34 crc kubenswrapper[4702]: I1203 11:29:34.007167 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d478e150cefee053fd0fab217dda46aeb338f521b07fcbe667c939d0bd9d2f3" Dec 03 11:29:34 crc kubenswrapper[4702]: I1203 11:29:34.007183 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hq9rm" Dec 03 11:29:34 crc kubenswrapper[4702]: I1203 11:29:34.054916 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-vsswz"] Dec 03 11:29:34 crc kubenswrapper[4702]: I1203 11:29:34.069569 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-vsswz"] Dec 03 11:29:34 crc kubenswrapper[4702]: I1203 11:29:34.586448 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-85ks2" Dec 03 11:29:34 crc kubenswrapper[4702]: I1203 11:29:34.646319 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-85ks2"] Dec 03 11:29:34 crc kubenswrapper[4702]: I1203 11:29:34.956480 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07456a7d-9f55-45e0-a920-d6cf1d092c0c" path="/var/lib/kubelet/pods/07456a7d-9f55-45e0-a920-d6cf1d092c0c/volumes" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.025349 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-85ks2" podUID="88ad8517-bb0c-42f2-9416-375284ef0db7" containerName="registry-server" containerID="cri-o://4119317f0f4e1bc510413181932e06afd7b2aa098c00562dff23b9a8225d1c9a" gracePeriod=2 Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.173979 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pd52h"] Dec 03 11:29:35 crc kubenswrapper[4702]: E1203 11:29:35.174502 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3a34cc-85f8-4aa9-b88e-b582a2558b9c" containerName="registry-server" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.174522 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3a34cc-85f8-4aa9-b88e-b582a2558b9c" containerName="registry-server" Dec 03 11:29:35 crc kubenswrapper[4702]: E1203 11:29:35.174538 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3a34cc-85f8-4aa9-b88e-b582a2558b9c" containerName="extract-utilities" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.174547 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3a34cc-85f8-4aa9-b88e-b582a2558b9c" containerName="extract-utilities" Dec 03 11:29:35 crc kubenswrapper[4702]: E1203 11:29:35.174557 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a7d4216-b31e-478d-9bc0-0656a1480178" containerName="init" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.174563 4702 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6a7d4216-b31e-478d-9bc0-0656a1480178" containerName="init" Dec 03 11:29:35 crc kubenswrapper[4702]: E1203 11:29:35.174578 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b14ee971-d6bc-478a-982a-998bce6d15a1" containerName="extract-content" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.174584 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="b14ee971-d6bc-478a-982a-998bce6d15a1" containerName="extract-content" Dec 03 11:29:35 crc kubenswrapper[4702]: E1203 11:29:35.174605 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b14ee971-d6bc-478a-982a-998bce6d15a1" containerName="extract-utilities" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.174611 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="b14ee971-d6bc-478a-982a-998bce6d15a1" containerName="extract-utilities" Dec 03 11:29:35 crc kubenswrapper[4702]: E1203 11:29:35.174624 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3a34cc-85f8-4aa9-b88e-b582a2558b9c" containerName="extract-content" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.174629 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3a34cc-85f8-4aa9-b88e-b582a2558b9c" containerName="extract-content" Dec 03 11:29:35 crc kubenswrapper[4702]: E1203 11:29:35.174638 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8812c45-da81-47ae-ba43-e181f2545cef" containerName="init" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.174644 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8812c45-da81-47ae-ba43-e181f2545cef" containerName="init" Dec 03 11:29:35 crc kubenswrapper[4702]: E1203 11:29:35.174658 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07456a7d-9f55-45e0-a920-d6cf1d092c0c" containerName="init" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.174664 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="07456a7d-9f55-45e0-a920-d6cf1d092c0c" containerName="init" Dec 03 11:29:35 crc kubenswrapper[4702]: E1203 11:29:35.174674 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07456a7d-9f55-45e0-a920-d6cf1d092c0c" containerName="dnsmasq-dns" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.174680 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="07456a7d-9f55-45e0-a920-d6cf1d092c0c" containerName="dnsmasq-dns" Dec 03 11:29:35 crc kubenswrapper[4702]: E1203 11:29:35.174691 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b14ee971-d6bc-478a-982a-998bce6d15a1" containerName="registry-server" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.174697 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="b14ee971-d6bc-478a-982a-998bce6d15a1" containerName="registry-server" Dec 03 11:29:35 crc kubenswrapper[4702]: E1203 11:29:35.174708 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d869ae-eae9-4bf6-b05e-cf40504ccdb6" containerName="glance-db-sync" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.174714 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d869ae-eae9-4bf6-b05e-cf40504ccdb6" containerName="glance-db-sync" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.174957 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="07456a7d-9f55-45e0-a920-d6cf1d092c0c" containerName="dnsmasq-dns" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.174974 4702 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6a7d4216-b31e-478d-9bc0-0656a1480178" containerName="init" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.174982 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8812c45-da81-47ae-ba43-e181f2545cef" containerName="init" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.174997 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="b14ee971-d6bc-478a-982a-998bce6d15a1" containerName="registry-server" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.175012 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="34d869ae-eae9-4bf6-b05e-cf40504ccdb6" containerName="glance-db-sync" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.175023 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba3a34cc-85f8-4aa9-b88e-b582a2558b9c" containerName="registry-server" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.176387 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.221691 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pd52h"] Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.262097 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-vsswz" podUID="07456a7d-9f55-45e0-a920-d6cf1d092c0c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.145:5353: i/o timeout" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.308969 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-pd52h\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.309094 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-config\") pod \"dnsmasq-dns-785d8bcb8c-pd52h\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.309145 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-pd52h\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.309184 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-pd52h\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.309212 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-pd52h\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 
11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.309254 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pgwq\" (UniqueName: \"kubernetes.io/projected/8d77efc6-ba78-4be8-a834-5b677bf05631-kube-api-access-7pgwq\") pod \"dnsmasq-dns-785d8bcb8c-pd52h\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.411571 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-pd52h\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.412026 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-config\") pod \"dnsmasq-dns-785d8bcb8c-pd52h\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.412204 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-pd52h\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.412361 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-pd52h\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.412511 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-pd52h\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.412638 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pgwq\" (UniqueName: \"kubernetes.io/projected/8d77efc6-ba78-4be8-a834-5b677bf05631-kube-api-access-7pgwq\") pod \"dnsmasq-dns-785d8bcb8c-pd52h\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.413073 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-pd52h\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.413246 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-pd52h\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.413308 
4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-pd52h\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.413407 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-pd52h\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.413846 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-config\") pod \"dnsmasq-dns-785d8bcb8c-pd52h\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.441261 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pgwq\" (UniqueName: \"kubernetes.io/projected/8d77efc6-ba78-4be8-a834-5b677bf05631-kube-api-access-7pgwq\") pod \"dnsmasq-dns-785d8bcb8c-pd52h\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 11:29:35 crc kubenswrapper[4702]: I1203 11:29:35.507609 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.055671 4702 generic.go:334] "Generic (PLEG): container finished" podID="88ad8517-bb0c-42f2-9416-375284ef0db7" containerID="4119317f0f4e1bc510413181932e06afd7b2aa098c00562dff23b9a8225d1c9a" exitCode=0 Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.056030 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85ks2" event={"ID":"88ad8517-bb0c-42f2-9416-375284ef0db7","Type":"ContainerDied","Data":"4119317f0f4e1bc510413181932e06afd7b2aa098c00562dff23b9a8225d1c9a"} Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.065160 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.067580 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.070496 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-wngfr" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.070787 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.083092 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.088653 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.240391 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d8b3544-ab03-4842-9feb-fb3164cd3808-config-data\") pod \"glance-default-external-api-0\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.240462 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d8b3544-ab03-4842-9feb-fb3164cd3808-scripts\") pod \"glance-default-external-api-0\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.241059 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d8b3544-ab03-4842-9feb-fb3164cd3808-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.241182 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.241254 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xt6d\" (UniqueName: \"kubernetes.io/projected/2d8b3544-ab03-4842-9feb-fb3164cd3808-kube-api-access-5xt6d\") pod \"glance-default-external-api-0\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.241331 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d8b3544-ab03-4842-9feb-fb3164cd3808-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.241431 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d8b3544-ab03-4842-9feb-fb3164cd3808-logs\") pod \"glance-default-external-api-0\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " 
pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.343948 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d8b3544-ab03-4842-9feb-fb3164cd3808-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.344030 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d8b3544-ab03-4842-9feb-fb3164cd3808-logs\") pod \"glance-default-external-api-0\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.344138 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d8b3544-ab03-4842-9feb-fb3164cd3808-config-data\") pod \"glance-default-external-api-0\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.344177 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d8b3544-ab03-4842-9feb-fb3164cd3808-scripts\") pod \"glance-default-external-api-0\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.344294 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d8b3544-ab03-4842-9feb-fb3164cd3808-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.344331 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.344360 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xt6d\" (UniqueName: \"kubernetes.io/projected/2d8b3544-ab03-4842-9feb-fb3164cd3808-kube-api-access-5xt6d\") pod \"glance-default-external-api-0\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.344843 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.344998 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d8b3544-ab03-4842-9feb-fb3164cd3808-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 
11:29:36.349750 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d8b3544-ab03-4842-9feb-fb3164cd3808-logs\") pod \"glance-default-external-api-0\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.351808 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d8b3544-ab03-4842-9feb-fb3164cd3808-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.354570 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d8b3544-ab03-4842-9feb-fb3164cd3808-scripts\") pod \"glance-default-external-api-0\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.355046 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d8b3544-ab03-4842-9feb-fb3164cd3808-config-data\") pod \"glance-default-external-api-0\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.370623 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xt6d\" (UniqueName: \"kubernetes.io/projected/2d8b3544-ab03-4842-9feb-fb3164cd3808-kube-api-access-5xt6d\") pod \"glance-default-external-api-0\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.380419 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.400422 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.402972 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.403622 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.406320 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.415868 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.549059 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eff669e5-d71e-43d2-a59d-9410c8fd46cf-logs\") pod \"glance-default-internal-api-0\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.549178 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff669e5-d71e-43d2-a59d-9410c8fd46cf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.549211 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eff669e5-d71e-43d2-a59d-9410c8fd46cf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.549980 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.550053 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff669e5-d71e-43d2-a59d-9410c8fd46cf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.550100 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff669e5-d71e-43d2-a59d-9410c8fd46cf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.550164 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hcdn\" (UniqueName: \"kubernetes.io/projected/eff669e5-d71e-43d2-a59d-9410c8fd46cf-kube-api-access-8hcdn\") pod \"glance-default-internal-api-0\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.653168 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " 
pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.653244 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff669e5-d71e-43d2-a59d-9410c8fd46cf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.653280 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff669e5-d71e-43d2-a59d-9410c8fd46cf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.653337 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hcdn\" (UniqueName: \"kubernetes.io/projected/eff669e5-d71e-43d2-a59d-9410c8fd46cf-kube-api-access-8hcdn\") pod \"glance-default-internal-api-0\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.653408 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eff669e5-d71e-43d2-a59d-9410c8fd46cf-logs\") pod \"glance-default-internal-api-0\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.653480 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.653497 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff669e5-d71e-43d2-a59d-9410c8fd46cf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.653533 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eff669e5-d71e-43d2-a59d-9410c8fd46cf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.654234 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eff669e5-d71e-43d2-a59d-9410c8fd46cf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.655520 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eff669e5-d71e-43d2-a59d-9410c8fd46cf-logs\") pod \"glance-default-internal-api-0\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 
11:29:36.660257 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff669e5-d71e-43d2-a59d-9410c8fd46cf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.672035 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff669e5-d71e-43d2-a59d-9410c8fd46cf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.690165 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff669e5-d71e-43d2-a59d-9410c8fd46cf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.703122 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hcdn\" (UniqueName: \"kubernetes.io/projected/eff669e5-d71e-43d2-a59d-9410c8fd46cf-kube-api-access-8hcdn\") pod \"glance-default-internal-api-0\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.865054 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:36 crc kubenswrapper[4702]: I1203 11:29:36.975488 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6r87r" Dec 03 11:29:37 crc kubenswrapper[4702]: I1203 11:29:37.067435 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-scripts\") pod \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " Dec 03 11:29:37 crc kubenswrapper[4702]: I1203 11:29:37.068169 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-fernet-keys\") pod \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " Dec 03 11:29:37 crc kubenswrapper[4702]: I1203 11:29:37.068341 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-combined-ca-bundle\") pod \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " Dec 03 11:29:37 crc kubenswrapper[4702]: I1203 11:29:37.068474 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x54r5\" (UniqueName: \"kubernetes.io/projected/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-kube-api-access-x54r5\") pod \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " Dec 03 11:29:37 crc kubenswrapper[4702]: I1203 11:29:37.068736 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-config-data\") pod \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " Dec 03 11:29:37 crc kubenswrapper[4702]: I1203 11:29:37.068869 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-credential-keys\") pod \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\" (UID: \"4dcfb963-e8b9-4053-8de9-b1d127e6abfa\") " Dec 03 11:29:37 crc kubenswrapper[4702]: I1203 11:29:37.088517 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6r87r" event={"ID":"4dcfb963-e8b9-4053-8de9-b1d127e6abfa","Type":"ContainerDied","Data":"0eb02e16f1ab1ecc0d36180ead20ce29af9cc40f62cb4f36a753b67e07f996a3"} Dec 03 11:29:37 crc kubenswrapper[4702]: I1203 11:29:37.088582 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0eb02e16f1ab1ecc0d36180ead20ce29af9cc40f62cb4f36a753b67e07f996a3" Dec 03 11:29:37 crc kubenswrapper[4702]: I1203 11:29:37.089417 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6r87r" Dec 03 11:29:37 crc kubenswrapper[4702]: I1203 11:29:37.089932 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-scripts" (OuterVolumeSpecName: "scripts") pod "4dcfb963-e8b9-4053-8de9-b1d127e6abfa" (UID: "4dcfb963-e8b9-4053-8de9-b1d127e6abfa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:29:37 crc kubenswrapper[4702]: I1203 11:29:37.091928 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 11:29:37 crc kubenswrapper[4702]: I1203 11:29:37.098434 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-kube-api-access-x54r5" (OuterVolumeSpecName: "kube-api-access-x54r5") pod "4dcfb963-e8b9-4053-8de9-b1d127e6abfa" (UID: "4dcfb963-e8b9-4053-8de9-b1d127e6abfa"). InnerVolumeSpecName "kube-api-access-x54r5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:29:37 crc kubenswrapper[4702]: I1203 11:29:37.121121 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4dcfb963-e8b9-4053-8de9-b1d127e6abfa" (UID: "4dcfb963-e8b9-4053-8de9-b1d127e6abfa"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:29:37 crc kubenswrapper[4702]: I1203 11:29:37.121687 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4dcfb963-e8b9-4053-8de9-b1d127e6abfa" (UID: "4dcfb963-e8b9-4053-8de9-b1d127e6abfa"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:29:37 crc kubenswrapper[4702]: I1203 11:29:37.135587 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-config-data" (OuterVolumeSpecName: "config-data") pod "4dcfb963-e8b9-4053-8de9-b1d127e6abfa" (UID: "4dcfb963-e8b9-4053-8de9-b1d127e6abfa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:29:37 crc kubenswrapper[4702]: I1203 11:29:37.137647 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4dcfb963-e8b9-4053-8de9-b1d127e6abfa" (UID: "4dcfb963-e8b9-4053-8de9-b1d127e6abfa"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:29:37 crc kubenswrapper[4702]: I1203 11:29:37.172862 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:37 crc kubenswrapper[4702]: I1203 11:29:37.172920 4702 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:37 crc kubenswrapper[4702]: I1203 11:29:37.172938 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:37 crc kubenswrapper[4702]: I1203 11:29:37.172969 4702 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:37 crc kubenswrapper[4702]: I1203 11:29:37.172981 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:37 crc kubenswrapper[4702]: I1203 11:29:37.172994 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x54r5\" (UniqueName: \"kubernetes.io/projected/4dcfb963-e8b9-4053-8de9-b1d127e6abfa-kube-api-access-x54r5\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:37 crc kubenswrapper[4702]: E1203 11:29:37.750491 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 03 11:29:37 crc kubenswrapper[4702]: E1203 11:29:37.751008 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4x6vj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-ds27t_openstack(c2d4cceb-3a17-486b-8718-897e52ea39cc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:29:37 crc kubenswrapper[4702]: E1203 11:29:37.752174 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-ds27t" podUID="c2d4cceb-3a17-486b-8718-897e52ea39cc" Dec 03 11:29:38 crc kubenswrapper[4702]: E1203 11:29:38.117868 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-ds27t" podUID="c2d4cceb-3a17-486b-8718-897e52ea39cc" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.178419 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-6r87r"] Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.199310 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-6r87r"] Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.258376 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-r7v59"] Dec 03 11:29:38 crc kubenswrapper[4702]: E1203 11:29:38.260714 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dcfb963-e8b9-4053-8de9-b1d127e6abfa" containerName="keystone-bootstrap" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.260851 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dcfb963-e8b9-4053-8de9-b1d127e6abfa" containerName="keystone-bootstrap" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.261267 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dcfb963-e8b9-4053-8de9-b1d127e6abfa" containerName="keystone-bootstrap" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 
11:29:38.262327 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r7v59" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.266302 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.266331 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jnt45" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.268251 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.269043 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.269188 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.286698 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-r7v59"] Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.400675 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.403198 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-fernet-keys\") pod \"keystone-bootstrap-r7v59\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " pod="openstack/keystone-bootstrap-r7v59" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.403314 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7p7g\" (UniqueName: \"kubernetes.io/projected/65cdf028-89d6-4f6e-8aae-bdf5e8264310-kube-api-access-v7p7g\") pod \"keystone-bootstrap-r7v59\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " pod="openstack/keystone-bootstrap-r7v59" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.403396 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-config-data\") pod \"keystone-bootstrap-r7v59\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " pod="openstack/keystone-bootstrap-r7v59" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.403477 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-combined-ca-bundle\") pod \"keystone-bootstrap-r7v59\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " pod="openstack/keystone-bootstrap-r7v59" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.403590 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-credential-keys\") pod \"keystone-bootstrap-r7v59\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " pod="openstack/keystone-bootstrap-r7v59" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.403649 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-scripts\") pod 
\"keystone-bootstrap-r7v59\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " pod="openstack/keystone-bootstrap-r7v59" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.502254 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.505643 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-scripts\") pod \"keystone-bootstrap-r7v59\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " pod="openstack/keystone-bootstrap-r7v59" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.505770 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-fernet-keys\") pod \"keystone-bootstrap-r7v59\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " pod="openstack/keystone-bootstrap-r7v59" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.505871 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7p7g\" (UniqueName: \"kubernetes.io/projected/65cdf028-89d6-4f6e-8aae-bdf5e8264310-kube-api-access-v7p7g\") pod \"keystone-bootstrap-r7v59\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " pod="openstack/keystone-bootstrap-r7v59" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.505965 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-config-data\") pod \"keystone-bootstrap-r7v59\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " pod="openstack/keystone-bootstrap-r7v59" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.506044 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-combined-ca-bundle\") pod \"keystone-bootstrap-r7v59\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " pod="openstack/keystone-bootstrap-r7v59" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.506177 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-credential-keys\") pod \"keystone-bootstrap-r7v59\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " pod="openstack/keystone-bootstrap-r7v59" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.513241 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-scripts\") pod \"keystone-bootstrap-r7v59\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " pod="openstack/keystone-bootstrap-r7v59" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.513773 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-fernet-keys\") pod \"keystone-bootstrap-r7v59\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " pod="openstack/keystone-bootstrap-r7v59" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.514238 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-credential-keys\") pod \"keystone-bootstrap-r7v59\" 
(UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " pod="openstack/keystone-bootstrap-r7v59" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.514867 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-config-data\") pod \"keystone-bootstrap-r7v59\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " pod="openstack/keystone-bootstrap-r7v59" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.515030 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-combined-ca-bundle\") pod \"keystone-bootstrap-r7v59\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " pod="openstack/keystone-bootstrap-r7v59" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.529317 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7p7g\" (UniqueName: \"kubernetes.io/projected/65cdf028-89d6-4f6e-8aae-bdf5e8264310-kube-api-access-v7p7g\") pod \"keystone-bootstrap-r7v59\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " pod="openstack/keystone-bootstrap-r7v59" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.595850 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r7v59" Dec 03 11:29:38 crc kubenswrapper[4702]: I1203 11:29:38.941330 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dcfb963-e8b9-4053-8de9-b1d127e6abfa" path="/var/lib/kubelet/pods/4dcfb963-e8b9-4053-8de9-b1d127e6abfa/volumes" Dec 03 11:29:44 crc kubenswrapper[4702]: E1203 11:29:44.531906 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4119317f0f4e1bc510413181932e06afd7b2aa098c00562dff23b9a8225d1c9a is running failed: container process not found" containerID="4119317f0f4e1bc510413181932e06afd7b2aa098c00562dff23b9a8225d1c9a" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 11:29:44 crc kubenswrapper[4702]: E1203 11:29:44.534005 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4119317f0f4e1bc510413181932e06afd7b2aa098c00562dff23b9a8225d1c9a is running failed: container process not found" containerID="4119317f0f4e1bc510413181932e06afd7b2aa098c00562dff23b9a8225d1c9a" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 11:29:44 crc kubenswrapper[4702]: E1203 11:29:44.534448 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4119317f0f4e1bc510413181932e06afd7b2aa098c00562dff23b9a8225d1c9a is running failed: container process not found" containerID="4119317f0f4e1bc510413181932e06afd7b2aa098c00562dff23b9a8225d1c9a" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 11:29:44 crc kubenswrapper[4702]: E1203 11:29:44.534484 4702 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4119317f0f4e1bc510413181932e06afd7b2aa098c00562dff23b9a8225d1c9a is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-85ks2" podUID="88ad8517-bb0c-42f2-9416-375284ef0db7" containerName="registry-server" Dec 03 11:29:50 crc kubenswrapper[4702]: E1203 11:29:50.929629 4702 log.go:32] "PullImage 
from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Dec 03 11:29:50 crc kubenswrapper[4702]: E1203 11:29:50.930385 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n5qhx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-wjjt9_openstack(6c60c306-2c56-44e4-8482-e5a72eccd765): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:29:50 crc kubenswrapper[4702]: E1203 11:29:50.931657 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-wjjt9" podUID="6c60c306-2c56-44e4-8482-e5a72eccd765" Dec 03 11:29:51 crc kubenswrapper[4702]: E1203 11:29:51.258990 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 03 11:29:51 crc kubenswrapper[4702]: E1203 11:29:51.259476 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ncfh6h659hc5h586h686h5d7hdbh679h554h595h585h5ddhch5c7h567hcfh57bh5b5h5dfh55ch64hf9h595h566h594h5d8h58bh68fh56dh647h5f4q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-msk6c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(4b53ad8d-06e2-4511-b33a-2ff6c6209861): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:29:51 crc kubenswrapper[4702]: E1203 11:29:51.267135 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-wjjt9" podUID="6c60c306-2c56-44e4-8482-e5a72eccd765" Dec 03 11:29:52 crc kubenswrapper[4702]: E1203 11:29:52.662278 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 03 11:29:52 crc kubenswrapper[4702]: E1203 11:29:52.662816 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p58fb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-tqs2b_openstack(20bf2147-401b-457b-ad27-3c893be5fa2c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:29:52 crc kubenswrapper[4702]: E1203 11:29:52.664114 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-tqs2b" podUID="20bf2147-401b-457b-ad27-3c893be5fa2c" Dec 03 11:29:52 crc kubenswrapper[4702]: I1203 11:29:52.678258 4702 scope.go:117] "RemoveContainer" containerID="d8b5cfb6ce58911227d6d7a979a32b1c57ea50c992814dc47435900ca724b341" Dec 03 11:29:52 crc kubenswrapper[4702]: I1203 11:29:52.997489 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-85ks2" Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.126989 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4rkk\" (UniqueName: \"kubernetes.io/projected/88ad8517-bb0c-42f2-9416-375284ef0db7-kube-api-access-t4rkk\") pod \"88ad8517-bb0c-42f2-9416-375284ef0db7\" (UID: \"88ad8517-bb0c-42f2-9416-375284ef0db7\") " Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.127097 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88ad8517-bb0c-42f2-9416-375284ef0db7-utilities\") pod \"88ad8517-bb0c-42f2-9416-375284ef0db7\" (UID: \"88ad8517-bb0c-42f2-9416-375284ef0db7\") " Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.127188 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88ad8517-bb0c-42f2-9416-375284ef0db7-catalog-content\") pod \"88ad8517-bb0c-42f2-9416-375284ef0db7\" (UID: \"88ad8517-bb0c-42f2-9416-375284ef0db7\") " Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.128927 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88ad8517-bb0c-42f2-9416-375284ef0db7-utilities" (OuterVolumeSpecName: "utilities") pod "88ad8517-bb0c-42f2-9416-375284ef0db7" (UID: "88ad8517-bb0c-42f2-9416-375284ef0db7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.136485 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88ad8517-bb0c-42f2-9416-375284ef0db7-kube-api-access-t4rkk" (OuterVolumeSpecName: "kube-api-access-t4rkk") pod "88ad8517-bb0c-42f2-9416-375284ef0db7" (UID: "88ad8517-bb0c-42f2-9416-375284ef0db7"). InnerVolumeSpecName "kube-api-access-t4rkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.180939 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88ad8517-bb0c-42f2-9416-375284ef0db7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88ad8517-bb0c-42f2-9416-375284ef0db7" (UID: "88ad8517-bb0c-42f2-9416-375284ef0db7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.234050 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88ad8517-bb0c-42f2-9416-375284ef0db7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.234419 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4rkk\" (UniqueName: \"kubernetes.io/projected/88ad8517-bb0c-42f2-9416-375284ef0db7-kube-api-access-t4rkk\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.234440 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88ad8517-bb0c-42f2-9416-375284ef0db7-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.297973 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jtqd2" event={"ID":"0ca96186-ab1f-41b3-a9f4-c89220b757da","Type":"ContainerStarted","Data":"5509dd1193f09ef67b047350644f34200351689344ac2325ad460db94c52caf3"} Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.321789 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"90e9786f-3e0d-4a23-b624-b49a3d386784","Type":"ContainerStarted","Data":"730a8b3fa4a563377a9dbf10b62c27bbff1ef2c982037adb4c868520fc1d6910"} Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.325064 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-jtqd2" podStartSLOduration=4.770067284 podStartE2EDuration="37.325030046s" podCreationTimestamp="2025-12-03 11:29:16 +0000 UTC" firstStartedPulling="2025-12-03 11:29:20.123518201 +0000 UTC m=+1543.959446665" lastFinishedPulling="2025-12-03 11:29:52.678480963 +0000 UTC m=+1576.514409427" observedRunningTime="2025-12-03 11:29:53.318992855 +0000 UTC m=+1577.154921329" watchObservedRunningTime="2025-12-03 11:29:53.325030046 +0000 UTC m=+1577.160958510" Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.327951 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-85ks2" event={"ID":"88ad8517-bb0c-42f2-9416-375284ef0db7","Type":"ContainerDied","Data":"47df90550d3fb59666d91120d11a49449c43ddc4583c7636051c0abcbf23a884"} Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.328128 4702 scope.go:117] "RemoveContainer" containerID="4119317f0f4e1bc510413181932e06afd7b2aa098c00562dff23b9a8225d1c9a" Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.328126 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-85ks2" Dec 03 11:29:53 crc kubenswrapper[4702]: E1203 11:29:53.333098 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-tqs2b" podUID="20bf2147-401b-457b-ad27-3c893be5fa2c" Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.370097 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=70.370020911 podStartE2EDuration="1m10.370020911s" podCreationTimestamp="2025-12-03 11:28:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:29:53.356991312 +0000 UTC m=+1577.192919796" watchObservedRunningTime="2025-12-03 11:29:53.370020911 +0000 UTC m=+1577.205949375" Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.418254 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-85ks2"] Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.433889 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-85ks2"] Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.522461 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pd52h"] Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.701571 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.757701 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-r7v59"] Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.818980 4702 scope.go:117] "RemoveContainer" containerID="d1fb32ddc434944fdbba7cf06527f6e968de20e318c1a12e550b9ffa0cafe39b" Dec 03 11:29:53 crc kubenswrapper[4702]: I1203 11:29:53.870269 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 11:29:54 crc kubenswrapper[4702]: I1203 11:29:54.070363 4702 scope.go:117] "RemoveContainer" containerID="b6cb2c393db156cfe6ef69e7d47ef8d29efb37ead84f5c7662dae50e7c797d69" Dec 03 11:29:54 crc kubenswrapper[4702]: I1203 11:29:54.347592 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eff669e5-d71e-43d2-a59d-9410c8fd46cf","Type":"ContainerStarted","Data":"0461c499aea97f5a1097d2e986c8f17951725bf045177d1efff974340950ac1c"} Dec 03 11:29:54 crc kubenswrapper[4702]: I1203 11:29:54.354010 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ds27t" event={"ID":"c2d4cceb-3a17-486b-8718-897e52ea39cc","Type":"ContainerStarted","Data":"f4c7366329e9fbc8693b7c9cda24f6a5889f0b5aaba9acb1380cd579109f0f7f"} Dec 03 11:29:54 crc kubenswrapper[4702]: I1203 11:29:54.356942 4702 generic.go:334] "Generic (PLEG): container finished" podID="32433fde-8c4c-43af-94f5-ab732096cd9d" containerID="cc7bab6c4bcdb67bd28ac08cc403f78ca5b1748a0cc502b406e0ff94a7278db1" exitCode=0 Dec 03 11:29:54 crc kubenswrapper[4702]: I1203 11:29:54.357042 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w8wgv" 
event={"ID":"32433fde-8c4c-43af-94f5-ab732096cd9d","Type":"ContainerDied","Data":"cc7bab6c4bcdb67bd28ac08cc403f78ca5b1748a0cc502b406e0ff94a7278db1"} Dec 03 11:29:54 crc kubenswrapper[4702]: I1203 11:29:54.359829 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" event={"ID":"8d77efc6-ba78-4be8-a834-5b677bf05631","Type":"ContainerStarted","Data":"5d68dbfd6658e32f389d6c0afd68ba1a49bfa81e1beca09ee07d84df1dacc010"} Dec 03 11:29:54 crc kubenswrapper[4702]: I1203 11:29:54.364135 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r7v59" event={"ID":"65cdf028-89d6-4f6e-8aae-bdf5e8264310","Type":"ContainerStarted","Data":"b5d201e7dc8fa636053079934b99bf7c8516891499fc59b7d84422128af1c119"} Dec 03 11:29:54 crc kubenswrapper[4702]: I1203 11:29:54.380808 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-ds27t" podStartSLOduration=4.957597886 podStartE2EDuration="38.38077892s" podCreationTimestamp="2025-12-03 11:29:16 +0000 UTC" firstStartedPulling="2025-12-03 11:29:19.439990091 +0000 UTC m=+1543.275918555" lastFinishedPulling="2025-12-03 11:29:52.863171125 +0000 UTC m=+1576.699099589" observedRunningTime="2025-12-03 11:29:54.375277104 +0000 UTC m=+1578.211205578" watchObservedRunningTime="2025-12-03 11:29:54.38077892 +0000 UTC m=+1578.216707384" Dec 03 11:29:54 crc kubenswrapper[4702]: I1203 11:29:54.955571 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88ad8517-bb0c-42f2-9416-375284ef0db7" path="/var/lib/kubelet/pods/88ad8517-bb0c-42f2-9416-375284ef0db7/volumes" Dec 03 11:29:54 crc kubenswrapper[4702]: I1203 11:29:54.956874 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 11:29:54 crc kubenswrapper[4702]: W1203 11:29:54.962110 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d8b3544_ab03_4842_9feb_fb3164cd3808.slice/crio-610e9314887518fd70e5de6180856c31a2aad426980fd5017693ef3b6f37b510 WatchSource:0}: Error finding container 610e9314887518fd70e5de6180856c31a2aad426980fd5017693ef3b6f37b510: Status 404 returned error can't find the container with id 610e9314887518fd70e5de6180856c31a2aad426980fd5017693ef3b6f37b510 Dec 03 11:29:55 crc kubenswrapper[4702]: I1203 11:29:55.381477 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d8b3544-ab03-4842-9feb-fb3164cd3808","Type":"ContainerStarted","Data":"610e9314887518fd70e5de6180856c31a2aad426980fd5017693ef3b6f37b510"} Dec 03 11:29:55 crc kubenswrapper[4702]: I1203 11:29:55.388501 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r7v59" event={"ID":"65cdf028-89d6-4f6e-8aae-bdf5e8264310","Type":"ContainerStarted","Data":"c68dde3062d8980ea81d1539226a9d5aa4e10038b93c8095bc55487691b6bf57"} Dec 03 11:29:55 crc kubenswrapper[4702]: I1203 11:29:55.391626 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eff669e5-d71e-43d2-a59d-9410c8fd46cf","Type":"ContainerStarted","Data":"3bc32009e08aba03f2c9010f1165d45f61c7a85efc7d78e7145370423bba77ca"} Dec 03 11:29:55 crc kubenswrapper[4702]: I1203 11:29:55.394261 4702 generic.go:334] "Generic (PLEG): container finished" podID="8d77efc6-ba78-4be8-a834-5b677bf05631" containerID="cc637863d8e2acbdffb3cda4737aceb231d1e48f68bab9e3a0e6298b68d65376" exitCode=0 Dec 
03 11:29:55 crc kubenswrapper[4702]: I1203 11:29:55.394298 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" event={"ID":"8d77efc6-ba78-4be8-a834-5b677bf05631","Type":"ContainerDied","Data":"cc637863d8e2acbdffb3cda4737aceb231d1e48f68bab9e3a0e6298b68d65376"} Dec 03 11:29:55 crc kubenswrapper[4702]: I1203 11:29:55.396735 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b53ad8d-06e2-4511-b33a-2ff6c6209861","Type":"ContainerStarted","Data":"c11aaf3fec02583298058b1bab0d56c335f1b17f97e32174a1c075be4b5f5565"} Dec 03 11:29:55 crc kubenswrapper[4702]: I1203 11:29:55.414031 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-r7v59" podStartSLOduration=17.414000164 podStartE2EDuration="17.414000164s" podCreationTimestamp="2025-12-03 11:29:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:29:55.408237611 +0000 UTC m=+1579.244166075" watchObservedRunningTime="2025-12-03 11:29:55.414000164 +0000 UTC m=+1579.249928628" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.171324 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-w8wgv" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.220282 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5n4w\" (UniqueName: \"kubernetes.io/projected/32433fde-8c4c-43af-94f5-ab732096cd9d-kube-api-access-h5n4w\") pod \"32433fde-8c4c-43af-94f5-ab732096cd9d\" (UID: \"32433fde-8c4c-43af-94f5-ab732096cd9d\") " Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.220506 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32433fde-8c4c-43af-94f5-ab732096cd9d-combined-ca-bundle\") pod \"32433fde-8c4c-43af-94f5-ab732096cd9d\" (UID: \"32433fde-8c4c-43af-94f5-ab732096cd9d\") " Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.220601 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/32433fde-8c4c-43af-94f5-ab732096cd9d-config\") pod \"32433fde-8c4c-43af-94f5-ab732096cd9d\" (UID: \"32433fde-8c4c-43af-94f5-ab732096cd9d\") " Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.227267 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32433fde-8c4c-43af-94f5-ab732096cd9d-kube-api-access-h5n4w" (OuterVolumeSpecName: "kube-api-access-h5n4w") pod "32433fde-8c4c-43af-94f5-ab732096cd9d" (UID: "32433fde-8c4c-43af-94f5-ab732096cd9d"). InnerVolumeSpecName "kube-api-access-h5n4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.371247 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32433fde-8c4c-43af-94f5-ab732096cd9d-config" (OuterVolumeSpecName: "config") pod "32433fde-8c4c-43af-94f5-ab732096cd9d" (UID: "32433fde-8c4c-43af-94f5-ab732096cd9d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.378158 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/32433fde-8c4c-43af-94f5-ab732096cd9d-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.378213 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5n4w\" (UniqueName: \"kubernetes.io/projected/32433fde-8c4c-43af-94f5-ab732096cd9d-kube-api-access-h5n4w\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.399057 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32433fde-8c4c-43af-94f5-ab732096cd9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32433fde-8c4c-43af-94f5-ab732096cd9d" (UID: "32433fde-8c4c-43af-94f5-ab732096cd9d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.435663 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eff669e5-d71e-43d2-a59d-9410c8fd46cf","Type":"ContainerStarted","Data":"9250b2e41e11df6f5cdb701470e7ff7c742625e143c09a2a6d196efae91e5a12"} Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.435893 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="eff669e5-d71e-43d2-a59d-9410c8fd46cf" containerName="glance-log" containerID="cri-o://3bc32009e08aba03f2c9010f1165d45f61c7a85efc7d78e7145370423bba77ca" gracePeriod=30 Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.435988 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="eff669e5-d71e-43d2-a59d-9410c8fd46cf" containerName="glance-httpd" containerID="cri-o://9250b2e41e11df6f5cdb701470e7ff7c742625e143c09a2a6d196efae91e5a12" gracePeriod=30 Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.439320 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w8wgv" event={"ID":"32433fde-8c4c-43af-94f5-ab732096cd9d","Type":"ContainerDied","Data":"ade55c30b236ea1604802a84c6aacecb6c988cd8ce4d073ac306e188a924e494"} Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.439370 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ade55c30b236ea1604802a84c6aacecb6c988cd8ce4d073ac306e188a924e494" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.439248 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-w8wgv" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.445930 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d8b3544-ab03-4842-9feb-fb3164cd3808","Type":"ContainerStarted","Data":"3f2a0b14c48cd2331ce1a782d6330fde86ea22cf974d408d8eecd1f70ab57c6b"} Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.477968 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=21.4779457 podStartE2EDuration="21.4779457s" podCreationTimestamp="2025-12-03 11:29:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:29:56.455903416 +0000 UTC m=+1580.291831910" watchObservedRunningTime="2025-12-03 11:29:56.4779457 +0000 UTC m=+1580.313874164" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.480958 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32433fde-8c4c-43af-94f5-ab732096cd9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.655805 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pd52h"] Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.691651 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-v6tdw"] Dec 03 11:29:56 crc kubenswrapper[4702]: E1203 11:29:56.692324 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ad8517-bb0c-42f2-9416-375284ef0db7" containerName="extract-content" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.692354 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ad8517-bb0c-42f2-9416-375284ef0db7" containerName="extract-content" Dec 03 11:29:56 crc kubenswrapper[4702]: E1203 11:29:56.692391 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ad8517-bb0c-42f2-9416-375284ef0db7" containerName="registry-server" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.692401 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ad8517-bb0c-42f2-9416-375284ef0db7" containerName="registry-server" Dec 03 11:29:56 crc kubenswrapper[4702]: E1203 11:29:56.692421 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88ad8517-bb0c-42f2-9416-375284ef0db7" containerName="extract-utilities" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.692431 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="88ad8517-bb0c-42f2-9416-375284ef0db7" containerName="extract-utilities" Dec 03 11:29:56 crc kubenswrapper[4702]: E1203 11:29:56.692458 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32433fde-8c4c-43af-94f5-ab732096cd9d" containerName="neutron-db-sync" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.692488 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="32433fde-8c4c-43af-94f5-ab732096cd9d" containerName="neutron-db-sync" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.692690 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="88ad8517-bb0c-42f2-9416-375284ef0db7" containerName="registry-server" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.692705 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="32433fde-8c4c-43af-94f5-ab732096cd9d" containerName="neutron-db-sync" Dec 03 
11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.694100 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.817508 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-v6tdw"] Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.876824 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5cb45ddb68-kntb7"] Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.884418 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5cb45ddb68-kntb7" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.893589 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.893939 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-72dgz" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.894169 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.894324 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.919586 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-v6tdw\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.919670 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-dns-svc\") pod \"dnsmasq-dns-55f844cf75-v6tdw\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.919852 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-config\") pod \"dnsmasq-dns-55f844cf75-v6tdw\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.920030 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2qw6\" (UniqueName: \"kubernetes.io/projected/98ff257a-9df9-4798-a184-af395ffa6b2e-kube-api-access-c2qw6\") pod \"dnsmasq-dns-55f844cf75-v6tdw\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.920137 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-v6tdw\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.921564 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-v6tdw\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:29:56 crc kubenswrapper[4702]: I1203 11:29:56.957022 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5cb45ddb68-kntb7"] Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.026072 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-v6tdw\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.024924 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-v6tdw\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.026165 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-dns-svc\") pod \"dnsmasq-dns-55f844cf75-v6tdw\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.026241 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-ovndb-tls-certs\") pod \"neutron-5cb45ddb68-kntb7\" (UID: \"f2c41a10-02a8-438d-a25c-18e9caf9e467\") " pod="openstack/neutron-5cb45ddb68-kntb7" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.026306 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-combined-ca-bundle\") pod \"neutron-5cb45ddb68-kntb7\" (UID: \"f2c41a10-02a8-438d-a25c-18e9caf9e467\") " pod="openstack/neutron-5cb45ddb68-kntb7" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.026355 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-config\") pod \"dnsmasq-dns-55f844cf75-v6tdw\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.026375 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdvcn\" (UniqueName: \"kubernetes.io/projected/f2c41a10-02a8-438d-a25c-18e9caf9e467-kube-api-access-cdvcn\") pod \"neutron-5cb45ddb68-kntb7\" (UID: \"f2c41a10-02a8-438d-a25c-18e9caf9e467\") " pod="openstack/neutron-5cb45ddb68-kntb7" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.026394 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-config\") pod \"neutron-5cb45ddb68-kntb7\" (UID: \"f2c41a10-02a8-438d-a25c-18e9caf9e467\") " pod="openstack/neutron-5cb45ddb68-kntb7" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 
11:29:57.026488 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2qw6\" (UniqueName: \"kubernetes.io/projected/98ff257a-9df9-4798-a184-af395ffa6b2e-kube-api-access-c2qw6\") pod \"dnsmasq-dns-55f844cf75-v6tdw\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.026519 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-v6tdw\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.026595 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-httpd-config\") pod \"neutron-5cb45ddb68-kntb7\" (UID: \"f2c41a10-02a8-438d-a25c-18e9caf9e467\") " pod="openstack/neutron-5cb45ddb68-kntb7" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.026722 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-v6tdw\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.029487 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-dns-svc\") pod \"dnsmasq-dns-55f844cf75-v6tdw\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.030293 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-v6tdw\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.030483 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-config\") pod \"dnsmasq-dns-55f844cf75-v6tdw\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.030702 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-v6tdw\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.080280 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2qw6\" (UniqueName: \"kubernetes.io/projected/98ff257a-9df9-4798-a184-af395ffa6b2e-kube-api-access-c2qw6\") pod \"dnsmasq-dns-55f844cf75-v6tdw\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.137590 4702 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-httpd-config\") pod \"neutron-5cb45ddb68-kntb7\" (UID: \"f2c41a10-02a8-438d-a25c-18e9caf9e467\") " pod="openstack/neutron-5cb45ddb68-kntb7" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.137957 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-ovndb-tls-certs\") pod \"neutron-5cb45ddb68-kntb7\" (UID: \"f2c41a10-02a8-438d-a25c-18e9caf9e467\") " pod="openstack/neutron-5cb45ddb68-kntb7" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.137999 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-combined-ca-bundle\") pod \"neutron-5cb45ddb68-kntb7\" (UID: \"f2c41a10-02a8-438d-a25c-18e9caf9e467\") " pod="openstack/neutron-5cb45ddb68-kntb7" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.138041 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdvcn\" (UniqueName: \"kubernetes.io/projected/f2c41a10-02a8-438d-a25c-18e9caf9e467-kube-api-access-cdvcn\") pod \"neutron-5cb45ddb68-kntb7\" (UID: \"f2c41a10-02a8-438d-a25c-18e9caf9e467\") " pod="openstack/neutron-5cb45ddb68-kntb7" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.138059 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-config\") pod \"neutron-5cb45ddb68-kntb7\" (UID: \"f2c41a10-02a8-438d-a25c-18e9caf9e467\") " pod="openstack/neutron-5cb45ddb68-kntb7" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.146852 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-ovndb-tls-certs\") pod \"neutron-5cb45ddb68-kntb7\" (UID: \"f2c41a10-02a8-438d-a25c-18e9caf9e467\") " pod="openstack/neutron-5cb45ddb68-kntb7" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.147513 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-combined-ca-bundle\") pod \"neutron-5cb45ddb68-kntb7\" (UID: \"f2c41a10-02a8-438d-a25c-18e9caf9e467\") " pod="openstack/neutron-5cb45ddb68-kntb7" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.148393 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-httpd-config\") pod \"neutron-5cb45ddb68-kntb7\" (UID: \"f2c41a10-02a8-438d-a25c-18e9caf9e467\") " pod="openstack/neutron-5cb45ddb68-kntb7" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.148915 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-config\") pod \"neutron-5cb45ddb68-kntb7\" (UID: \"f2c41a10-02a8-438d-a25c-18e9caf9e467\") " pod="openstack/neutron-5cb45ddb68-kntb7" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.159892 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdvcn\" (UniqueName: \"kubernetes.io/projected/f2c41a10-02a8-438d-a25c-18e9caf9e467-kube-api-access-cdvcn\") pod 
\"neutron-5cb45ddb68-kntb7\" (UID: \"f2c41a10-02a8-438d-a25c-18e9caf9e467\") " pod="openstack/neutron-5cb45ddb68-kntb7" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.379803 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5cb45ddb68-kntb7" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.480222 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.587300 4702 generic.go:334] "Generic (PLEG): container finished" podID="eff669e5-d71e-43d2-a59d-9410c8fd46cf" containerID="9250b2e41e11df6f5cdb701470e7ff7c742625e143c09a2a6d196efae91e5a12" exitCode=143 Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.587344 4702 generic.go:334] "Generic (PLEG): container finished" podID="eff669e5-d71e-43d2-a59d-9410c8fd46cf" containerID="3bc32009e08aba03f2c9010f1165d45f61c7a85efc7d78e7145370423bba77ca" exitCode=143 Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.587421 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eff669e5-d71e-43d2-a59d-9410c8fd46cf","Type":"ContainerDied","Data":"9250b2e41e11df6f5cdb701470e7ff7c742625e143c09a2a6d196efae91e5a12"} Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.587455 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eff669e5-d71e-43d2-a59d-9410c8fd46cf","Type":"ContainerDied","Data":"3bc32009e08aba03f2c9010f1165d45f61c7a85efc7d78e7145370423bba77ca"} Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.622854 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" event={"ID":"8d77efc6-ba78-4be8-a834-5b677bf05631","Type":"ContainerStarted","Data":"4d9032f22fd3daa37ec14dd6bae591b397a4052dd4f17dc3b4bf7d0831afe34c"} Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.624818 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.644398 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d8b3544-ab03-4842-9feb-fb3164cd3808","Type":"ContainerStarted","Data":"698d980c72df0c7c92dbff1f5609a6ff7652555258443f6e5ab882b5ae971cda"} Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.644589 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2d8b3544-ab03-4842-9feb-fb3164cd3808" containerName="glance-log" containerID="cri-o://3f2a0b14c48cd2331ce1a782d6330fde86ea22cf974d408d8eecd1f70ab57c6b" gracePeriod=30 Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.644696 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2d8b3544-ab03-4842-9feb-fb3164cd3808" containerName="glance-httpd" containerID="cri-o://698d980c72df0c7c92dbff1f5609a6ff7652555258443f6e5ab882b5ae971cda" gracePeriod=30 Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.677086 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" podStartSLOduration=22.677053334 podStartE2EDuration="22.677053334s" podCreationTimestamp="2025-12-03 11:29:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-12-03 11:29:57.660395273 +0000 UTC m=+1581.496323737" watchObservedRunningTime="2025-12-03 11:29:57.677053334 +0000 UTC m=+1581.512981798" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.727098 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=22.727069391 podStartE2EDuration="22.727069391s" podCreationTimestamp="2025-12-03 11:29:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:29:57.706976612 +0000 UTC m=+1581.542905086" watchObservedRunningTime="2025-12-03 11:29:57.727069391 +0000 UTC m=+1581.562997855" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.810154 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.917363 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.917483 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff669e5-d71e-43d2-a59d-9410c8fd46cf-combined-ca-bundle\") pod \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.917552 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eff669e5-d71e-43d2-a59d-9410c8fd46cf-logs\") pod \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.917637 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eff669e5-d71e-43d2-a59d-9410c8fd46cf-httpd-run\") pod \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.917675 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hcdn\" (UniqueName: \"kubernetes.io/projected/eff669e5-d71e-43d2-a59d-9410c8fd46cf-kube-api-access-8hcdn\") pod \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.917697 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff669e5-d71e-43d2-a59d-9410c8fd46cf-config-data\") pod \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.917830 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff669e5-d71e-43d2-a59d-9410c8fd46cf-scripts\") pod \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\" (UID: \"eff669e5-d71e-43d2-a59d-9410c8fd46cf\") " Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.918313 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eff669e5-d71e-43d2-a59d-9410c8fd46cf-logs" 
(OuterVolumeSpecName: "logs") pod "eff669e5-d71e-43d2-a59d-9410c8fd46cf" (UID: "eff669e5-d71e-43d2-a59d-9410c8fd46cf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.918919 4702 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eff669e5-d71e-43d2-a59d-9410c8fd46cf-logs\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.936980 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eff669e5-d71e-43d2-a59d-9410c8fd46cf-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "eff669e5-d71e-43d2-a59d-9410c8fd46cf" (UID: "eff669e5-d71e-43d2-a59d-9410c8fd46cf"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.987409 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff669e5-d71e-43d2-a59d-9410c8fd46cf-scripts" (OuterVolumeSpecName: "scripts") pod "eff669e5-d71e-43d2-a59d-9410c8fd46cf" (UID: "eff669e5-d71e-43d2-a59d-9410c8fd46cf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.987876 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "eff669e5-d71e-43d2-a59d-9410c8fd46cf" (UID: "eff669e5-d71e-43d2-a59d-9410c8fd46cf"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 11:29:57 crc kubenswrapper[4702]: I1203 11:29:57.988947 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eff669e5-d71e-43d2-a59d-9410c8fd46cf-kube-api-access-8hcdn" (OuterVolumeSpecName: "kube-api-access-8hcdn") pod "eff669e5-d71e-43d2-a59d-9410c8fd46cf" (UID: "eff669e5-d71e-43d2-a59d-9410c8fd46cf"). InnerVolumeSpecName "kube-api-access-8hcdn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.020904 4702 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eff669e5-d71e-43d2-a59d-9410c8fd46cf-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.021082 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hcdn\" (UniqueName: \"kubernetes.io/projected/eff669e5-d71e-43d2-a59d-9410c8fd46cf-kube-api-access-8hcdn\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.021168 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff669e5-d71e-43d2-a59d-9410c8fd46cf-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.021280 4702 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.093692 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff669e5-d71e-43d2-a59d-9410c8fd46cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eff669e5-d71e-43d2-a59d-9410c8fd46cf" (UID: "eff669e5-d71e-43d2-a59d-9410c8fd46cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.122393 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff669e5-d71e-43d2-a59d-9410c8fd46cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.140685 4702 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.246402 4702 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.276361 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-v6tdw"] Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.401784 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff669e5-d71e-43d2-a59d-9410c8fd46cf-config-data" (OuterVolumeSpecName: "config-data") pod "eff669e5-d71e-43d2-a59d-9410c8fd46cf" (UID: "eff669e5-d71e-43d2-a59d-9410c8fd46cf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.451385 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff669e5-d71e-43d2-a59d-9410c8fd46cf-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.702401 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.703865 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"eff669e5-d71e-43d2-a59d-9410c8fd46cf","Type":"ContainerDied","Data":"0461c499aea97f5a1097d2e986c8f17951725bf045177d1efff974340950ac1c"} Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.703929 4702 scope.go:117] "RemoveContainer" containerID="9250b2e41e11df6f5cdb701470e7ff7c742625e143c09a2a6d196efae91e5a12" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.703930 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.717691 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.719561 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" event={"ID":"98ff257a-9df9-4798-a184-af395ffa6b2e","Type":"ContainerStarted","Data":"958f6a08a03c7385694911656d7a65da3ac4e3a319aab4072e30a14721aa5706"} Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.734640 4702 generic.go:334] "Generic (PLEG): container finished" podID="2d8b3544-ab03-4842-9feb-fb3164cd3808" containerID="698d980c72df0c7c92dbff1f5609a6ff7652555258443f6e5ab882b5ae971cda" exitCode=0 Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.734682 4702 generic.go:334] "Generic (PLEG): container finished" podID="2d8b3544-ab03-4842-9feb-fb3164cd3808" containerID="3f2a0b14c48cd2331ce1a782d6330fde86ea22cf974d408d8eecd1f70ab57c6b" exitCode=143 Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.734717 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d8b3544-ab03-4842-9feb-fb3164cd3808","Type":"ContainerDied","Data":"698d980c72df0c7c92dbff1f5609a6ff7652555258443f6e5ab882b5ae971cda"} Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.734789 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d8b3544-ab03-4842-9feb-fb3164cd3808","Type":"ContainerDied","Data":"3f2a0b14c48cd2331ce1a782d6330fde86ea22cf974d408d8eecd1f70ab57c6b"} Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.734939 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" podUID="8d77efc6-ba78-4be8-a834-5b677bf05631" containerName="dnsmasq-dns" containerID="cri-o://4d9032f22fd3daa37ec14dd6bae591b397a4052dd4f17dc3b4bf7d0831afe34c" gracePeriod=10 Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.807501 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.861505 4702 scope.go:117] "RemoveContainer" containerID="3bc32009e08aba03f2c9010f1165d45f61c7a85efc7d78e7145370423bba77ca" Dec 03 11:29:58 crc 
kubenswrapper[4702]: I1203 11:29:58.866811 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.928110 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 11:29:58 crc kubenswrapper[4702]: E1203 11:29:58.929495 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff669e5-d71e-43d2-a59d-9410c8fd46cf" containerName="glance-log" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.929520 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff669e5-d71e-43d2-a59d-9410c8fd46cf" containerName="glance-log" Dec 03 11:29:58 crc kubenswrapper[4702]: E1203 11:29:58.929537 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff669e5-d71e-43d2-a59d-9410c8fd46cf" containerName="glance-httpd" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.929544 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff669e5-d71e-43d2-a59d-9410c8fd46cf" containerName="glance-httpd" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.930330 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff669e5-d71e-43d2-a59d-9410c8fd46cf" containerName="glance-httpd" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.930354 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff669e5-d71e-43d2-a59d-9410c8fd46cf" containerName="glance-log" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.931801 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.947379 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.948234 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.974142 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed571ff0-25b3-4f67-8403-c432700a8f49-logs\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.974192 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.974310 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz74s\" (UniqueName: \"kubernetes.io/projected/ed571ff0-25b3-4f67-8403-c432700a8f49-kube-api-access-lz74s\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.974420 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.974457 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.974560 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed571ff0-25b3-4f67-8403-c432700a8f49-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.974588 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:58 crc kubenswrapper[4702]: I1203 11:29:58.974619 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.081085 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz74s\" (UniqueName: \"kubernetes.io/projected/ed571ff0-25b3-4f67-8403-c432700a8f49-kube-api-access-lz74s\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.081213 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.081265 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.084657 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed571ff0-25b3-4f67-8403-c432700a8f49-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.084883 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-scripts\") pod \"glance-default-internal-api-0\" 
(UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.085271 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.085566 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed571ff0-25b3-4f67-8403-c432700a8f49-logs\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.085609 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.086501 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.089299 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed571ff0-25b3-4f67-8403-c432700a8f49-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.089883 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed571ff0-25b3-4f67-8403-c432700a8f49-logs\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.105053 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eff669e5-d71e-43d2-a59d-9410c8fd46cf" path="/var/lib/kubelet/pods/eff669e5-d71e-43d2-a59d-9410c8fd46cf/volumes" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.105677 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.106139 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5cb45ddb68-kntb7"] Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.106174 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.138315 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.176340 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.179187 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.179495 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz74s\" (UniqueName: \"kubernetes.io/projected/ed571ff0-25b3-4f67-8403-c432700a8f49-kube-api-access-lz74s\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.214035 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.260737 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.560476 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.604362 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d8b3544-ab03-4842-9feb-fb3164cd3808-combined-ca-bundle\") pod \"2d8b3544-ab03-4842-9feb-fb3164cd3808\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.604466 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"2d8b3544-ab03-4842-9feb-fb3164cd3808\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.604519 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d8b3544-ab03-4842-9feb-fb3164cd3808-logs\") pod \"2d8b3544-ab03-4842-9feb-fb3164cd3808\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.604573 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d8b3544-ab03-4842-9feb-fb3164cd3808-scripts\") pod \"2d8b3544-ab03-4842-9feb-fb3164cd3808\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.604655 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d8b3544-ab03-4842-9feb-fb3164cd3808-config-data\") pod \"2d8b3544-ab03-4842-9feb-fb3164cd3808\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.604710 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xt6d\" (UniqueName: \"kubernetes.io/projected/2d8b3544-ab03-4842-9feb-fb3164cd3808-kube-api-access-5xt6d\") pod \"2d8b3544-ab03-4842-9feb-fb3164cd3808\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.604748 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d8b3544-ab03-4842-9feb-fb3164cd3808-httpd-run\") pod \"2d8b3544-ab03-4842-9feb-fb3164cd3808\" (UID: \"2d8b3544-ab03-4842-9feb-fb3164cd3808\") " Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.605989 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d8b3544-ab03-4842-9feb-fb3164cd3808-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2d8b3544-ab03-4842-9feb-fb3164cd3808" (UID: "2d8b3544-ab03-4842-9feb-fb3164cd3808"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.606268 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d8b3544-ab03-4842-9feb-fb3164cd3808-logs" (OuterVolumeSpecName: "logs") pod "2d8b3544-ab03-4842-9feb-fb3164cd3808" (UID: "2d8b3544-ab03-4842-9feb-fb3164cd3808"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.616850 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "2d8b3544-ab03-4842-9feb-fb3164cd3808" (UID: "2d8b3544-ab03-4842-9feb-fb3164cd3808"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.616844 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d8b3544-ab03-4842-9feb-fb3164cd3808-kube-api-access-5xt6d" (OuterVolumeSpecName: "kube-api-access-5xt6d") pod "2d8b3544-ab03-4842-9feb-fb3164cd3808" (UID: "2d8b3544-ab03-4842-9feb-fb3164cd3808"). InnerVolumeSpecName "kube-api-access-5xt6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.616968 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d8b3544-ab03-4842-9feb-fb3164cd3808-scripts" (OuterVolumeSpecName: "scripts") pod "2d8b3544-ab03-4842-9feb-fb3164cd3808" (UID: "2d8b3544-ab03-4842-9feb-fb3164cd3808"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.677534 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.716284 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xt6d\" (UniqueName: \"kubernetes.io/projected/2d8b3544-ab03-4842-9feb-fb3164cd3808-kube-api-access-5xt6d\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.716334 4702 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d8b3544-ab03-4842-9feb-fb3164cd3808-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.716380 4702 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.716393 4702 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d8b3544-ab03-4842-9feb-fb3164cd3808-logs\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.716408 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d8b3544-ab03-4842-9feb-fb3164cd3808-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.783731 4702 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.817698 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-config\") pod \"8d77efc6-ba78-4be8-a834-5b677bf05631\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.817788 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-ovsdbserver-nb\") pod \"8d77efc6-ba78-4be8-a834-5b677bf05631\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.817923 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pgwq\" (UniqueName: \"kubernetes.io/projected/8d77efc6-ba78-4be8-a834-5b677bf05631-kube-api-access-7pgwq\") pod \"8d77efc6-ba78-4be8-a834-5b677bf05631\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.817976 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-dns-swift-storage-0\") pod \"8d77efc6-ba78-4be8-a834-5b677bf05631\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.818097 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-ovsdbserver-sb\") pod \"8d77efc6-ba78-4be8-a834-5b677bf05631\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.818356 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-dns-svc\") pod \"8d77efc6-ba78-4be8-a834-5b677bf05631\" (UID: \"8d77efc6-ba78-4be8-a834-5b677bf05631\") " Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.819028 4702 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.832863 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d8b3544-ab03-4842-9feb-fb3164cd3808-config-data" (OuterVolumeSpecName: "config-data") pod "2d8b3544-ab03-4842-9feb-fb3164cd3808" (UID: "2d8b3544-ab03-4842-9feb-fb3164cd3808"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.833883 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d77efc6-ba78-4be8-a834-5b677bf05631-kube-api-access-7pgwq" (OuterVolumeSpecName: "kube-api-access-7pgwq") pod "8d77efc6-ba78-4be8-a834-5b677bf05631" (UID: "8d77efc6-ba78-4be8-a834-5b677bf05631"). InnerVolumeSpecName "kube-api-access-7pgwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.892085 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d8b3544-ab03-4842-9feb-fb3164cd3808-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d8b3544-ab03-4842-9feb-fb3164cd3808" (UID: "2d8b3544-ab03-4842-9feb-fb3164cd3808"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.920104 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2d8b3544-ab03-4842-9feb-fb3164cd3808","Type":"ContainerDied","Data":"610e9314887518fd70e5de6180856c31a2aad426980fd5017693ef3b6f37b510"} Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.920173 4702 scope.go:117] "RemoveContainer" containerID="698d980c72df0c7c92dbff1f5609a6ff7652555258443f6e5ab882b5ae971cda" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.920313 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.920379 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d8b3544-ab03-4842-9feb-fb3164cd3808-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.920415 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pgwq\" (UniqueName: \"kubernetes.io/projected/8d77efc6-ba78-4be8-a834-5b677bf05631-kube-api-access-7pgwq\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.920425 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d8b3544-ab03-4842-9feb-fb3164cd3808-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.957333 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cb45ddb68-kntb7" event={"ID":"f2c41a10-02a8-438d-a25c-18e9caf9e467","Type":"ContainerStarted","Data":"c28f58b16fd4d0ede2deb30f8756514b98f859a958031c1f46703601cf373222"} Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.957403 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cb45ddb68-kntb7" event={"ID":"f2c41a10-02a8-438d-a25c-18e9caf9e467","Type":"ContainerStarted","Data":"45d29372fbac5132dbc482d007a9447553f248f8814ef5dbbbce162dfede3ace"} Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.958089 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5cb45ddb68-kntb7" Dec 03 11:29:59 crc kubenswrapper[4702]: I1203 11:29:59.979376 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8d77efc6-ba78-4be8-a834-5b677bf05631" (UID: "8d77efc6-ba78-4be8-a834-5b677bf05631"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.010605 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8d77efc6-ba78-4be8-a834-5b677bf05631" (UID: "8d77efc6-ba78-4be8-a834-5b677bf05631"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.017529 4702 generic.go:334] "Generic (PLEG): container finished" podID="8d77efc6-ba78-4be8-a834-5b677bf05631" containerID="4d9032f22fd3daa37ec14dd6bae591b397a4052dd4f17dc3b4bf7d0831afe34c" exitCode=0 Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.017626 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" event={"ID":"8d77efc6-ba78-4be8-a834-5b677bf05631","Type":"ContainerDied","Data":"4d9032f22fd3daa37ec14dd6bae591b397a4052dd4f17dc3b4bf7d0831afe34c"} Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.017659 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" event={"ID":"8d77efc6-ba78-4be8-a834-5b677bf05631","Type":"ContainerDied","Data":"5d68dbfd6658e32f389d6c0afd68ba1a49bfa81e1beca09ee07d84df1dacc010"} Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.017724 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-pd52h" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.023022 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.023067 4702 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.029014 4702 generic.go:334] "Generic (PLEG): container finished" podID="0ca96186-ab1f-41b3-a9f4-c89220b757da" containerID="5509dd1193f09ef67b047350644f34200351689344ac2325ad460db94c52caf3" exitCode=0 Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.029082 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jtqd2" event={"ID":"0ca96186-ab1f-41b3-a9f4-c89220b757da","Type":"ContainerDied","Data":"5509dd1193f09ef67b047350644f34200351689344ac2325ad460db94c52caf3"} Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.034683 4702 generic.go:334] "Generic (PLEG): container finished" podID="98ff257a-9df9-4798-a184-af395ffa6b2e" containerID="3a544af6fcdd191c679907823adfee433ea7caa2a4e5aab4cecaad6343016404" exitCode=0 Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.035259 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" event={"ID":"98ff257a-9df9-4798-a184-af395ffa6b2e","Type":"ContainerDied","Data":"3a544af6fcdd191c679907823adfee433ea7caa2a4e5aab4cecaad6343016404"} Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.053326 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-config" (OuterVolumeSpecName: "config") pod "8d77efc6-ba78-4be8-a834-5b677bf05631" (UID: "8d77efc6-ba78-4be8-a834-5b677bf05631"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.054010 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8d77efc6-ba78-4be8-a834-5b677bf05631" (UID: "8d77efc6-ba78-4be8-a834-5b677bf05631"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.057843 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5cb45ddb68-kntb7" podStartSLOduration=4.040404734 podStartE2EDuration="4.040404734s" podCreationTimestamp="2025-12-03 11:29:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:29:59.994635627 +0000 UTC m=+1583.830564091" watchObservedRunningTime="2025-12-03 11:30:00.040404734 +0000 UTC m=+1583.876333198" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.073792 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.126556 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8d77efc6-ba78-4be8-a834-5b677bf05631" (UID: "8d77efc6-ba78-4be8-a834-5b677bf05631"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.134210 4702 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.134274 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.134289 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d77efc6-ba78-4be8-a834-5b677bf05631-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.219806 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412690-t4nmp"] Dec 03 11:30:00 crc kubenswrapper[4702]: E1203 11:30:00.220583 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d8b3544-ab03-4842-9feb-fb3164cd3808" containerName="glance-log" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.220610 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d8b3544-ab03-4842-9feb-fb3164cd3808" containerName="glance-log" Dec 03 11:30:00 crc kubenswrapper[4702]: E1203 11:30:00.220637 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d77efc6-ba78-4be8-a834-5b677bf05631" containerName="init" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.220646 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d77efc6-ba78-4be8-a834-5b677bf05631" containerName="init" Dec 03 11:30:00 crc kubenswrapper[4702]: E1203 11:30:00.220664 4702 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8d77efc6-ba78-4be8-a834-5b677bf05631" containerName="dnsmasq-dns" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.220673 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d77efc6-ba78-4be8-a834-5b677bf05631" containerName="dnsmasq-dns" Dec 03 11:30:00 crc kubenswrapper[4702]: E1203 11:30:00.220685 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d8b3544-ab03-4842-9feb-fb3164cd3808" containerName="glance-httpd" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.220695 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d8b3544-ab03-4842-9feb-fb3164cd3808" containerName="glance-httpd" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.220990 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d77efc6-ba78-4be8-a834-5b677bf05631" containerName="dnsmasq-dns" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.221025 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d8b3544-ab03-4842-9feb-fb3164cd3808" containerName="glance-log" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.221039 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d8b3544-ab03-4842-9feb-fb3164cd3808" containerName="glance-httpd" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.222169 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412690-t4nmp" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.230393 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.231742 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.243944 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412690-t4nmp"] Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.259608 4702 scope.go:117] "RemoveContainer" containerID="3f2a0b14c48cd2331ce1a782d6330fde86ea22cf974d408d8eecd1f70ab57c6b" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.275102 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.291174 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.311496 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 11:30:00 crc kubenswrapper[4702]: W1203 11:30:00.314527 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded571ff0_25b3_4f67_8403_c432700a8f49.slice/crio-7bf781867430dfb84d7891e38390c6c609993ccd908d24071f43b98334163459 WatchSource:0}: Error finding container 7bf781867430dfb84d7891e38390c6c609993ccd908d24071f43b98334163459: Status 404 returned error can't find the container with id 7bf781867430dfb84d7891e38390c6c609993ccd908d24071f43b98334163459 Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.339253 4702 scope.go:117] "RemoveContainer" containerID="4d9032f22fd3daa37ec14dd6bae591b397a4052dd4f17dc3b4bf7d0831afe34c" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.345564 4702 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5794m\" (UniqueName: \"kubernetes.io/projected/8522bf51-2bbc-49dd-b7f8-56abb9ee6cae-kube-api-access-5794m\") pod \"collect-profiles-29412690-t4nmp\" (UID: \"8522bf51-2bbc-49dd-b7f8-56abb9ee6cae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412690-t4nmp" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.345693 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8522bf51-2bbc-49dd-b7f8-56abb9ee6cae-secret-volume\") pod \"collect-profiles-29412690-t4nmp\" (UID: \"8522bf51-2bbc-49dd-b7f8-56abb9ee6cae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412690-t4nmp" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.345922 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8522bf51-2bbc-49dd-b7f8-56abb9ee6cae-config-volume\") pod \"collect-profiles-29412690-t4nmp\" (UID: \"8522bf51-2bbc-49dd-b7f8-56abb9ee6cae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412690-t4nmp" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.360598 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.363190 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.369073 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.369371 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.411166 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.418985 4702 scope.go:117] "RemoveContainer" containerID="cc637863d8e2acbdffb3cda4737aceb231d1e48f68bab9e3a0e6298b68d65376" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.445860 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pd52h"] Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.449057 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5794m\" (UniqueName: \"kubernetes.io/projected/8522bf51-2bbc-49dd-b7f8-56abb9ee6cae-kube-api-access-5794m\") pod \"collect-profiles-29412690-t4nmp\" (UID: \"8522bf51-2bbc-49dd-b7f8-56abb9ee6cae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412690-t4nmp" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.449138 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8522bf51-2bbc-49dd-b7f8-56abb9ee6cae-secret-volume\") pod \"collect-profiles-29412690-t4nmp\" (UID: \"8522bf51-2bbc-49dd-b7f8-56abb9ee6cae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412690-t4nmp" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.449185 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8522bf51-2bbc-49dd-b7f8-56abb9ee6cae-config-volume\") 
pod \"collect-profiles-29412690-t4nmp\" (UID: \"8522bf51-2bbc-49dd-b7f8-56abb9ee6cae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412690-t4nmp" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.460832 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8522bf51-2bbc-49dd-b7f8-56abb9ee6cae-config-volume\") pod \"collect-profiles-29412690-t4nmp\" (UID: \"8522bf51-2bbc-49dd-b7f8-56abb9ee6cae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412690-t4nmp" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.468921 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8522bf51-2bbc-49dd-b7f8-56abb9ee6cae-secret-volume\") pod \"collect-profiles-29412690-t4nmp\" (UID: \"8522bf51-2bbc-49dd-b7f8-56abb9ee6cae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412690-t4nmp" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.476226 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-pd52h"] Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.489497 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5794m\" (UniqueName: \"kubernetes.io/projected/8522bf51-2bbc-49dd-b7f8-56abb9ee6cae-kube-api-access-5794m\") pod \"collect-profiles-29412690-t4nmp\" (UID: \"8522bf51-2bbc-49dd-b7f8-56abb9ee6cae\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412690-t4nmp" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.543378 4702 scope.go:117] "RemoveContainer" containerID="4d9032f22fd3daa37ec14dd6bae591b397a4052dd4f17dc3b4bf7d0831afe34c" Dec 03 11:30:00 crc kubenswrapper[4702]: E1203 11:30:00.543821 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d9032f22fd3daa37ec14dd6bae591b397a4052dd4f17dc3b4bf7d0831afe34c\": container with ID starting with 4d9032f22fd3daa37ec14dd6bae591b397a4052dd4f17dc3b4bf7d0831afe34c not found: ID does not exist" containerID="4d9032f22fd3daa37ec14dd6bae591b397a4052dd4f17dc3b4bf7d0831afe34c" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.543859 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d9032f22fd3daa37ec14dd6bae591b397a4052dd4f17dc3b4bf7d0831afe34c"} err="failed to get container status \"4d9032f22fd3daa37ec14dd6bae591b397a4052dd4f17dc3b4bf7d0831afe34c\": rpc error: code = NotFound desc = could not find container \"4d9032f22fd3daa37ec14dd6bae591b397a4052dd4f17dc3b4bf7d0831afe34c\": container with ID starting with 4d9032f22fd3daa37ec14dd6bae591b397a4052dd4f17dc3b4bf7d0831afe34c not found: ID does not exist" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.543883 4702 scope.go:117] "RemoveContainer" containerID="cc637863d8e2acbdffb3cda4737aceb231d1e48f68bab9e3a0e6298b68d65376" Dec 03 11:30:00 crc kubenswrapper[4702]: E1203 11:30:00.544792 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc637863d8e2acbdffb3cda4737aceb231d1e48f68bab9e3a0e6298b68d65376\": container with ID starting with cc637863d8e2acbdffb3cda4737aceb231d1e48f68bab9e3a0e6298b68d65376 not found: ID does not exist" containerID="cc637863d8e2acbdffb3cda4737aceb231d1e48f68bab9e3a0e6298b68d65376" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.544820 4702 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc637863d8e2acbdffb3cda4737aceb231d1e48f68bab9e3a0e6298b68d65376"} err="failed to get container status \"cc637863d8e2acbdffb3cda4737aceb231d1e48f68bab9e3a0e6298b68d65376\": rpc error: code = NotFound desc = could not find container \"cc637863d8e2acbdffb3cda4737aceb231d1e48f68bab9e3a0e6298b68d65376\": container with ID starting with cc637863d8e2acbdffb3cda4737aceb231d1e48f68bab9e3a0e6298b68d65376 not found: ID does not exist" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.551885 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ad54658-ece0-4731-a07c-3ba8cfb73693-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.551970 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5lmx\" (UniqueName: \"kubernetes.io/projected/3ad54658-ece0-4731-a07c-3ba8cfb73693-kube-api-access-n5lmx\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.552202 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-config-data\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.552321 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ad54658-ece0-4731-a07c-3ba8cfb73693-logs\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.552357 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.552700 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.552816 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.552947 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-scripts\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.567087 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412690-t4nmp" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.655707 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.655838 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-scripts\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.655935 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ad54658-ece0-4731-a07c-3ba8cfb73693-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.655973 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5lmx\" (UniqueName: \"kubernetes.io/projected/3ad54658-ece0-4731-a07c-3ba8cfb73693-kube-api-access-n5lmx\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.656100 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-config-data\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.656136 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ad54658-ece0-4731-a07c-3ba8cfb73693-logs\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.656155 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.656236 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.658174 4702 
operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.662462 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ad54658-ece0-4731-a07c-3ba8cfb73693-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.662497 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ad54658-ece0-4731-a07c-3ba8cfb73693-logs\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.667437 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-scripts\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.669973 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-config-data\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.670615 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.676678 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.678433 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5lmx\" (UniqueName: \"kubernetes.io/projected/3ad54658-ece0-4731-a07c-3ba8cfb73693-kube-api-access-n5lmx\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.719150 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.732313 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.987740 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d8b3544-ab03-4842-9feb-fb3164cd3808" path="/var/lib/kubelet/pods/2d8b3544-ab03-4842-9feb-fb3164cd3808/volumes" Dec 03 11:30:00 crc kubenswrapper[4702]: I1203 11:30:00.988827 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d77efc6-ba78-4be8-a834-5b677bf05631" path="/var/lib/kubelet/pods/8d77efc6-ba78-4be8-a834-5b677bf05631/volumes" Dec 03 11:30:01 crc kubenswrapper[4702]: I1203 11:30:01.081725 4702 generic.go:334] "Generic (PLEG): container finished" podID="c2d4cceb-3a17-486b-8718-897e52ea39cc" containerID="f4c7366329e9fbc8693b7c9cda24f6a5889f0b5aaba9acb1380cd579109f0f7f" exitCode=0 Dec 03 11:30:01 crc kubenswrapper[4702]: I1203 11:30:01.081829 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ds27t" event={"ID":"c2d4cceb-3a17-486b-8718-897e52ea39cc","Type":"ContainerDied","Data":"f4c7366329e9fbc8693b7c9cda24f6a5889f0b5aaba9acb1380cd579109f0f7f"} Dec 03 11:30:01 crc kubenswrapper[4702]: I1203 11:30:01.158430 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" event={"ID":"98ff257a-9df9-4798-a184-af395ffa6b2e","Type":"ContainerStarted","Data":"9b2a7731315c12ead0871d62417b1235434d873d8ace0324d6010a4e111f5729"} Dec 03 11:30:01 crc kubenswrapper[4702]: I1203 11:30:01.158768 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:30:01 crc kubenswrapper[4702]: I1203 11:30:01.185136 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed571ff0-25b3-4f67-8403-c432700a8f49","Type":"ContainerStarted","Data":"7bf781867430dfb84d7891e38390c6c609993ccd908d24071f43b98334163459"} Dec 03 11:30:01 crc kubenswrapper[4702]: I1203 11:30:01.200833 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" podStartSLOduration=5.200809122 podStartE2EDuration="5.200809122s" podCreationTimestamp="2025-12-03 11:29:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:30:01.199063042 +0000 UTC m=+1585.034991526" watchObservedRunningTime="2025-12-03 11:30:01.200809122 +0000 UTC m=+1585.036737586" Dec 03 11:30:01 crc kubenswrapper[4702]: I1203 11:30:01.214979 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cb45ddb68-kntb7" event={"ID":"f2c41a10-02a8-438d-a25c-18e9caf9e467","Type":"ContainerStarted","Data":"ac38f93f54d90523cb6c7873246b6cf76b3022742dfc987aa365094414358170"} Dec 03 11:30:01 crc kubenswrapper[4702]: I1203 11:30:01.259093 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412690-t4nmp"] Dec 03 11:30:01 crc kubenswrapper[4702]: I1203 11:30:01.615624 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 11:30:01 crc kubenswrapper[4702]: W1203 11:30:01.616464 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ad54658_ece0_4731_a07c_3ba8cfb73693.slice/crio-929fe1519dd4b9565952142c3eec16200f3f06878e30d4a85dfcf1ecf8008ca1 WatchSource:0}: Error finding container 
929fe1519dd4b9565952142c3eec16200f3f06878e30d4a85dfcf1ecf8008ca1: Status 404 returned error can't find the container with id 929fe1519dd4b9565952142c3eec16200f3f06878e30d4a85dfcf1ecf8008ca1 Dec 03 11:30:01 crc kubenswrapper[4702]: I1203 11:30:01.835462 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5fdfb45b77-sfz9f"] Dec 03 11:30:01 crc kubenswrapper[4702]: I1203 11:30:01.843839 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:01 crc kubenswrapper[4702]: I1203 11:30:01.847924 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 03 11:30:01 crc kubenswrapper[4702]: I1203 11:30:01.849312 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 03 11:30:01 crc kubenswrapper[4702]: I1203 11:30:01.878031 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5fdfb45b77-sfz9f"] Dec 03 11:30:01 crc kubenswrapper[4702]: I1203 11:30:01.926883 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6889403a-b787-4401-a235-0f8297e5844f-public-tls-certs\") pod \"neutron-5fdfb45b77-sfz9f\" (UID: \"6889403a-b787-4401-a235-0f8297e5844f\") " pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:01 crc kubenswrapper[4702]: I1203 11:30:01.927053 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6889403a-b787-4401-a235-0f8297e5844f-combined-ca-bundle\") pod \"neutron-5fdfb45b77-sfz9f\" (UID: \"6889403a-b787-4401-a235-0f8297e5844f\") " pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:01 crc kubenswrapper[4702]: I1203 11:30:01.927076 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4vck\" (UniqueName: \"kubernetes.io/projected/6889403a-b787-4401-a235-0f8297e5844f-kube-api-access-z4vck\") pod \"neutron-5fdfb45b77-sfz9f\" (UID: \"6889403a-b787-4401-a235-0f8297e5844f\") " pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:01 crc kubenswrapper[4702]: I1203 11:30:01.927104 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6889403a-b787-4401-a235-0f8297e5844f-config\") pod \"neutron-5fdfb45b77-sfz9f\" (UID: \"6889403a-b787-4401-a235-0f8297e5844f\") " pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:01 crc kubenswrapper[4702]: I1203 11:30:01.927121 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6889403a-b787-4401-a235-0f8297e5844f-internal-tls-certs\") pod \"neutron-5fdfb45b77-sfz9f\" (UID: \"6889403a-b787-4401-a235-0f8297e5844f\") " pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:01 crc kubenswrapper[4702]: I1203 11:30:01.927185 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6889403a-b787-4401-a235-0f8297e5844f-httpd-config\") pod \"neutron-5fdfb45b77-sfz9f\" (UID: \"6889403a-b787-4401-a235-0f8297e5844f\") " pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:01 crc kubenswrapper[4702]: I1203 11:30:01.927218 4702 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6889403a-b787-4401-a235-0f8297e5844f-ovndb-tls-certs\") pod \"neutron-5fdfb45b77-sfz9f\" (UID: \"6889403a-b787-4401-a235-0f8297e5844f\") " pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.030713 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6889403a-b787-4401-a235-0f8297e5844f-httpd-config\") pod \"neutron-5fdfb45b77-sfz9f\" (UID: \"6889403a-b787-4401-a235-0f8297e5844f\") " pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.030795 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6889403a-b787-4401-a235-0f8297e5844f-ovndb-tls-certs\") pod \"neutron-5fdfb45b77-sfz9f\" (UID: \"6889403a-b787-4401-a235-0f8297e5844f\") " pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.030893 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6889403a-b787-4401-a235-0f8297e5844f-public-tls-certs\") pod \"neutron-5fdfb45b77-sfz9f\" (UID: \"6889403a-b787-4401-a235-0f8297e5844f\") " pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.031091 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6889403a-b787-4401-a235-0f8297e5844f-combined-ca-bundle\") pod \"neutron-5fdfb45b77-sfz9f\" (UID: \"6889403a-b787-4401-a235-0f8297e5844f\") " pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.031114 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4vck\" (UniqueName: \"kubernetes.io/projected/6889403a-b787-4401-a235-0f8297e5844f-kube-api-access-z4vck\") pod \"neutron-5fdfb45b77-sfz9f\" (UID: \"6889403a-b787-4401-a235-0f8297e5844f\") " pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.031137 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6889403a-b787-4401-a235-0f8297e5844f-config\") pod \"neutron-5fdfb45b77-sfz9f\" (UID: \"6889403a-b787-4401-a235-0f8297e5844f\") " pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.031161 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6889403a-b787-4401-a235-0f8297e5844f-internal-tls-certs\") pod \"neutron-5fdfb45b77-sfz9f\" (UID: \"6889403a-b787-4401-a235-0f8297e5844f\") " pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.056665 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6889403a-b787-4401-a235-0f8297e5844f-internal-tls-certs\") pod \"neutron-5fdfb45b77-sfz9f\" (UID: \"6889403a-b787-4401-a235-0f8297e5844f\") " pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.058323 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/6889403a-b787-4401-a235-0f8297e5844f-config\") pod \"neutron-5fdfb45b77-sfz9f\" (UID: \"6889403a-b787-4401-a235-0f8297e5844f\") " pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.059977 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6889403a-b787-4401-a235-0f8297e5844f-ovndb-tls-certs\") pod \"neutron-5fdfb45b77-sfz9f\" (UID: \"6889403a-b787-4401-a235-0f8297e5844f\") " pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.067174 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6889403a-b787-4401-a235-0f8297e5844f-public-tls-certs\") pod \"neutron-5fdfb45b77-sfz9f\" (UID: \"6889403a-b787-4401-a235-0f8297e5844f\") " pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.068679 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6889403a-b787-4401-a235-0f8297e5844f-combined-ca-bundle\") pod \"neutron-5fdfb45b77-sfz9f\" (UID: \"6889403a-b787-4401-a235-0f8297e5844f\") " pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.070848 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6889403a-b787-4401-a235-0f8297e5844f-httpd-config\") pod \"neutron-5fdfb45b77-sfz9f\" (UID: \"6889403a-b787-4401-a235-0f8297e5844f\") " pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.070383 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4vck\" (UniqueName: \"kubernetes.io/projected/6889403a-b787-4401-a235-0f8297e5844f-kube-api-access-z4vck\") pod \"neutron-5fdfb45b77-sfz9f\" (UID: \"6889403a-b787-4401-a235-0f8297e5844f\") " pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.201191 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jtqd2" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.237162 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca96186-ab1f-41b3-a9f4-c89220b757da-combined-ca-bundle\") pod \"0ca96186-ab1f-41b3-a9f4-c89220b757da\" (UID: \"0ca96186-ab1f-41b3-a9f4-c89220b757da\") " Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.237336 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ca96186-ab1f-41b3-a9f4-c89220b757da-config-data\") pod \"0ca96186-ab1f-41b3-a9f4-c89220b757da\" (UID: \"0ca96186-ab1f-41b3-a9f4-c89220b757da\") " Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.237415 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-244qk\" (UniqueName: \"kubernetes.io/projected/0ca96186-ab1f-41b3-a9f4-c89220b757da-kube-api-access-244qk\") pod \"0ca96186-ab1f-41b3-a9f4-c89220b757da\" (UID: \"0ca96186-ab1f-41b3-a9f4-c89220b757da\") " Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.237466 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ca96186-ab1f-41b3-a9f4-c89220b757da-scripts\") pod \"0ca96186-ab1f-41b3-a9f4-c89220b757da\" (UID: \"0ca96186-ab1f-41b3-a9f4-c89220b757da\") " Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.237632 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ca96186-ab1f-41b3-a9f4-c89220b757da-logs\") pod \"0ca96186-ab1f-41b3-a9f4-c89220b757da\" (UID: \"0ca96186-ab1f-41b3-a9f4-c89220b757da\") " Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.238747 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ca96186-ab1f-41b3-a9f4-c89220b757da-logs" (OuterVolumeSpecName: "logs") pod "0ca96186-ab1f-41b3-a9f4-c89220b757da" (UID: "0ca96186-ab1f-41b3-a9f4-c89220b757da"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.257940 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jtqd2" event={"ID":"0ca96186-ab1f-41b3-a9f4-c89220b757da","Type":"ContainerDied","Data":"62443cf09cb8c391a135e49ef9eb775cfe490225fe5c5263328f7583b8420f66"} Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.258019 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62443cf09cb8c391a135e49ef9eb775cfe490225fe5c5263328f7583b8420f66" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.258132 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jtqd2" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.274276 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3ad54658-ece0-4731-a07c-3ba8cfb73693","Type":"ContainerStarted","Data":"929fe1519dd4b9565952142c3eec16200f3f06878e30d4a85dfcf1ecf8008ca1"} Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.280153 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412690-t4nmp" event={"ID":"8522bf51-2bbc-49dd-b7f8-56abb9ee6cae","Type":"ContainerStarted","Data":"3d80672e2def104fecf62c9b917c0ed3b1c7b1e0a4d5a99d9bfb3f8bf01efa57"} Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.343527 4702 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0ca96186-ab1f-41b3-a9f4-c89220b757da-logs\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.375037 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ca96186-ab1f-41b3-a9f4-c89220b757da-scripts" (OuterVolumeSpecName: "scripts") pod "0ca96186-ab1f-41b3-a9f4-c89220b757da" (UID: "0ca96186-ab1f-41b3-a9f4-c89220b757da"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.380605 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ca96186-ab1f-41b3-a9f4-c89220b757da-kube-api-access-244qk" (OuterVolumeSpecName: "kube-api-access-244qk") pod "0ca96186-ab1f-41b3-a9f4-c89220b757da" (UID: "0ca96186-ab1f-41b3-a9f4-c89220b757da"). InnerVolumeSpecName "kube-api-access-244qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.381570 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ca96186-ab1f-41b3-a9f4-c89220b757da-config-data" (OuterVolumeSpecName: "config-data") pod "0ca96186-ab1f-41b3-a9f4-c89220b757da" (UID: "0ca96186-ab1f-41b3-a9f4-c89220b757da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.382526 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ca96186-ab1f-41b3-a9f4-c89220b757da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ca96186-ab1f-41b3-a9f4-c89220b757da" (UID: "0ca96186-ab1f-41b3-a9f4-c89220b757da"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.446179 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ca96186-ab1f-41b3-a9f4-c89220b757da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.446226 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ca96186-ab1f-41b3-a9f4-c89220b757da-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.446240 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-244qk\" (UniqueName: \"kubernetes.io/projected/0ca96186-ab1f-41b3-a9f4-c89220b757da-kube-api-access-244qk\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.446257 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0ca96186-ab1f-41b3-a9f4-c89220b757da-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:02 crc kubenswrapper[4702]: I1203 11:30:02.455998 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.374470 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-57f75d96b4-7bvsq"] Dec 03 11:30:03 crc kubenswrapper[4702]: E1203 11:30:03.383483 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ca96186-ab1f-41b3-a9f4-c89220b757da" containerName="placement-db-sync" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.383532 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ca96186-ab1f-41b3-a9f4-c89220b757da" containerName="placement-db-sync" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.384130 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ca96186-ab1f-41b3-a9f4-c89220b757da" containerName="placement-db-sync" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.393274 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.408747 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.408797 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.408866 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.409122 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-p7m48" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.409726 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.431002 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-57f75d96b4-7bvsq"] Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.495336 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4785be1d-0e87-49ae-b5de-56bbab3b5eff-config-data\") pod \"placement-57f75d96b4-7bvsq\" (UID: \"4785be1d-0e87-49ae-b5de-56bbab3b5eff\") " pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.495768 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4785be1d-0e87-49ae-b5de-56bbab3b5eff-logs\") pod \"placement-57f75d96b4-7bvsq\" (UID: \"4785be1d-0e87-49ae-b5de-56bbab3b5eff\") " pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.495949 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4785be1d-0e87-49ae-b5de-56bbab3b5eff-public-tls-certs\") pod \"placement-57f75d96b4-7bvsq\" (UID: \"4785be1d-0e87-49ae-b5de-56bbab3b5eff\") " pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.495991 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4785be1d-0e87-49ae-b5de-56bbab3b5eff-internal-tls-certs\") pod \"placement-57f75d96b4-7bvsq\" (UID: \"4785be1d-0e87-49ae-b5de-56bbab3b5eff\") " pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.496054 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4px64\" (UniqueName: \"kubernetes.io/projected/4785be1d-0e87-49ae-b5de-56bbab3b5eff-kube-api-access-4px64\") pod \"placement-57f75d96b4-7bvsq\" (UID: \"4785be1d-0e87-49ae-b5de-56bbab3b5eff\") " pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.496253 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4785be1d-0e87-49ae-b5de-56bbab3b5eff-combined-ca-bundle\") pod \"placement-57f75d96b4-7bvsq\" (UID: \"4785be1d-0e87-49ae-b5de-56bbab3b5eff\") " pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 
11:30:03.496592 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4785be1d-0e87-49ae-b5de-56bbab3b5eff-scripts\") pod \"placement-57f75d96b4-7bvsq\" (UID: \"4785be1d-0e87-49ae-b5de-56bbab3b5eff\") " pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.599682 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4785be1d-0e87-49ae-b5de-56bbab3b5eff-config-data\") pod \"placement-57f75d96b4-7bvsq\" (UID: \"4785be1d-0e87-49ae-b5de-56bbab3b5eff\") " pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.599797 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4785be1d-0e87-49ae-b5de-56bbab3b5eff-logs\") pod \"placement-57f75d96b4-7bvsq\" (UID: \"4785be1d-0e87-49ae-b5de-56bbab3b5eff\") " pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.599912 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4785be1d-0e87-49ae-b5de-56bbab3b5eff-public-tls-certs\") pod \"placement-57f75d96b4-7bvsq\" (UID: \"4785be1d-0e87-49ae-b5de-56bbab3b5eff\") " pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.599958 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4785be1d-0e87-49ae-b5de-56bbab3b5eff-internal-tls-certs\") pod \"placement-57f75d96b4-7bvsq\" (UID: \"4785be1d-0e87-49ae-b5de-56bbab3b5eff\") " pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.600026 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4px64\" (UniqueName: \"kubernetes.io/projected/4785be1d-0e87-49ae-b5de-56bbab3b5eff-kube-api-access-4px64\") pod \"placement-57f75d96b4-7bvsq\" (UID: \"4785be1d-0e87-49ae-b5de-56bbab3b5eff\") " pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.600050 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4785be1d-0e87-49ae-b5de-56bbab3b5eff-combined-ca-bundle\") pod \"placement-57f75d96b4-7bvsq\" (UID: \"4785be1d-0e87-49ae-b5de-56bbab3b5eff\") " pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.600145 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4785be1d-0e87-49ae-b5de-56bbab3b5eff-scripts\") pod \"placement-57f75d96b4-7bvsq\" (UID: \"4785be1d-0e87-49ae-b5de-56bbab3b5eff\") " pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.602895 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4785be1d-0e87-49ae-b5de-56bbab3b5eff-logs\") pod \"placement-57f75d96b4-7bvsq\" (UID: \"4785be1d-0e87-49ae-b5de-56bbab3b5eff\") " pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.609545 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4785be1d-0e87-49ae-b5de-56bbab3b5eff-public-tls-certs\") pod \"placement-57f75d96b4-7bvsq\" (UID: \"4785be1d-0e87-49ae-b5de-56bbab3b5eff\") " pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.610134 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4785be1d-0e87-49ae-b5de-56bbab3b5eff-scripts\") pod \"placement-57f75d96b4-7bvsq\" (UID: \"4785be1d-0e87-49ae-b5de-56bbab3b5eff\") " pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.610317 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4785be1d-0e87-49ae-b5de-56bbab3b5eff-config-data\") pod \"placement-57f75d96b4-7bvsq\" (UID: \"4785be1d-0e87-49ae-b5de-56bbab3b5eff\") " pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.613912 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4785be1d-0e87-49ae-b5de-56bbab3b5eff-combined-ca-bundle\") pod \"placement-57f75d96b4-7bvsq\" (UID: \"4785be1d-0e87-49ae-b5de-56bbab3b5eff\") " pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.617526 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4785be1d-0e87-49ae-b5de-56bbab3b5eff-internal-tls-certs\") pod \"placement-57f75d96b4-7bvsq\" (UID: \"4785be1d-0e87-49ae-b5de-56bbab3b5eff\") " pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.627544 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4px64\" (UniqueName: \"kubernetes.io/projected/4785be1d-0e87-49ae-b5de-56bbab3b5eff-kube-api-access-4px64\") pod \"placement-57f75d96b4-7bvsq\" (UID: \"4785be1d-0e87-49ae-b5de-56bbab3b5eff\") " pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:03 crc kubenswrapper[4702]: I1203 11:30:03.732555 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:05 crc kubenswrapper[4702]: I1203 11:30:05.353563 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-ds27t" event={"ID":"c2d4cceb-3a17-486b-8718-897e52ea39cc","Type":"ContainerDied","Data":"4cf0934be41d2a5d8bd9efca76dda97c0614e1877ecc9a99c33bfccf2362222d"} Dec 03 11:30:05 crc kubenswrapper[4702]: I1203 11:30:05.353941 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cf0934be41d2a5d8bd9efca76dda97c0614e1877ecc9a99c33bfccf2362222d" Dec 03 11:30:05 crc kubenswrapper[4702]: I1203 11:30:05.425895 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-ds27t" Dec 03 11:30:05 crc kubenswrapper[4702]: I1203 11:30:05.513701 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d4cceb-3a17-486b-8718-897e52ea39cc-combined-ca-bundle\") pod \"c2d4cceb-3a17-486b-8718-897e52ea39cc\" (UID: \"c2d4cceb-3a17-486b-8718-897e52ea39cc\") " Dec 03 11:30:05 crc kubenswrapper[4702]: I1203 11:30:05.513968 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x6vj\" (UniqueName: \"kubernetes.io/projected/c2d4cceb-3a17-486b-8718-897e52ea39cc-kube-api-access-4x6vj\") pod \"c2d4cceb-3a17-486b-8718-897e52ea39cc\" (UID: \"c2d4cceb-3a17-486b-8718-897e52ea39cc\") " Dec 03 11:30:05 crc kubenswrapper[4702]: I1203 11:30:05.514064 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c2d4cceb-3a17-486b-8718-897e52ea39cc-db-sync-config-data\") pod \"c2d4cceb-3a17-486b-8718-897e52ea39cc\" (UID: \"c2d4cceb-3a17-486b-8718-897e52ea39cc\") " Dec 03 11:30:05 crc kubenswrapper[4702]: I1203 11:30:05.521062 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d4cceb-3a17-486b-8718-897e52ea39cc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c2d4cceb-3a17-486b-8718-897e52ea39cc" (UID: "c2d4cceb-3a17-486b-8718-897e52ea39cc"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:05 crc kubenswrapper[4702]: I1203 11:30:05.521954 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2d4cceb-3a17-486b-8718-897e52ea39cc-kube-api-access-4x6vj" (OuterVolumeSpecName: "kube-api-access-4x6vj") pod "c2d4cceb-3a17-486b-8718-897e52ea39cc" (UID: "c2d4cceb-3a17-486b-8718-897e52ea39cc"). InnerVolumeSpecName "kube-api-access-4x6vj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:30:05 crc kubenswrapper[4702]: I1203 11:30:05.555023 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d4cceb-3a17-486b-8718-897e52ea39cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2d4cceb-3a17-486b-8718-897e52ea39cc" (UID: "c2d4cceb-3a17-486b-8718-897e52ea39cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:05 crc kubenswrapper[4702]: I1203 11:30:05.619154 4702 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c2d4cceb-3a17-486b-8718-897e52ea39cc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:05 crc kubenswrapper[4702]: I1203 11:30:05.619193 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2d4cceb-3a17-486b-8718-897e52ea39cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:05 crc kubenswrapper[4702]: I1203 11:30:05.619207 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x6vj\" (UniqueName: \"kubernetes.io/projected/c2d4cceb-3a17-486b-8718-897e52ea39cc-kube-api-access-4x6vj\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:05 crc kubenswrapper[4702]: I1203 11:30:05.930693 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-57f75d96b4-7bvsq"] Dec 03 11:30:05 crc kubenswrapper[4702]: W1203 11:30:05.934034 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4785be1d_0e87_49ae_b5de_56bbab3b5eff.slice/crio-dceef9b21fac86592bb0dc16a420c771f8700c86c56bd920d0b8fa12c9f40ddd WatchSource:0}: Error finding container dceef9b21fac86592bb0dc16a420c771f8700c86c56bd920d0b8fa12c9f40ddd: Status 404 returned error can't find the container with id dceef9b21fac86592bb0dc16a420c771f8700c86c56bd920d0b8fa12c9f40ddd Dec 03 11:30:05 crc kubenswrapper[4702]: I1203 11:30:05.968059 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5fdfb45b77-sfz9f"] Dec 03 11:30:06 crc kubenswrapper[4702]: I1203 11:30:06.372642 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57f75d96b4-7bvsq" event={"ID":"4785be1d-0e87-49ae-b5de-56bbab3b5eff","Type":"ContainerStarted","Data":"dceef9b21fac86592bb0dc16a420c771f8700c86c56bd920d0b8fa12c9f40ddd"} Dec 03 11:30:06 crc kubenswrapper[4702]: I1203 11:30:06.375859 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fdfb45b77-sfz9f" event={"ID":"6889403a-b787-4401-a235-0f8297e5844f","Type":"ContainerStarted","Data":"dee27dfe8d84e80507d36c9f6d57ee628229a19fcad79dad0f74027ca3ddabfe"} Dec 03 11:30:06 crc kubenswrapper[4702]: I1203 11:30:06.375955 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-ds27t" Dec 03 11:30:07 crc kubenswrapper[4702]: I1203 11:30:07.483071 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:30:07 crc kubenswrapper[4702]: I1203 11:30:07.634856 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-gdv8p"] Dec 03 11:30:07 crc kubenswrapper[4702]: I1203 11:30:07.635493 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" podUID="8d966268-b412-4449-beb2-56619ab95323" containerName="dnsmasq-dns" containerID="cri-o://67d4be5006dd688938944229d6adc80a94eb5dcd01def0f846cf3fce0f269974" gracePeriod=10 Dec 03 11:30:07 crc kubenswrapper[4702]: I1203 11:30:07.872844 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl"] Dec 03 11:30:07 crc kubenswrapper[4702]: E1203 11:30:07.873533 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d4cceb-3a17-486b-8718-897e52ea39cc" containerName="barbican-db-sync" Dec 03 11:30:07 crc kubenswrapper[4702]: I1203 11:30:07.873551 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d4cceb-3a17-486b-8718-897e52ea39cc" containerName="barbican-db-sync" Dec 03 11:30:07 crc kubenswrapper[4702]: I1203 11:30:07.873959 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2d4cceb-3a17-486b-8718-897e52ea39cc" containerName="barbican-db-sync" Dec 03 11:30:07 crc kubenswrapper[4702]: I1203 11:30:07.875442 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl" Dec 03 11:30:07 crc kubenswrapper[4702]: I1203 11:30:07.881367 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-554bc66f45-ddpl4"] Dec 03 11:30:07 crc kubenswrapper[4702]: I1203 11:30:07.883588 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-554bc66f45-ddpl4" Dec 03 11:30:07 crc kubenswrapper[4702]: I1203 11:30:07.883875 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-l7bgx" Dec 03 11:30:07 crc kubenswrapper[4702]: I1203 11:30:07.883986 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 11:30:07 crc kubenswrapper[4702]: I1203 11:30:07.884093 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 03 11:30:07 crc kubenswrapper[4702]: I1203 11:30:07.894927 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 03 11:30:07 crc kubenswrapper[4702]: I1203 11:30:07.901111 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" podUID="8d966268-b412-4449-beb2-56619ab95323" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.184:5353: connect: connection refused" Dec 03 11:30:07 crc kubenswrapper[4702]: I1203 11:30:07.925950 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl"] Dec 03 11:30:07 crc kubenswrapper[4702]: I1203 11:30:07.951218 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-554bc66f45-ddpl4"] Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.215127 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-9kvwl"] Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.218363 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.222143 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8600900e-a4f2-484b-8e66-be0b81303777-logs\") pod \"barbican-keystone-listener-5b9bd8bd96-9mqbl\" (UID: \"8600900e-a4f2-484b-8e66-be0b81303777\") " pod="openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.222361 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53d4b133-1553-4265-9529-f7237cbe87e6-config-data-custom\") pod \"barbican-worker-554bc66f45-ddpl4\" (UID: \"53d4b133-1553-4265-9529-f7237cbe87e6\") " pod="openstack/barbican-worker-554bc66f45-ddpl4" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.222440 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdvz6\" (UniqueName: \"kubernetes.io/projected/53d4b133-1553-4265-9529-f7237cbe87e6-kube-api-access-zdvz6\") pod \"barbican-worker-554bc66f45-ddpl4\" (UID: \"53d4b133-1553-4265-9529-f7237cbe87e6\") " pod="openstack/barbican-worker-554bc66f45-ddpl4" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.222489 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53d4b133-1553-4265-9529-f7237cbe87e6-combined-ca-bundle\") pod \"barbican-worker-554bc66f45-ddpl4\" (UID: \"53d4b133-1553-4265-9529-f7237cbe87e6\") " pod="openstack/barbican-worker-554bc66f45-ddpl4" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.222644 4702 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53d4b133-1553-4265-9529-f7237cbe87e6-logs\") pod \"barbican-worker-554bc66f45-ddpl4\" (UID: \"53d4b133-1553-4265-9529-f7237cbe87e6\") " pod="openstack/barbican-worker-554bc66f45-ddpl4" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.222729 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8600900e-a4f2-484b-8e66-be0b81303777-config-data-custom\") pod \"barbican-keystone-listener-5b9bd8bd96-9mqbl\" (UID: \"8600900e-a4f2-484b-8e66-be0b81303777\") " pod="openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.222844 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8600900e-a4f2-484b-8e66-be0b81303777-combined-ca-bundle\") pod \"barbican-keystone-listener-5b9bd8bd96-9mqbl\" (UID: \"8600900e-a4f2-484b-8e66-be0b81303777\") " pod="openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.222892 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53d4b133-1553-4265-9529-f7237cbe87e6-config-data\") pod \"barbican-worker-554bc66f45-ddpl4\" (UID: \"53d4b133-1553-4265-9529-f7237cbe87e6\") " pod="openstack/barbican-worker-554bc66f45-ddpl4" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.222942 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8600900e-a4f2-484b-8e66-be0b81303777-config-data\") pod \"barbican-keystone-listener-5b9bd8bd96-9mqbl\" (UID: \"8600900e-a4f2-484b-8e66-be0b81303777\") " pod="openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.223019 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx8vk\" (UniqueName: \"kubernetes.io/projected/8600900e-a4f2-484b-8e66-be0b81303777-kube-api-access-sx8vk\") pod \"barbican-keystone-listener-5b9bd8bd96-9mqbl\" (UID: \"8600900e-a4f2-484b-8e66-be0b81303777\") " pod="openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.243007 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-9kvwl"] Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.378692 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8600900e-a4f2-484b-8e66-be0b81303777-logs\") pod \"barbican-keystone-listener-5b9bd8bd96-9mqbl\" (UID: \"8600900e-a4f2-484b-8e66-be0b81303777\") " pod="openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.378785 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-config\") pod \"dnsmasq-dns-85ff748b95-9kvwl\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.378842 4702 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-dns-svc\") pod \"dnsmasq-dns-85ff748b95-9kvwl\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.378870 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53d4b133-1553-4265-9529-f7237cbe87e6-config-data-custom\") pod \"barbican-worker-554bc66f45-ddpl4\" (UID: \"53d4b133-1553-4265-9529-f7237cbe87e6\") " pod="openstack/barbican-worker-554bc66f45-ddpl4" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.379222 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8600900e-a4f2-484b-8e66-be0b81303777-logs\") pod \"barbican-keystone-listener-5b9bd8bd96-9mqbl\" (UID: \"8600900e-a4f2-484b-8e66-be0b81303777\") " pod="openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.379284 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdvz6\" (UniqueName: \"kubernetes.io/projected/53d4b133-1553-4265-9529-f7237cbe87e6-kube-api-access-zdvz6\") pod \"barbican-worker-554bc66f45-ddpl4\" (UID: \"53d4b133-1553-4265-9529-f7237cbe87e6\") " pod="openstack/barbican-worker-554bc66f45-ddpl4" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.379342 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53d4b133-1553-4265-9529-f7237cbe87e6-combined-ca-bundle\") pod \"barbican-worker-554bc66f45-ddpl4\" (UID: \"53d4b133-1553-4265-9529-f7237cbe87e6\") " pod="openstack/barbican-worker-554bc66f45-ddpl4" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.379381 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53d4b133-1553-4265-9529-f7237cbe87e6-logs\") pod \"barbican-worker-554bc66f45-ddpl4\" (UID: \"53d4b133-1553-4265-9529-f7237cbe87e6\") " pod="openstack/barbican-worker-554bc66f45-ddpl4" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.379403 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-9kvwl\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.379480 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8600900e-a4f2-484b-8e66-be0b81303777-config-data-custom\") pod \"barbican-keystone-listener-5b9bd8bd96-9mqbl\" (UID: \"8600900e-a4f2-484b-8e66-be0b81303777\") " pod="openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.379515 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-9kvwl\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:08 crc 
kubenswrapper[4702]: I1203 11:30:08.379587 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8600900e-a4f2-484b-8e66-be0b81303777-combined-ca-bundle\") pod \"barbican-keystone-listener-5b9bd8bd96-9mqbl\" (UID: \"8600900e-a4f2-484b-8e66-be0b81303777\") " pod="openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.379622 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53d4b133-1553-4265-9529-f7237cbe87e6-config-data\") pod \"barbican-worker-554bc66f45-ddpl4\" (UID: \"53d4b133-1553-4265-9529-f7237cbe87e6\") " pod="openstack/barbican-worker-554bc66f45-ddpl4" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.379656 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8db7\" (UniqueName: \"kubernetes.io/projected/3377c36f-51de-4fca-9fe9-a0763ab93e36-kube-api-access-v8db7\") pod \"dnsmasq-dns-85ff748b95-9kvwl\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.379703 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8600900e-a4f2-484b-8e66-be0b81303777-config-data\") pod \"barbican-keystone-listener-5b9bd8bd96-9mqbl\" (UID: \"8600900e-a4f2-484b-8e66-be0b81303777\") " pod="openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.379786 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx8vk\" (UniqueName: \"kubernetes.io/projected/8600900e-a4f2-484b-8e66-be0b81303777-kube-api-access-sx8vk\") pod \"barbican-keystone-listener-5b9bd8bd96-9mqbl\" (UID: \"8600900e-a4f2-484b-8e66-be0b81303777\") " pod="openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.379813 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-9kvwl\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.380311 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53d4b133-1553-4265-9529-f7237cbe87e6-logs\") pod \"barbican-worker-554bc66f45-ddpl4\" (UID: \"53d4b133-1553-4265-9529-f7237cbe87e6\") " pod="openstack/barbican-worker-554bc66f45-ddpl4" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.389122 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8600900e-a4f2-484b-8e66-be0b81303777-combined-ca-bundle\") pod \"barbican-keystone-listener-5b9bd8bd96-9mqbl\" (UID: \"8600900e-a4f2-484b-8e66-be0b81303777\") " pod="openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.391214 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8600900e-a4f2-484b-8e66-be0b81303777-config-data\") pod \"barbican-keystone-listener-5b9bd8bd96-9mqbl\" (UID: 
\"8600900e-a4f2-484b-8e66-be0b81303777\") " pod="openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.391319 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8600900e-a4f2-484b-8e66-be0b81303777-config-data-custom\") pod \"barbican-keystone-listener-5b9bd8bd96-9mqbl\" (UID: \"8600900e-a4f2-484b-8e66-be0b81303777\") " pod="openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.404020 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53d4b133-1553-4265-9529-f7237cbe87e6-config-data\") pod \"barbican-worker-554bc66f45-ddpl4\" (UID: \"53d4b133-1553-4265-9529-f7237cbe87e6\") " pod="openstack/barbican-worker-554bc66f45-ddpl4" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.405565 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/53d4b133-1553-4265-9529-f7237cbe87e6-config-data-custom\") pod \"barbican-worker-554bc66f45-ddpl4\" (UID: \"53d4b133-1553-4265-9529-f7237cbe87e6\") " pod="openstack/barbican-worker-554bc66f45-ddpl4" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.414638 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53d4b133-1553-4265-9529-f7237cbe87e6-combined-ca-bundle\") pod \"barbican-worker-554bc66f45-ddpl4\" (UID: \"53d4b133-1553-4265-9529-f7237cbe87e6\") " pod="openstack/barbican-worker-554bc66f45-ddpl4" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.431561 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx8vk\" (UniqueName: \"kubernetes.io/projected/8600900e-a4f2-484b-8e66-be0b81303777-kube-api-access-sx8vk\") pod \"barbican-keystone-listener-5b9bd8bd96-9mqbl\" (UID: \"8600900e-a4f2-484b-8e66-be0b81303777\") " pod="openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.484352 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-9kvwl\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.484814 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-config\") pod \"dnsmasq-dns-85ff748b95-9kvwl\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.484867 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-dns-svc\") pod \"dnsmasq-dns-85ff748b95-9kvwl\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.484966 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-9kvwl\" (UID: 
\"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.485026 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-9kvwl\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.485107 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8db7\" (UniqueName: \"kubernetes.io/projected/3377c36f-51de-4fca-9fe9-a0763ab93e36-kube-api-access-v8db7\") pod \"dnsmasq-dns-85ff748b95-9kvwl\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.485990 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-config\") pod \"dnsmasq-dns-85ff748b95-9kvwl\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.486690 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-dns-svc\") pod \"dnsmasq-dns-85ff748b95-9kvwl\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.486866 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-9kvwl\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.487558 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdvz6\" (UniqueName: \"kubernetes.io/projected/53d4b133-1553-4265-9529-f7237cbe87e6-kube-api-access-zdvz6\") pod \"barbican-worker-554bc66f45-ddpl4\" (UID: \"53d4b133-1553-4265-9529-f7237cbe87e6\") " pod="openstack/barbican-worker-554bc66f45-ddpl4" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.488349 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-9kvwl\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.501847 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-9kvwl\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.527808 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5b7b9cbdb8-zhtqs"] Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.528382 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.534407 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fdfb45b77-sfz9f" event={"ID":"6889403a-b787-4401-a235-0f8297e5844f","Type":"ContainerStarted","Data":"5ec40cdbe5f043bca1d84b9a67ec7e00165ca1a7dac645f3c6a43e07fdd13b31"} Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.534455 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8db7\" (UniqueName: \"kubernetes.io/projected/3377c36f-51de-4fca-9fe9-a0763ab93e36-kube-api-access-v8db7\") pod \"dnsmasq-dns-85ff748b95-9kvwl\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.534558 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.539923 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.542582 4702 generic.go:334] "Generic (PLEG): container finished" podID="65cdf028-89d6-4f6e-8aae-bdf5e8264310" containerID="c68dde3062d8980ea81d1539226a9d5aa4e10038b93c8095bc55487691b6bf57" exitCode=0 Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.542680 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r7v59" event={"ID":"65cdf028-89d6-4f6e-8aae-bdf5e8264310","Type":"ContainerDied","Data":"c68dde3062d8980ea81d1539226a9d5aa4e10038b93c8095bc55487691b6bf57"} Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.549314 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3ad54658-ece0-4731-a07c-3ba8cfb73693","Type":"ContainerStarted","Data":"d163f2220739998c6faa435f3851d3ed49b8280d12a98ffc9374079f638ef985"} Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.549373 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5b7b9cbdb8-zhtqs"] Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.559185 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed571ff0-25b3-4f67-8403-c432700a8f49","Type":"ContainerStarted","Data":"8c8d28104afb66581b5417efcdb44cbb8650878e8ff071009afaab0a819f8a0a"} Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.570782 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-554bc66f45-ddpl4" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.570950 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412690-t4nmp" event={"ID":"8522bf51-2bbc-49dd-b7f8-56abb9ee6cae","Type":"ContainerStarted","Data":"fb4d9241855e57eff87ccc34d7c9e3440d1e9efbf264701eece7a2b3b76f01d1"} Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.575861 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57f75d96b4-7bvsq" event={"ID":"4785be1d-0e87-49ae-b5de-56bbab3b5eff","Type":"ContainerStarted","Data":"552514ea3c9f2a41eccd9f8cc93fae9bbc11bcb24385594cd308fce7342459f3"} Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.677661 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.689356 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c82952f3-6193-4d1d-8d66-c1e9624cf937-config-data-custom\") pod \"barbican-api-5b7b9cbdb8-zhtqs\" (UID: \"c82952f3-6193-4d1d-8d66-c1e9624cf937\") " pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.689476 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c82952f3-6193-4d1d-8d66-c1e9624cf937-combined-ca-bundle\") pod \"barbican-api-5b7b9cbdb8-zhtqs\" (UID: \"c82952f3-6193-4d1d-8d66-c1e9624cf937\") " pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.689583 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c82952f3-6193-4d1d-8d66-c1e9624cf937-config-data\") pod \"barbican-api-5b7b9cbdb8-zhtqs\" (UID: \"c82952f3-6193-4d1d-8d66-c1e9624cf937\") " pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.689650 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb4xr\" (UniqueName: \"kubernetes.io/projected/c82952f3-6193-4d1d-8d66-c1e9624cf937-kube-api-access-vb4xr\") pod \"barbican-api-5b7b9cbdb8-zhtqs\" (UID: \"c82952f3-6193-4d1d-8d66-c1e9624cf937\") " pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.689712 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c82952f3-6193-4d1d-8d66-c1e9624cf937-logs\") pod \"barbican-api-5b7b9cbdb8-zhtqs\" (UID: \"c82952f3-6193-4d1d-8d66-c1e9624cf937\") " pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.800639 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c82952f3-6193-4d1d-8d66-c1e9624cf937-logs\") pod \"barbican-api-5b7b9cbdb8-zhtqs\" (UID: \"c82952f3-6193-4d1d-8d66-c1e9624cf937\") " pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.800993 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c82952f3-6193-4d1d-8d66-c1e9624cf937-config-data-custom\") pod \"barbican-api-5b7b9cbdb8-zhtqs\" (UID: \"c82952f3-6193-4d1d-8d66-c1e9624cf937\") " pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.801150 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c82952f3-6193-4d1d-8d66-c1e9624cf937-combined-ca-bundle\") pod \"barbican-api-5b7b9cbdb8-zhtqs\" (UID: \"c82952f3-6193-4d1d-8d66-c1e9624cf937\") " pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.801254 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c82952f3-6193-4d1d-8d66-c1e9624cf937-config-data\") pod 
\"barbican-api-5b7b9cbdb8-zhtqs\" (UID: \"c82952f3-6193-4d1d-8d66-c1e9624cf937\") " pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.801364 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb4xr\" (UniqueName: \"kubernetes.io/projected/c82952f3-6193-4d1d-8d66-c1e9624cf937-kube-api-access-vb4xr\") pod \"barbican-api-5b7b9cbdb8-zhtqs\" (UID: \"c82952f3-6193-4d1d-8d66-c1e9624cf937\") " pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.811824 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c82952f3-6193-4d1d-8d66-c1e9624cf937-logs\") pod \"barbican-api-5b7b9cbdb8-zhtqs\" (UID: \"c82952f3-6193-4d1d-8d66-c1e9624cf937\") " pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.830188 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c82952f3-6193-4d1d-8d66-c1e9624cf937-config-data-custom\") pod \"barbican-api-5b7b9cbdb8-zhtqs\" (UID: \"c82952f3-6193-4d1d-8d66-c1e9624cf937\") " pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.831578 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb4xr\" (UniqueName: \"kubernetes.io/projected/c82952f3-6193-4d1d-8d66-c1e9624cf937-kube-api-access-vb4xr\") pod \"barbican-api-5b7b9cbdb8-zhtqs\" (UID: \"c82952f3-6193-4d1d-8d66-c1e9624cf937\") " pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.838419 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c82952f3-6193-4d1d-8d66-c1e9624cf937-combined-ca-bundle\") pod \"barbican-api-5b7b9cbdb8-zhtqs\" (UID: \"c82952f3-6193-4d1d-8d66-c1e9624cf937\") " pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:08 crc kubenswrapper[4702]: I1203 11:30:08.845081 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c82952f3-6193-4d1d-8d66-c1e9624cf937-config-data\") pod \"barbican-api-5b7b9cbdb8-zhtqs\" (UID: \"c82952f3-6193-4d1d-8d66-c1e9624cf937\") " pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:09 crc kubenswrapper[4702]: I1203 11:30:09.155404 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:09 crc kubenswrapper[4702]: I1203 11:30:09.260323 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl"] Dec 03 11:30:09 crc kubenswrapper[4702]: I1203 11:30:09.463775 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-554bc66f45-ddpl4"] Dec 03 11:30:09 crc kubenswrapper[4702]: I1203 11:30:09.625980 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-9kvwl"] Dec 03 11:30:09 crc kubenswrapper[4702]: I1203 11:30:09.661470 4702 generic.go:334] "Generic (PLEG): container finished" podID="8d966268-b412-4449-beb2-56619ab95323" containerID="67d4be5006dd688938944229d6adc80a94eb5dcd01def0f846cf3fce0f269974" exitCode=0 Dec 03 11:30:09 crc kubenswrapper[4702]: I1203 11:30:09.661640 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" event={"ID":"8d966268-b412-4449-beb2-56619ab95323","Type":"ContainerDied","Data":"67d4be5006dd688938944229d6adc80a94eb5dcd01def0f846cf3fce0f269974"} Dec 03 11:30:09 crc kubenswrapper[4702]: I1203 11:30:09.670261 4702 generic.go:334] "Generic (PLEG): container finished" podID="8522bf51-2bbc-49dd-b7f8-56abb9ee6cae" containerID="fb4d9241855e57eff87ccc34d7c9e3440d1e9efbf264701eece7a2b3b76f01d1" exitCode=0 Dec 03 11:30:09 crc kubenswrapper[4702]: I1203 11:30:09.670556 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412690-t4nmp" event={"ID":"8522bf51-2bbc-49dd-b7f8-56abb9ee6cae","Type":"ContainerDied","Data":"fb4d9241855e57eff87ccc34d7c9e3440d1e9efbf264701eece7a2b3b76f01d1"} Dec 03 11:30:10 crc kubenswrapper[4702]: I1203 11:30:10.690954 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3ad54658-ece0-4731-a07c-3ba8cfb73693","Type":"ContainerStarted","Data":"6f19a085bbe53ccb297c996ab5745a0fe8db1aa865b3b0612be0c34715db636e"} Dec 03 11:30:10 crc kubenswrapper[4702]: I1203 11:30:10.695177 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed571ff0-25b3-4f67-8403-c432700a8f49","Type":"ContainerStarted","Data":"cfc85875c4a77cdd1ec4a77059a5aba6ea95607de35294a53136260dc568876a"} Dec 03 11:30:10 crc kubenswrapper[4702]: I1203 11:30:10.720984 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=10.720953562 podStartE2EDuration="10.720953562s" podCreationTimestamp="2025-12-03 11:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:30:10.712017869 +0000 UTC m=+1594.547946353" watchObservedRunningTime="2025-12-03 11:30:10.720953562 +0000 UTC m=+1594.556882016" Dec 03 11:30:10 crc kubenswrapper[4702]: I1203 11:30:10.733548 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 11:30:10 crc kubenswrapper[4702]: I1203 11:30:10.733834 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 11:30:10 crc kubenswrapper[4702]: I1203 11:30:10.754679 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=12.754654996 
podStartE2EDuration="12.754654996s" podCreationTimestamp="2025-12-03 11:29:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:30:10.745097816 +0000 UTC m=+1594.581026380" watchObservedRunningTime="2025-12-03 11:30:10.754654996 +0000 UTC m=+1594.590583460" Dec 03 11:30:10 crc kubenswrapper[4702]: I1203 11:30:10.797800 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 11:30:10 crc kubenswrapper[4702]: I1203 11:30:10.815864 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 11:30:11 crc kubenswrapper[4702]: I1203 11:30:11.715019 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 11:30:11 crc kubenswrapper[4702]: I1203 11:30:11.715335 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 11:30:11 crc kubenswrapper[4702]: I1203 11:30:11.905308 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-76f68f8c78-w8n8j"] Dec 03 11:30:11 crc kubenswrapper[4702]: I1203 11:30:11.909208 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:11 crc kubenswrapper[4702]: I1203 11:30:11.916709 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 03 11:30:11 crc kubenswrapper[4702]: I1203 11:30:11.916855 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 03 11:30:11 crc kubenswrapper[4702]: I1203 11:30:11.940230 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-76f68f8c78-w8n8j"] Dec 03 11:30:11 crc kubenswrapper[4702]: W1203 11:30:11.964026 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3377c36f_51de_4fca_9fe9_a0763ab93e36.slice/crio-f1a33d317ecf161c450206573e2c42216d321109c4a4ab58a6948ad5d00808d8 WatchSource:0}: Error finding container f1a33d317ecf161c450206573e2c42216d321109c4a4ab58a6948ad5d00808d8: Status 404 returned error can't find the container with id f1a33d317ecf161c450206573e2c42216d321109c4a4ab58a6948ad5d00808d8 Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.021979 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f29f89-b01b-4656-ba86-a2f731d0c1e0-public-tls-certs\") pod \"barbican-api-76f68f8c78-w8n8j\" (UID: \"36f29f89-b01b-4656-ba86-a2f731d0c1e0\") " pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.022726 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f29f89-b01b-4656-ba86-a2f731d0c1e0-internal-tls-certs\") pod \"barbican-api-76f68f8c78-w8n8j\" (UID: \"36f29f89-b01b-4656-ba86-a2f731d0c1e0\") " pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.022952 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/36f29f89-b01b-4656-ba86-a2f731d0c1e0-combined-ca-bundle\") pod \"barbican-api-76f68f8c78-w8n8j\" (UID: \"36f29f89-b01b-4656-ba86-a2f731d0c1e0\") " pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.023036 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36f29f89-b01b-4656-ba86-a2f731d0c1e0-logs\") pod \"barbican-api-76f68f8c78-w8n8j\" (UID: \"36f29f89-b01b-4656-ba86-a2f731d0c1e0\") " pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.023108 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4nq9\" (UniqueName: \"kubernetes.io/projected/36f29f89-b01b-4656-ba86-a2f731d0c1e0-kube-api-access-z4nq9\") pod \"barbican-api-76f68f8c78-w8n8j\" (UID: \"36f29f89-b01b-4656-ba86-a2f731d0c1e0\") " pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.023800 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36f29f89-b01b-4656-ba86-a2f731d0c1e0-config-data\") pod \"barbican-api-76f68f8c78-w8n8j\" (UID: \"36f29f89-b01b-4656-ba86-a2f731d0c1e0\") " pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.023981 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36f29f89-b01b-4656-ba86-a2f731d0c1e0-config-data-custom\") pod \"barbican-api-76f68f8c78-w8n8j\" (UID: \"36f29f89-b01b-4656-ba86-a2f731d0c1e0\") " pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.126205 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36f29f89-b01b-4656-ba86-a2f731d0c1e0-combined-ca-bundle\") pod \"barbican-api-76f68f8c78-w8n8j\" (UID: \"36f29f89-b01b-4656-ba86-a2f731d0c1e0\") " pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.126258 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36f29f89-b01b-4656-ba86-a2f731d0c1e0-logs\") pod \"barbican-api-76f68f8c78-w8n8j\" (UID: \"36f29f89-b01b-4656-ba86-a2f731d0c1e0\") " pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.126277 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4nq9\" (UniqueName: \"kubernetes.io/projected/36f29f89-b01b-4656-ba86-a2f731d0c1e0-kube-api-access-z4nq9\") pod \"barbican-api-76f68f8c78-w8n8j\" (UID: \"36f29f89-b01b-4656-ba86-a2f731d0c1e0\") " pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.126676 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36f29f89-b01b-4656-ba86-a2f731d0c1e0-config-data\") pod \"barbican-api-76f68f8c78-w8n8j\" (UID: \"36f29f89-b01b-4656-ba86-a2f731d0c1e0\") " pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.126722 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36f29f89-b01b-4656-ba86-a2f731d0c1e0-config-data-custom\") pod \"barbican-api-76f68f8c78-w8n8j\" (UID: \"36f29f89-b01b-4656-ba86-a2f731d0c1e0\") " pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.126768 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f29f89-b01b-4656-ba86-a2f731d0c1e0-public-tls-certs\") pod \"barbican-api-76f68f8c78-w8n8j\" (UID: \"36f29f89-b01b-4656-ba86-a2f731d0c1e0\") " pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.126806 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f29f89-b01b-4656-ba86-a2f731d0c1e0-internal-tls-certs\") pod \"barbican-api-76f68f8c78-w8n8j\" (UID: \"36f29f89-b01b-4656-ba86-a2f731d0c1e0\") " pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.132135 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f29f89-b01b-4656-ba86-a2f731d0c1e0-internal-tls-certs\") pod \"barbican-api-76f68f8c78-w8n8j\" (UID: \"36f29f89-b01b-4656-ba86-a2f731d0c1e0\") " pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.138581 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36f29f89-b01b-4656-ba86-a2f731d0c1e0-combined-ca-bundle\") pod \"barbican-api-76f68f8c78-w8n8j\" (UID: \"36f29f89-b01b-4656-ba86-a2f731d0c1e0\") " pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.138652 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36f29f89-b01b-4656-ba86-a2f731d0c1e0-config-data\") pod \"barbican-api-76f68f8c78-w8n8j\" (UID: \"36f29f89-b01b-4656-ba86-a2f731d0c1e0\") " pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.138920 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/36f29f89-b01b-4656-ba86-a2f731d0c1e0-logs\") pod \"barbican-api-76f68f8c78-w8n8j\" (UID: \"36f29f89-b01b-4656-ba86-a2f731d0c1e0\") " pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.144159 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f29f89-b01b-4656-ba86-a2f731d0c1e0-public-tls-certs\") pod \"barbican-api-76f68f8c78-w8n8j\" (UID: \"36f29f89-b01b-4656-ba86-a2f731d0c1e0\") " pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.144573 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36f29f89-b01b-4656-ba86-a2f731d0c1e0-config-data-custom\") pod \"barbican-api-76f68f8c78-w8n8j\" (UID: \"36f29f89-b01b-4656-ba86-a2f731d0c1e0\") " pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.170571 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4nq9\" (UniqueName: 
\"kubernetes.io/projected/36f29f89-b01b-4656-ba86-a2f731d0c1e0-kube-api-access-z4nq9\") pod \"barbican-api-76f68f8c78-w8n8j\" (UID: \"36f29f89-b01b-4656-ba86-a2f731d0c1e0\") " pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.278402 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.292942 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.312340 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r7v59" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.317064 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412690-t4nmp" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.452246 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-dns-svc\") pod \"8d966268-b412-4449-beb2-56619ab95323\" (UID: \"8d966268-b412-4449-beb2-56619ab95323\") " Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.452353 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-combined-ca-bundle\") pod \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.452377 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-fernet-keys\") pod \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.452412 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8522bf51-2bbc-49dd-b7f8-56abb9ee6cae-config-volume\") pod \"8522bf51-2bbc-49dd-b7f8-56abb9ee6cae\" (UID: \"8522bf51-2bbc-49dd-b7f8-56abb9ee6cae\") " Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.452435 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5794m\" (UniqueName: \"kubernetes.io/projected/8522bf51-2bbc-49dd-b7f8-56abb9ee6cae-kube-api-access-5794m\") pod \"8522bf51-2bbc-49dd-b7f8-56abb9ee6cae\" (UID: \"8522bf51-2bbc-49dd-b7f8-56abb9ee6cae\") " Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.452464 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-scripts\") pod \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.452614 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8522bf51-2bbc-49dd-b7f8-56abb9ee6cae-secret-volume\") pod \"8522bf51-2bbc-49dd-b7f8-56abb9ee6cae\" (UID: \"8522bf51-2bbc-49dd-b7f8-56abb9ee6cae\") " Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.452663 4702 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-v7p7g\" (UniqueName: \"kubernetes.io/projected/65cdf028-89d6-4f6e-8aae-bdf5e8264310-kube-api-access-v7p7g\") pod \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.452682 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5m9x\" (UniqueName: \"kubernetes.io/projected/8d966268-b412-4449-beb2-56619ab95323-kube-api-access-m5m9x\") pod \"8d966268-b412-4449-beb2-56619ab95323\" (UID: \"8d966268-b412-4449-beb2-56619ab95323\") " Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.452726 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-config\") pod \"8d966268-b412-4449-beb2-56619ab95323\" (UID: \"8d966268-b412-4449-beb2-56619ab95323\") " Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.452805 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-ovsdbserver-nb\") pod \"8d966268-b412-4449-beb2-56619ab95323\" (UID: \"8d966268-b412-4449-beb2-56619ab95323\") " Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.452867 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-credential-keys\") pod \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.452898 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-ovsdbserver-sb\") pod \"8d966268-b412-4449-beb2-56619ab95323\" (UID: \"8d966268-b412-4449-beb2-56619ab95323\") " Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.452915 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-dns-swift-storage-0\") pod \"8d966268-b412-4449-beb2-56619ab95323\" (UID: \"8d966268-b412-4449-beb2-56619ab95323\") " Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.452968 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-config-data\") pod \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\" (UID: \"65cdf028-89d6-4f6e-8aae-bdf5e8264310\") " Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.453447 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8522bf51-2bbc-49dd-b7f8-56abb9ee6cae-config-volume" (OuterVolumeSpecName: "config-volume") pod "8522bf51-2bbc-49dd-b7f8-56abb9ee6cae" (UID: "8522bf51-2bbc-49dd-b7f8-56abb9ee6cae"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.453672 4702 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8522bf51-2bbc-49dd-b7f8-56abb9ee6cae-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.480082 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d966268-b412-4449-beb2-56619ab95323-kube-api-access-m5m9x" (OuterVolumeSpecName: "kube-api-access-m5m9x") pod "8d966268-b412-4449-beb2-56619ab95323" (UID: "8d966268-b412-4449-beb2-56619ab95323"). InnerVolumeSpecName "kube-api-access-m5m9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.499956 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "65cdf028-89d6-4f6e-8aae-bdf5e8264310" (UID: "65cdf028-89d6-4f6e-8aae-bdf5e8264310"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.500169 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "65cdf028-89d6-4f6e-8aae-bdf5e8264310" (UID: "65cdf028-89d6-4f6e-8aae-bdf5e8264310"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.514064 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8522bf51-2bbc-49dd-b7f8-56abb9ee6cae-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8522bf51-2bbc-49dd-b7f8-56abb9ee6cae" (UID: "8522bf51-2bbc-49dd-b7f8-56abb9ee6cae"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.538810 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8522bf51-2bbc-49dd-b7f8-56abb9ee6cae-kube-api-access-5794m" (OuterVolumeSpecName: "kube-api-access-5794m") pod "8522bf51-2bbc-49dd-b7f8-56abb9ee6cae" (UID: "8522bf51-2bbc-49dd-b7f8-56abb9ee6cae"). InnerVolumeSpecName "kube-api-access-5794m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.543821 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65cdf028-89d6-4f6e-8aae-bdf5e8264310" (UID: "65cdf028-89d6-4f6e-8aae-bdf5e8264310"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.547899 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-scripts" (OuterVolumeSpecName: "scripts") pod "65cdf028-89d6-4f6e-8aae-bdf5e8264310" (UID: "65cdf028-89d6-4f6e-8aae-bdf5e8264310"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.548088 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65cdf028-89d6-4f6e-8aae-bdf5e8264310-kube-api-access-v7p7g" (OuterVolumeSpecName: "kube-api-access-v7p7g") pod "65cdf028-89d6-4f6e-8aae-bdf5e8264310" (UID: "65cdf028-89d6-4f6e-8aae-bdf5e8264310"). InnerVolumeSpecName "kube-api-access-v7p7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.559919 4702 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8522bf51-2bbc-49dd-b7f8-56abb9ee6cae-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.559976 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7p7g\" (UniqueName: \"kubernetes.io/projected/65cdf028-89d6-4f6e-8aae-bdf5e8264310-kube-api-access-v7p7g\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.559987 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5m9x\" (UniqueName: \"kubernetes.io/projected/8d966268-b412-4449-beb2-56619ab95323-kube-api-access-m5m9x\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.559995 4702 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.560008 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.560020 4702 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.560027 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5794m\" (UniqueName: \"kubernetes.io/projected/8522bf51-2bbc-49dd-b7f8-56abb9ee6cae-kube-api-access-5794m\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.560037 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.594193 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8d966268-b412-4449-beb2-56619ab95323" (UID: "8d966268-b412-4449-beb2-56619ab95323"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.599977 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-config-data" (OuterVolumeSpecName: "config-data") pod "65cdf028-89d6-4f6e-8aae-bdf5e8264310" (UID: "65cdf028-89d6-4f6e-8aae-bdf5e8264310"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.610665 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8d966268-b412-4449-beb2-56619ab95323" (UID: "8d966268-b412-4449-beb2-56619ab95323"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.610710 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-config" (OuterVolumeSpecName: "config") pod "8d966268-b412-4449-beb2-56619ab95323" (UID: "8d966268-b412-4449-beb2-56619ab95323"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.617279 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8d966268-b412-4449-beb2-56619ab95323" (UID: "8d966268-b412-4449-beb2-56619ab95323"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.617897 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5b7b9cbdb8-zhtqs"] Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.641536 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8d966268-b412-4449-beb2-56619ab95323" (UID: "8d966268-b412-4449-beb2-56619ab95323"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.667102 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.667160 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.667176 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.667189 4702 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.667202 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65cdf028-89d6-4f6e-8aae-bdf5e8264310-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.667214 4702 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8d966268-b412-4449-beb2-56619ab95323-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.730621 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl" event={"ID":"8600900e-a4f2-484b-8e66-be0b81303777","Type":"ContainerStarted","Data":"35657aeb3a5cb28e4302296f2b4dacf192ebfc387c6327df6f93cefdc21ce503"} Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.736372 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" event={"ID":"8d966268-b412-4449-beb2-56619ab95323","Type":"ContainerDied","Data":"36201b6c0968fa32262f109e52005a1794c3a6e0a7b68773cb58db94a1b64e18"} Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.736440 4702 scope.go:117] "RemoveContainer" containerID="67d4be5006dd688938944229d6adc80a94eb5dcd01def0f846cf3fce0f269974" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.736874 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-gdv8p" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.744364 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412690-t4nmp" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.744396 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412690-t4nmp" event={"ID":"8522bf51-2bbc-49dd-b7f8-56abb9ee6cae","Type":"ContainerDied","Data":"3d80672e2def104fecf62c9b917c0ed3b1c7b1e0a4d5a99d9bfb3f8bf01efa57"} Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.744456 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d80672e2def104fecf62c9b917c0ed3b1c7b1e0a4d5a99d9bfb3f8bf01efa57" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.747194 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" event={"ID":"3377c36f-51de-4fca-9fe9-a0763ab93e36","Type":"ContainerStarted","Data":"f1a33d317ecf161c450206573e2c42216d321109c4a4ab58a6948ad5d00808d8"} Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.751392 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r7v59" event={"ID":"65cdf028-89d6-4f6e-8aae-bdf5e8264310","Type":"ContainerDied","Data":"b5d201e7dc8fa636053079934b99bf7c8516891499fc59b7d84422128af1c119"} Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.751436 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5d201e7dc8fa636053079934b99bf7c8516891499fc59b7d84422128af1c119" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.751510 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r7v59" Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.757991 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-554bc66f45-ddpl4" event={"ID":"53d4b133-1553-4265-9529-f7237cbe87e6","Type":"ContainerStarted","Data":"ccaf4f6819c791d250fddce8b5269613d84a9c1523d6bc3de040f259c544a65f"} Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.819346 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-gdv8p"] Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.835656 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-gdv8p"] Dec 03 11:30:12 crc kubenswrapper[4702]: I1203 11:30:12.990071 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d966268-b412-4449-beb2-56619ab95323" path="/var/lib/kubelet/pods/8d966268-b412-4449-beb2-56619ab95323/volumes" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.489807 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-694848776d-v64gd"] Dec 03 11:30:13 crc kubenswrapper[4702]: E1203 11:30:13.490704 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d966268-b412-4449-beb2-56619ab95323" containerName="init" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.490728 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d966268-b412-4449-beb2-56619ab95323" containerName="init" Dec 03 11:30:13 crc kubenswrapper[4702]: E1203 11:30:13.490772 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8522bf51-2bbc-49dd-b7f8-56abb9ee6cae" containerName="collect-profiles" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.490782 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="8522bf51-2bbc-49dd-b7f8-56abb9ee6cae" containerName="collect-profiles" Dec 03 11:30:13 crc 
kubenswrapper[4702]: E1203 11:30:13.490805 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d966268-b412-4449-beb2-56619ab95323" containerName="dnsmasq-dns" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.490811 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d966268-b412-4449-beb2-56619ab95323" containerName="dnsmasq-dns" Dec 03 11:30:13 crc kubenswrapper[4702]: E1203 11:30:13.490826 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65cdf028-89d6-4f6e-8aae-bdf5e8264310" containerName="keystone-bootstrap" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.490833 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="65cdf028-89d6-4f6e-8aae-bdf5e8264310" containerName="keystone-bootstrap" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.491038 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="65cdf028-89d6-4f6e-8aae-bdf5e8264310" containerName="keystone-bootstrap" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.491063 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d966268-b412-4449-beb2-56619ab95323" containerName="dnsmasq-dns" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.491074 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="8522bf51-2bbc-49dd-b7f8-56abb9ee6cae" containerName="collect-profiles" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.492078 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-694848776d-v64gd" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.497258 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.497490 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.497707 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.497974 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jnt45" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.498061 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.498192 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.535844 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-694848776d-v64gd"] Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.542823 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-public-tls-certs\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.542894 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-internal-tls-certs\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd" Dec 03 11:30:13 crc 
kubenswrapper[4702]: I1203 11:30:13.542947 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-fernet-keys\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.543038 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-config-data\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.543063 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-credential-keys\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.543099 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-combined-ca-bundle\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.543269 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-scripts\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.543290 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n4zm\" (UniqueName: \"kubernetes.io/projected/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-kube-api-access-5n4zm\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.647060 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-public-tls-certs\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.647169 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-internal-tls-certs\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.647211 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-fernet-keys\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd" Dec 03 11:30:13 crc 
Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.647311 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-config-data\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd"
Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.647334 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-credential-keys\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd"
Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.647375 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-combined-ca-bundle\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd"
Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.647609 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-scripts\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd"
Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.647633 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n4zm\" (UniqueName: \"kubernetes.io/projected/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-kube-api-access-5n4zm\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd"
Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.653908 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-internal-tls-certs\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd"
Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.654259 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-fernet-keys\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd"
Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.655207 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-scripts\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd"
Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.655898 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-credential-keys\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd"
\"kubernetes.io/secret/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-public-tls-certs\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.661577 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-config-data\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.662247 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-combined-ca-bundle\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.669293 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n4zm\" (UniqueName: \"kubernetes.io/projected/b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd-kube-api-access-5n4zm\") pod \"keystone-694848776d-v64gd\" (UID: \"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd\") " pod="openstack/keystone-694848776d-v64gd" Dec 03 11:30:13 crc kubenswrapper[4702]: I1203 11:30:13.818594 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-694848776d-v64gd" Dec 03 11:30:15 crc kubenswrapper[4702]: I1203 11:30:15.526957 4702 scope.go:117] "RemoveContainer" containerID="9b9781a1a444d17c459911255bbf537ac846ffe781dbb9a9784dd36cdab1fd7a" Dec 03 11:30:15 crc kubenswrapper[4702]: I1203 11:30:15.835526 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" event={"ID":"c82952f3-6193-4d1d-8d66-c1e9624cf937","Type":"ContainerStarted","Data":"7f7d340fe5d6dbbb79d628b34ad53e1f48d4d8310f5d0469dad57af9f85172d4"} Dec 03 11:30:16 crc kubenswrapper[4702]: I1203 11:30:16.181388 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-76f68f8c78-w8n8j"] Dec 03 11:30:16 crc kubenswrapper[4702]: I1203 11:30:16.401032 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-694848776d-v64gd"] Dec 03 11:30:16 crc kubenswrapper[4702]: W1203 11:30:16.446038 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7119b1c_f3d7_44d8_9c5a_c8e8c4cfdfcd.slice/crio-1a9b91a718a796d2d0ab25c6d9b0e2aaf5f22ede6fad6ceab3354829d8e37dbc WatchSource:0}: Error finding container 1a9b91a718a796d2d0ab25c6d9b0e2aaf5f22ede6fad6ceab3354829d8e37dbc: Status 404 returned error can't find the container with id 1a9b91a718a796d2d0ab25c6d9b0e2aaf5f22ede6fad6ceab3354829d8e37dbc Dec 03 11:30:16 crc kubenswrapper[4702]: I1203 11:30:16.887081 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fdfb45b77-sfz9f" event={"ID":"6889403a-b787-4401-a235-0f8297e5844f","Type":"ContainerStarted","Data":"c47134bb15ae0ba42b1e386ffbbd673c1a624cac3c9d1b8ae58b28d5a4fb318e"} Dec 03 11:30:16 crc kubenswrapper[4702]: I1203 11:30:16.887627 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:16 crc kubenswrapper[4702]: I1203 11:30:16.903524 4702 generic.go:334] "Generic (PLEG): container finished" podID="3377c36f-51de-4fca-9fe9-a0763ab93e36" 
containerID="865162d8ade09fdf3815c744ee0430dc924ef59c711fee789ce094c3cd6d4bf3" exitCode=0 Dec 03 11:30:16 crc kubenswrapper[4702]: I1203 11:30:16.903623 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" event={"ID":"3377c36f-51de-4fca-9fe9-a0763ab93e36","Type":"ContainerDied","Data":"865162d8ade09fdf3815c744ee0430dc924ef59c711fee789ce094c3cd6d4bf3"} Dec 03 11:30:16 crc kubenswrapper[4702]: I1203 11:30:16.913428 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" event={"ID":"c82952f3-6193-4d1d-8d66-c1e9624cf937","Type":"ContainerStarted","Data":"4fe6b1487be4a5eba9dca6dee794022a1a9cc9fde36dcae09d158e87e5f7bc03"} Dec 03 11:30:16 crc kubenswrapper[4702]: I1203 11:30:16.924874 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5fdfb45b77-sfz9f" podStartSLOduration=15.924843452 podStartE2EDuration="15.924843452s" podCreationTimestamp="2025-12-03 11:30:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:30:16.914843509 +0000 UTC m=+1600.750771973" watchObservedRunningTime="2025-12-03 11:30:16.924843452 +0000 UTC m=+1600.760771916" Dec 03 11:30:16 crc kubenswrapper[4702]: I1203 11:30:16.979163 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-694848776d-v64gd" event={"ID":"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd","Type":"ContainerStarted","Data":"1a9b91a718a796d2d0ab25c6d9b0e2aaf5f22ede6fad6ceab3354829d8e37dbc"} Dec 03 11:30:16 crc kubenswrapper[4702]: I1203 11:30:16.979231 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76f68f8c78-w8n8j" event={"ID":"36f29f89-b01b-4656-ba86-a2f731d0c1e0","Type":"ContainerStarted","Data":"89aeb0d812538b345081f488f56facbafa59118eb2a8a676235b2c235d01d60c"} Dec 03 11:30:16 crc kubenswrapper[4702]: I1203 11:30:16.993938 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-57f75d96b4-7bvsq" event={"ID":"4785be1d-0e87-49ae-b5de-56bbab3b5eff","Type":"ContainerStarted","Data":"f8a78c2807f4520157b1e13494a7cb344027f769bcccfeff2644a96ba20f096b"} Dec 03 11:30:16 crc kubenswrapper[4702]: I1203 11:30:16.994265 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:17 crc kubenswrapper[4702]: I1203 11:30:17.291501 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-57f75d96b4-7bvsq" podStartSLOduration=14.291474056 podStartE2EDuration="14.291474056s" podCreationTimestamp="2025-12-03 11:30:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:30:17.2167387 +0000 UTC m=+1601.052667164" watchObservedRunningTime="2025-12-03 11:30:17.291474056 +0000 UTC m=+1601.127402510" Dec 03 11:30:18 crc kubenswrapper[4702]: I1203 11:30:18.011633 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wjjt9" event={"ID":"6c60c306-2c56-44e4-8482-e5a72eccd765","Type":"ContainerStarted","Data":"e8d57bec2cdcf4deaf9da6d8b84928fdde30e921d3658e1bfe81705e6a8659b4"} Dec 03 11:30:18 crc kubenswrapper[4702]: I1203 11:30:18.017874 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" 
event={"ID":"3377c36f-51de-4fca-9fe9-a0763ab93e36","Type":"ContainerStarted","Data":"0c22faf159ca7a88632dcfcd06a173cc46e5212117dfb17fee0bad7e0259968a"} Dec 03 11:30:18 crc kubenswrapper[4702]: I1203 11:30:18.017920 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:18 crc kubenswrapper[4702]: I1203 11:30:18.024183 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b53ad8d-06e2-4511-b33a-2ff6c6209861","Type":"ContainerStarted","Data":"1d9705dc7dd730c89e59d77d48a0ddb1fa2e1afb386ee1a13032d2f8e31394c4"} Dec 03 11:30:18 crc kubenswrapper[4702]: I1203 11:30:18.027084 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" event={"ID":"c82952f3-6193-4d1d-8d66-c1e9624cf937","Type":"ContainerStarted","Data":"f7fdbb297c180e900c4327579dff7f41cee5b9ecf43d664721835cefad8ac2c5"} Dec 03 11:30:18 crc kubenswrapper[4702]: I1203 11:30:18.149622 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:18 crc kubenswrapper[4702]: I1203 11:30:18.155091 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:18 crc kubenswrapper[4702]: I1203 11:30:18.217833 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-694848776d-v64gd" event={"ID":"b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd","Type":"ContainerStarted","Data":"d5ba376cf9af72c05998ffda96af478fc1973bb67c1f64a369665b9ba920f51d"} Dec 03 11:30:18 crc kubenswrapper[4702]: I1203 11:30:18.218295 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-694848776d-v64gd" Dec 03 11:30:18 crc kubenswrapper[4702]: I1203 11:30:18.222459 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-wjjt9" podStartSLOduration=3.774890047 podStartE2EDuration="1m2.222436605s" podCreationTimestamp="2025-12-03 11:29:16 +0000 UTC" firstStartedPulling="2025-12-03 11:29:18.09233554 +0000 UTC m=+1541.928264014" lastFinishedPulling="2025-12-03 11:30:16.539882108 +0000 UTC m=+1600.375810572" observedRunningTime="2025-12-03 11:30:18.211603209 +0000 UTC m=+1602.047531673" watchObservedRunningTime="2025-12-03 11:30:18.222436605 +0000 UTC m=+1602.058365089" Dec 03 11:30:18 crc kubenswrapper[4702]: I1203 11:30:18.222877 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76f68f8c78-w8n8j" event={"ID":"36f29f89-b01b-4656-ba86-a2f731d0c1e0","Type":"ContainerStarted","Data":"e43b1ba94cede196cebdb4b1abcd90ad7d66438560bcb81e2be525aeca5f0150"} Dec 03 11:30:18 crc kubenswrapper[4702]: I1203 11:30:18.223169 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:18 crc kubenswrapper[4702]: I1203 11:30:18.247918 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" podStartSLOduration=10.247894007 podStartE2EDuration="10.247894007s" podCreationTimestamp="2025-12-03 11:30:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:30:18.237981996 +0000 UTC m=+1602.073910460" watchObservedRunningTime="2025-12-03 11:30:18.247894007 +0000 UTC m=+1602.083822471" Dec 03 11:30:18 crc kubenswrapper[4702]: I1203 11:30:18.279224 4702 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" podStartSLOduration=10.279203363 podStartE2EDuration="10.279203363s" podCreationTimestamp="2025-12-03 11:30:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:30:18.270472416 +0000 UTC m=+1602.106400900" watchObservedRunningTime="2025-12-03 11:30:18.279203363 +0000 UTC m=+1602.115131827" Dec 03 11:30:19 crc kubenswrapper[4702]: I1203 11:30:19.232945 4702 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 11:30:19 crc kubenswrapper[4702]: I1203 11:30:19.262060 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 11:30:19 crc kubenswrapper[4702]: I1203 11:30:19.262134 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 11:30:20 crc kubenswrapper[4702]: I1203 11:30:20.314347 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 11:30:20 crc kubenswrapper[4702]: I1203 11:30:20.314629 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 11:30:20 crc kubenswrapper[4702]: I1203 11:30:20.315175 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 11:30:20 crc kubenswrapper[4702]: I1203 11:30:20.315194 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 11:30:20 crc kubenswrapper[4702]: I1203 11:30:20.341878 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-694848776d-v64gd" podStartSLOduration=7.341809994 podStartE2EDuration="7.341809994s" podCreationTimestamp="2025-12-03 11:30:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:30:18.300300721 +0000 UTC m=+1602.136229195" watchObservedRunningTime="2025-12-03 11:30:20.341809994 +0000 UTC m=+1604.177738458" Dec 03 11:30:20 crc kubenswrapper[4702]: I1203 11:30:20.662980 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 11:30:20 crc kubenswrapper[4702]: I1203 11:30:20.973566 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:21 crc kubenswrapper[4702]: I1203 11:30:21.305505 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tqs2b" event={"ID":"20bf2147-401b-457b-ad27-3c893be5fa2c","Type":"ContainerStarted","Data":"39d9a6c7bef6a87823d2e98dc4833752629526b9bfd845df0aa453d7de14260f"} Dec 03 11:30:21 crc kubenswrapper[4702]: I1203 11:30:21.331030 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-tqs2b" podStartSLOduration=8.231789555 podStartE2EDuration="1m5.331006703s" podCreationTimestamp="2025-12-03 11:29:16 +0000 UTC" firstStartedPulling="2025-12-03 11:29:19.443352546 +0000 UTC m=+1543.279281010" lastFinishedPulling="2025-12-03 11:30:16.542569684 +0000 UTC m=+1600.378498158" observedRunningTime="2025-12-03 11:30:21.326500675 +0000 UTC m=+1605.162429139" watchObservedRunningTime="2025-12-03 
11:30:21.331006703 +0000 UTC m=+1605.166935167" Dec 03 11:30:22 crc kubenswrapper[4702]: I1203 11:30:22.405689 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-76f68f8c78-w8n8j" event={"ID":"36f29f89-b01b-4656-ba86-a2f731d0c1e0","Type":"ContainerStarted","Data":"74ed98802c4c3e3edd69d7854c9d5ca4f2bc9f6401f1e53225a265110f817715"} Dec 03 11:30:22 crc kubenswrapper[4702]: I1203 11:30:22.407637 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:22 crc kubenswrapper[4702]: I1203 11:30:22.407984 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:22 crc kubenswrapper[4702]: I1203 11:30:22.411423 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-554bc66f45-ddpl4" event={"ID":"53d4b133-1553-4265-9529-f7237cbe87e6","Type":"ContainerStarted","Data":"25c792d37f38892484c872199d7971d2ffe890ab163dc2586f99c2d0459041ec"} Dec 03 11:30:22 crc kubenswrapper[4702]: I1203 11:30:22.418961 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-76f68f8c78-w8n8j" podUID="36f29f89-b01b-4656-ba86-a2f731d0c1e0" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.201:9311/healthcheck\": dial tcp 10.217.0.201:9311: connect: connection refused" Dec 03 11:30:22 crc kubenswrapper[4702]: I1203 11:30:22.424603 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl" event={"ID":"8600900e-a4f2-484b-8e66-be0b81303777","Type":"ContainerStarted","Data":"960bf6aca63fd17aeddb2769aec41c3be2f8b7a1de9528159605c08729e060c4"} Dec 03 11:30:22 crc kubenswrapper[4702]: I1203 11:30:22.433786 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-76f68f8c78-w8n8j" podStartSLOduration=11.433742567 podStartE2EDuration="11.433742567s" podCreationTimestamp="2025-12-03 11:30:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:30:22.433106679 +0000 UTC m=+1606.269035143" watchObservedRunningTime="2025-12-03 11:30:22.433742567 +0000 UTC m=+1606.269671031" Dec 03 11:30:23 crc kubenswrapper[4702]: I1203 11:30:23.405732 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 11:30:23 crc kubenswrapper[4702]: I1203 11:30:23.453782 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-554bc66f45-ddpl4" event={"ID":"53d4b133-1553-4265-9529-f7237cbe87e6","Type":"ContainerStarted","Data":"13c70881e1313e5d55ef1ec9d82206096549670886a2bc4017ff3750dc31b730"} Dec 03 11:30:23 crc kubenswrapper[4702]: I1203 11:30:23.475554 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl" event={"ID":"8600900e-a4f2-484b-8e66-be0b81303777","Type":"ContainerStarted","Data":"2313ce39987e5a6fd60f7b68964bad711569ae54295e10d2ffa4b754730fb4e0"} Dec 03 11:30:23 crc kubenswrapper[4702]: I1203 11:30:23.476156 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 11:30:23 crc kubenswrapper[4702]: I1203 11:30:23.476432 4702 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 11:30:23 crc kubenswrapper[4702]: I1203 11:30:23.488413 4702 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-554bc66f45-ddpl4" podStartSLOduration=7.063364723 podStartE2EDuration="16.488394119s" podCreationTimestamp="2025-12-03 11:30:07 +0000 UTC" firstStartedPulling="2025-12-03 11:30:11.943435458 +0000 UTC m=+1595.779363922" lastFinishedPulling="2025-12-03 11:30:21.368464854 +0000 UTC m=+1605.204393318" observedRunningTime="2025-12-03 11:30:23.487517404 +0000 UTC m=+1607.323445878" watchObservedRunningTime="2025-12-03 11:30:23.488394119 +0000 UTC m=+1607.324322583" Dec 03 11:30:23 crc kubenswrapper[4702]: I1203 11:30:23.517097 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 11:30:23 crc kubenswrapper[4702]: I1203 11:30:23.680210 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:23 crc kubenswrapper[4702]: I1203 11:30:23.688246 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5b9bd8bd96-9mqbl" podStartSLOduration=7.265498029 podStartE2EDuration="16.688222299s" podCreationTimestamp="2025-12-03 11:30:07 +0000 UTC" firstStartedPulling="2025-12-03 11:30:11.943396727 +0000 UTC m=+1595.779325191" lastFinishedPulling="2025-12-03 11:30:21.366120997 +0000 UTC m=+1605.202049461" observedRunningTime="2025-12-03 11:30:23.551415704 +0000 UTC m=+1607.387344178" watchObservedRunningTime="2025-12-03 11:30:23.688222299 +0000 UTC m=+1607.524150753" Dec 03 11:30:23 crc kubenswrapper[4702]: I1203 11:30:23.834289 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-v6tdw"] Dec 03 11:30:23 crc kubenswrapper[4702]: I1203 11:30:23.834610 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" podUID="98ff257a-9df9-4798-a184-af395ffa6b2e" containerName="dnsmasq-dns" containerID="cri-o://9b2a7731315c12ead0871d62417b1235434d873d8ace0324d6010a4e111f5729" gracePeriod=10 Dec 03 11:30:24 crc kubenswrapper[4702]: I1203 11:30:24.497883 4702 generic.go:334] "Generic (PLEG): container finished" podID="98ff257a-9df9-4798-a184-af395ffa6b2e" containerID="9b2a7731315c12ead0871d62417b1235434d873d8ace0324d6010a4e111f5729" exitCode=0 Dec 03 11:30:24 crc kubenswrapper[4702]: I1203 11:30:24.497972 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" event={"ID":"98ff257a-9df9-4798-a184-af395ffa6b2e","Type":"ContainerDied","Data":"9b2a7731315c12ead0871d62417b1235434d873d8ace0324d6010a4e111f5729"} Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.217231 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.330409 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2qw6\" (UniqueName: \"kubernetes.io/projected/98ff257a-9df9-4798-a184-af395ffa6b2e-kube-api-access-c2qw6\") pod \"98ff257a-9df9-4798-a184-af395ffa6b2e\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.330543 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-ovsdbserver-nb\") pod \"98ff257a-9df9-4798-a184-af395ffa6b2e\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.330712 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-dns-svc\") pod \"98ff257a-9df9-4798-a184-af395ffa6b2e\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.331065 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-ovsdbserver-sb\") pod \"98ff257a-9df9-4798-a184-af395ffa6b2e\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.331230 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-config\") pod \"98ff257a-9df9-4798-a184-af395ffa6b2e\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.331332 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-dns-swift-storage-0\") pod \"98ff257a-9df9-4798-a184-af395ffa6b2e\" (UID: \"98ff257a-9df9-4798-a184-af395ffa6b2e\") " Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.429969 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98ff257a-9df9-4798-a184-af395ffa6b2e-kube-api-access-c2qw6" (OuterVolumeSpecName: "kube-api-access-c2qw6") pod "98ff257a-9df9-4798-a184-af395ffa6b2e" (UID: "98ff257a-9df9-4798-a184-af395ffa6b2e"). InnerVolumeSpecName "kube-api-access-c2qw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.435261 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2qw6\" (UniqueName: \"kubernetes.io/projected/98ff257a-9df9-4798-a184-af395ffa6b2e-kube-api-access-c2qw6\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.474541 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "98ff257a-9df9-4798-a184-af395ffa6b2e" (UID: "98ff257a-9df9-4798-a184-af395ffa6b2e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.491349 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "98ff257a-9df9-4798-a184-af395ffa6b2e" (UID: "98ff257a-9df9-4798-a184-af395ffa6b2e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.551471 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.551512 4702 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.556349 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "98ff257a-9df9-4798-a184-af395ffa6b2e" (UID: "98ff257a-9df9-4798-a184-af395ffa6b2e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.556342 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "98ff257a-9df9-4798-a184-af395ffa6b2e" (UID: "98ff257a-9df9-4798-a184-af395ffa6b2e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.557713 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-config" (OuterVolumeSpecName: "config") pod "98ff257a-9df9-4798-a184-af395ffa6b2e" (UID: "98ff257a-9df9-4798-a184-af395ffa6b2e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.600170 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" event={"ID":"98ff257a-9df9-4798-a184-af395ffa6b2e","Type":"ContainerDied","Data":"958f6a08a03c7385694911656d7a65da3ac4e3a319aab4072e30a14721aa5706"} Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.600260 4702 scope.go:117] "RemoveContainer" containerID="9b2a7731315c12ead0871d62417b1235434d873d8ace0324d6010a4e111f5729" Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.600513 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-v6tdw" Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.679086 4702 scope.go:117] "RemoveContainer" containerID="3a544af6fcdd191c679907823adfee433ea7caa2a4e5aab4cecaad6343016404" Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.682453 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.682509 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.682523 4702 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98ff257a-9df9-4798-a184-af395ffa6b2e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.696357 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-v6tdw"] Dec 03 11:30:25 crc kubenswrapper[4702]: I1203 11:30:25.712849 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-v6tdw"] Dec 03 11:30:26 crc kubenswrapper[4702]: I1203 11:30:26.591286 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:26 crc kubenswrapper[4702]: I1203 11:30:26.836997 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:26 crc kubenswrapper[4702]: I1203 11:30:26.959561 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98ff257a-9df9-4798-a184-af395ffa6b2e" path="/var/lib/kubelet/pods/98ff257a-9df9-4798-a184-af395ffa6b2e/volumes" Dec 03 11:30:27 crc kubenswrapper[4702]: I1203 11:30:27.109892 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:27 crc kubenswrapper[4702]: I1203 11:30:27.397569 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5cb45ddb68-kntb7" Dec 03 11:30:28 crc kubenswrapper[4702]: I1203 11:30:28.982800 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-76f68f8c78-w8n8j" Dec 03 11:30:29 crc kubenswrapper[4702]: I1203 11:30:29.116730 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5b7b9cbdb8-zhtqs"] Dec 03 11:30:29 crc kubenswrapper[4702]: I1203 11:30:29.117733 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" podUID="c82952f3-6193-4d1d-8d66-c1e9624cf937" containerName="barbican-api" containerID="cri-o://f7fdbb297c180e900c4327579dff7f41cee5b9ecf43d664721835cefad8ac2c5" gracePeriod=30 Dec 03 11:30:29 crc kubenswrapper[4702]: I1203 11:30:29.121052 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" podUID="c82952f3-6193-4d1d-8d66-c1e9624cf937" containerName="barbican-api-log" containerID="cri-o://4fe6b1487be4a5eba9dca6dee794022a1a9cc9fde36dcae09d158e87e5f7bc03" gracePeriod=30 Dec 03 11:30:29 crc kubenswrapper[4702]: I1203 11:30:29.135158 4702 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" podUID="c82952f3-6193-4d1d-8d66-c1e9624cf937" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.200:9311/healthcheck\": EOF" Dec 03 11:30:29 crc kubenswrapper[4702]: I1203 11:30:29.135258 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" podUID="c82952f3-6193-4d1d-8d66-c1e9624cf937" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.200:9311/healthcheck\": EOF" Dec 03 11:30:29 crc kubenswrapper[4702]: I1203 11:30:29.135355 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" podUID="c82952f3-6193-4d1d-8d66-c1e9624cf937" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.200:9311/healthcheck\": EOF" Dec 03 11:30:29 crc kubenswrapper[4702]: I1203 11:30:29.690861 4702 generic.go:334] "Generic (PLEG): container finished" podID="c82952f3-6193-4d1d-8d66-c1e9624cf937" containerID="4fe6b1487be4a5eba9dca6dee794022a1a9cc9fde36dcae09d158e87e5f7bc03" exitCode=143 Dec 03 11:30:29 crc kubenswrapper[4702]: I1203 11:30:29.690956 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" event={"ID":"c82952f3-6193-4d1d-8d66-c1e9624cf937","Type":"ContainerDied","Data":"4fe6b1487be4a5eba9dca6dee794022a1a9cc9fde36dcae09d158e87e5f7bc03"} Dec 03 11:30:31 crc kubenswrapper[4702]: I1203 11:30:31.797683 4702 generic.go:334] "Generic (PLEG): container finished" podID="6c60c306-2c56-44e4-8482-e5a72eccd765" containerID="e8d57bec2cdcf4deaf9da6d8b84928fdde30e921d3658e1bfe81705e6a8659b4" exitCode=0 Dec 03 11:30:31 crc kubenswrapper[4702]: I1203 11:30:31.797939 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wjjt9" event={"ID":"6c60c306-2c56-44e4-8482-e5a72eccd765","Type":"ContainerDied","Data":"e8d57bec2cdcf4deaf9da6d8b84928fdde30e921d3658e1bfe81705e6a8659b4"} Dec 03 11:30:32 crc kubenswrapper[4702]: I1203 11:30:32.475827 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5fdfb45b77-sfz9f" Dec 03 11:30:32 crc kubenswrapper[4702]: I1203 11:30:32.712720 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5cb45ddb68-kntb7"] Dec 03 11:30:32 crc kubenswrapper[4702]: I1203 11:30:32.713603 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5cb45ddb68-kntb7" podUID="f2c41a10-02a8-438d-a25c-18e9caf9e467" containerName="neutron-httpd" containerID="cri-o://ac38f93f54d90523cb6c7873246b6cf76b3022742dfc987aa365094414358170" gracePeriod=30 Dec 03 11:30:32 crc kubenswrapper[4702]: I1203 11:30:32.713225 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5cb45ddb68-kntb7" podUID="f2c41a10-02a8-438d-a25c-18e9caf9e467" containerName="neutron-api" containerID="cri-o://c28f58b16fd4d0ede2deb30f8756514b98f859a958031c1f46703601cf373222" gracePeriod=30 Dec 03 11:30:32 crc kubenswrapper[4702]: I1203 11:30:32.815085 4702 generic.go:334] "Generic (PLEG): container finished" podID="20bf2147-401b-457b-ad27-3c893be5fa2c" containerID="39d9a6c7bef6a87823d2e98dc4833752629526b9bfd845df0aa453d7de14260f" exitCode=0 Dec 03 11:30:32 crc kubenswrapper[4702]: I1203 11:30:32.815282 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tqs2b" 
event={"ID":"20bf2147-401b-457b-ad27-3c893be5fa2c","Type":"ContainerDied","Data":"39d9a6c7bef6a87823d2e98dc4833752629526b9bfd845df0aa453d7de14260f"} Dec 03 11:30:33 crc kubenswrapper[4702]: I1203 11:30:33.851151 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-57f75d96b4-7bvsq" Dec 03 11:30:33 crc kubenswrapper[4702]: I1203 11:30:33.857102 4702 generic.go:334] "Generic (PLEG): container finished" podID="f2c41a10-02a8-438d-a25c-18e9caf9e467" containerID="ac38f93f54d90523cb6c7873246b6cf76b3022742dfc987aa365094414358170" exitCode=0 Dec 03 11:30:33 crc kubenswrapper[4702]: I1203 11:30:33.857305 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cb45ddb68-kntb7" event={"ID":"f2c41a10-02a8-438d-a25c-18e9caf9e467","Type":"ContainerDied","Data":"ac38f93f54d90523cb6c7873246b6cf76b3022742dfc987aa365094414358170"} Dec 03 11:30:34 crc kubenswrapper[4702]: I1203 11:30:34.200115 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" podUID="c82952f3-6193-4d1d-8d66-c1e9624cf937" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.200:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 11:30:34 crc kubenswrapper[4702]: I1203 11:30:34.200115 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" podUID="c82952f3-6193-4d1d-8d66-c1e9624cf937" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.200:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 11:30:34 crc kubenswrapper[4702]: I1203 11:30:34.554181 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" podUID="c82952f3-6193-4d1d-8d66-c1e9624cf937" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.200:9311/healthcheck\": read tcp 10.217.0.2:43168->10.217.0.200:9311: read: connection reset by peer" Dec 03 11:30:34 crc kubenswrapper[4702]: I1203 11:30:34.554203 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" podUID="c82952f3-6193-4d1d-8d66-c1e9624cf937" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.200:9311/healthcheck\": read tcp 10.217.0.2:43160->10.217.0.200:9311: read: connection reset by peer" Dec 03 11:30:34 crc kubenswrapper[4702]: I1203 11:30:34.554351 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:34 crc kubenswrapper[4702]: I1203 11:30:34.931012 4702 generic.go:334] "Generic (PLEG): container finished" podID="c82952f3-6193-4d1d-8d66-c1e9624cf937" containerID="f7fdbb297c180e900c4327579dff7f41cee5b9ecf43d664721835cefad8ac2c5" exitCode=0 Dec 03 11:30:34 crc kubenswrapper[4702]: I1203 11:30:34.948155 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" event={"ID":"c82952f3-6193-4d1d-8d66-c1e9624cf937","Type":"ContainerDied","Data":"f7fdbb297c180e900c4327579dff7f41cee5b9ecf43d664721835cefad8ac2c5"} Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.059550 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-wjjt9" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.217164 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5qhx\" (UniqueName: \"kubernetes.io/projected/6c60c306-2c56-44e4-8482-e5a72eccd765-kube-api-access-n5qhx\") pod \"6c60c306-2c56-44e4-8482-e5a72eccd765\" (UID: \"6c60c306-2c56-44e4-8482-e5a72eccd765\") " Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.217241 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c60c306-2c56-44e4-8482-e5a72eccd765-combined-ca-bundle\") pod \"6c60c306-2c56-44e4-8482-e5a72eccd765\" (UID: \"6c60c306-2c56-44e4-8482-e5a72eccd765\") " Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.217367 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c60c306-2c56-44e4-8482-e5a72eccd765-config-data\") pod \"6c60c306-2c56-44e4-8482-e5a72eccd765\" (UID: \"6c60c306-2c56-44e4-8482-e5a72eccd765\") " Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.247092 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c60c306-2c56-44e4-8482-e5a72eccd765-kube-api-access-n5qhx" (OuterVolumeSpecName: "kube-api-access-n5qhx") pod "6c60c306-2c56-44e4-8482-e5a72eccd765" (UID: "6c60c306-2c56-44e4-8482-e5a72eccd765"). InnerVolumeSpecName "kube-api-access-n5qhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.278142 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c60c306-2c56-44e4-8482-e5a72eccd765-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c60c306-2c56-44e4-8482-e5a72eccd765" (UID: "6c60c306-2c56-44e4-8482-e5a72eccd765"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.320460 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5qhx\" (UniqueName: \"kubernetes.io/projected/6c60c306-2c56-44e4-8482-e5a72eccd765-kube-api-access-n5qhx\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.320515 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c60c306-2c56-44e4-8482-e5a72eccd765-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.352424 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tqs2b" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.389743 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.394967 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c60c306-2c56-44e4-8482-e5a72eccd765-config-data" (OuterVolumeSpecName: "config-data") pod "6c60c306-2c56-44e4-8482-e5a72eccd765" (UID: "6c60c306-2c56-44e4-8482-e5a72eccd765"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.422516 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c60c306-2c56-44e4-8482-e5a72eccd765-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:35 crc kubenswrapper[4702]: E1203 11:30:35.483343 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="4b53ad8d-06e2-4511-b33a-2ff6c6209861" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.523867 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20bf2147-401b-457b-ad27-3c893be5fa2c-etc-machine-id\") pod \"20bf2147-401b-457b-ad27-3c893be5fa2c\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") " Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.523942 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c82952f3-6193-4d1d-8d66-c1e9624cf937-config-data-custom\") pod \"c82952f3-6193-4d1d-8d66-c1e9624cf937\" (UID: \"c82952f3-6193-4d1d-8d66-c1e9624cf937\") " Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.524079 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb4xr\" (UniqueName: \"kubernetes.io/projected/c82952f3-6193-4d1d-8d66-c1e9624cf937-kube-api-access-vb4xr\") pod \"c82952f3-6193-4d1d-8d66-c1e9624cf937\" (UID: \"c82952f3-6193-4d1d-8d66-c1e9624cf937\") " Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.524144 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-combined-ca-bundle\") pod \"20bf2147-401b-457b-ad27-3c893be5fa2c\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") " Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.524176 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p58fb\" (UniqueName: \"kubernetes.io/projected/20bf2147-401b-457b-ad27-3c893be5fa2c-kube-api-access-p58fb\") pod \"20bf2147-401b-457b-ad27-3c893be5fa2c\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") " Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.524293 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c82952f3-6193-4d1d-8d66-c1e9624cf937-combined-ca-bundle\") pod \"c82952f3-6193-4d1d-8d66-c1e9624cf937\" (UID: \"c82952f3-6193-4d1d-8d66-c1e9624cf937\") " Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.524395 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-config-data\") pod \"20bf2147-401b-457b-ad27-3c893be5fa2c\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") " Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.524420 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c82952f3-6193-4d1d-8d66-c1e9624cf937-logs\") pod \"c82952f3-6193-4d1d-8d66-c1e9624cf937\" (UID: \"c82952f3-6193-4d1d-8d66-c1e9624cf937\") " Dec 03 11:30:35 crc kubenswrapper[4702]: 
Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.524454 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-scripts\") pod \"20bf2147-401b-457b-ad27-3c893be5fa2c\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") "
Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.524475 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c82952f3-6193-4d1d-8d66-c1e9624cf937-config-data\") pod \"c82952f3-6193-4d1d-8d66-c1e9624cf937\" (UID: \"c82952f3-6193-4d1d-8d66-c1e9624cf937\") "
Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.524492 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-db-sync-config-data\") pod \"20bf2147-401b-457b-ad27-3c893be5fa2c\" (UID: \"20bf2147-401b-457b-ad27-3c893be5fa2c\") "
Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.524867 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20bf2147-401b-457b-ad27-3c893be5fa2c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "20bf2147-401b-457b-ad27-3c893be5fa2c" (UID: "20bf2147-401b-457b-ad27-3c893be5fa2c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.525036 4702 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/20bf2147-401b-457b-ad27-3c893be5fa2c-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.525775 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c82952f3-6193-4d1d-8d66-c1e9624cf937-logs" (OuterVolumeSpecName: "logs") pod "c82952f3-6193-4d1d-8d66-c1e9624cf937" (UID: "c82952f3-6193-4d1d-8d66-c1e9624cf937"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.537866 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-scripts" (OuterVolumeSpecName: "scripts") pod "20bf2147-401b-457b-ad27-3c893be5fa2c" (UID: "20bf2147-401b-457b-ad27-3c893be5fa2c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.537928 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "20bf2147-401b-457b-ad27-3c893be5fa2c" (UID: "20bf2147-401b-457b-ad27-3c893be5fa2c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.538630 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c82952f3-6193-4d1d-8d66-c1e9624cf937-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c82952f3-6193-4d1d-8d66-c1e9624cf937" (UID: "c82952f3-6193-4d1d-8d66-c1e9624cf937"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.539036 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20bf2147-401b-457b-ad27-3c893be5fa2c-kube-api-access-p58fb" (OuterVolumeSpecName: "kube-api-access-p58fb") pod "20bf2147-401b-457b-ad27-3c893be5fa2c" (UID: "20bf2147-401b-457b-ad27-3c893be5fa2c"). InnerVolumeSpecName "kube-api-access-p58fb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.567965 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c82952f3-6193-4d1d-8d66-c1e9624cf937-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c82952f3-6193-4d1d-8d66-c1e9624cf937" (UID: "c82952f3-6193-4d1d-8d66-c1e9624cf937"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.568086 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20bf2147-401b-457b-ad27-3c893be5fa2c" (UID: "20bf2147-401b-457b-ad27-3c893be5fa2c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.599381 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-config-data" (OuterVolumeSpecName: "config-data") pod "20bf2147-401b-457b-ad27-3c893be5fa2c" (UID: "20bf2147-401b-457b-ad27-3c893be5fa2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.614096 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c82952f3-6193-4d1d-8d66-c1e9624cf937-config-data" (OuterVolumeSpecName: "config-data") pod "c82952f3-6193-4d1d-8d66-c1e9624cf937" (UID: "c82952f3-6193-4d1d-8d66-c1e9624cf937"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.627109 4702 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.627167 4702 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c82952f3-6193-4d1d-8d66-c1e9624cf937-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.627182 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb4xr\" (UniqueName: \"kubernetes.io/projected/c82952f3-6193-4d1d-8d66-c1e9624cf937-kube-api-access-vb4xr\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.627198 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.627216 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p58fb\" (UniqueName: \"kubernetes.io/projected/20bf2147-401b-457b-ad27-3c893be5fa2c-kube-api-access-p58fb\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.627228 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c82952f3-6193-4d1d-8d66-c1e9624cf937-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.627241 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.627253 4702 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c82952f3-6193-4d1d-8d66-c1e9624cf937-logs\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.627265 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/20bf2147-401b-457b-ad27-3c893be5fa2c-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.627279 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c82952f3-6193-4d1d-8d66-c1e9624cf937-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.628886 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5cb45ddb68-kntb7" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.901815 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-httpd-config\") pod \"f2c41a10-02a8-438d-a25c-18e9caf9e467\" (UID: \"f2c41a10-02a8-438d-a25c-18e9caf9e467\") " Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.902325 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-ovndb-tls-certs\") pod \"f2c41a10-02a8-438d-a25c-18e9caf9e467\" (UID: \"f2c41a10-02a8-438d-a25c-18e9caf9e467\") " Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.902536 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-config\") pod \"f2c41a10-02a8-438d-a25c-18e9caf9e467\" (UID: \"f2c41a10-02a8-438d-a25c-18e9caf9e467\") " Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.902609 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdvcn\" (UniqueName: \"kubernetes.io/projected/f2c41a10-02a8-438d-a25c-18e9caf9e467-kube-api-access-cdvcn\") pod \"f2c41a10-02a8-438d-a25c-18e9caf9e467\" (UID: \"f2c41a10-02a8-438d-a25c-18e9caf9e467\") " Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.902789 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-combined-ca-bundle\") pod \"f2c41a10-02a8-438d-a25c-18e9caf9e467\" (UID: \"f2c41a10-02a8-438d-a25c-18e9caf9e467\") " Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.908563 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2c41a10-02a8-438d-a25c-18e9caf9e467-kube-api-access-cdvcn" (OuterVolumeSpecName: "kube-api-access-cdvcn") pod "f2c41a10-02a8-438d-a25c-18e9caf9e467" (UID: "f2c41a10-02a8-438d-a25c-18e9caf9e467"). InnerVolumeSpecName "kube-api-access-cdvcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.923104 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f2c41a10-02a8-438d-a25c-18e9caf9e467" (UID: "f2c41a10-02a8-438d-a25c-18e9caf9e467"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.952820 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-wjjt9" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.953219 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-wjjt9" event={"ID":"6c60c306-2c56-44e4-8482-e5a72eccd765","Type":"ContainerDied","Data":"472dee51e6accbe376233fc11179ab4ff3110e08a36312906073772b1904a0bc"} Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.953264 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="472dee51e6accbe376233fc11179ab4ff3110e08a36312906073772b1904a0bc" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.962822 4702 generic.go:334] "Generic (PLEG): container finished" podID="f2c41a10-02a8-438d-a25c-18e9caf9e467" containerID="c28f58b16fd4d0ede2deb30f8756514b98f859a958031c1f46703601cf373222" exitCode=0 Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.962899 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cb45ddb68-kntb7" event={"ID":"f2c41a10-02a8-438d-a25c-18e9caf9e467","Type":"ContainerDied","Data":"c28f58b16fd4d0ede2deb30f8756514b98f859a958031c1f46703601cf373222"} Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.962934 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5cb45ddb68-kntb7" event={"ID":"f2c41a10-02a8-438d-a25c-18e9caf9e467","Type":"ContainerDied","Data":"45d29372fbac5132dbc482d007a9447553f248f8814ef5dbbbce162dfede3ace"} Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.962955 4702 scope.go:117] "RemoveContainer" containerID="ac38f93f54d90523cb6c7873246b6cf76b3022742dfc987aa365094414358170" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.963104 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5cb45ddb68-kntb7" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.968979 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-tqs2b" event={"ID":"20bf2147-401b-457b-ad27-3c893be5fa2c","Type":"ContainerDied","Data":"1959755cbf3e2aa23fb6f744b6a0b0a7d3c175ab7c17d5fa38f5c8beb96ccf63"} Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.969075 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1959755cbf3e2aa23fb6f744b6a0b0a7d3c175ab7c17d5fa38f5c8beb96ccf63" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.969146 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-tqs2b" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.975167 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2c41a10-02a8-438d-a25c-18e9caf9e467" (UID: "f2c41a10-02a8-438d-a25c-18e9caf9e467"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.978882 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b53ad8d-06e2-4511-b33a-2ff6c6209861","Type":"ContainerStarted","Data":"be26f30c9ddcbbe0e06cfc7ef5cb52ed3999f1d5da0091c11c86f16ac0a664cf"} Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.979148 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b53ad8d-06e2-4511-b33a-2ff6c6209861" containerName="ceilometer-notification-agent" containerID="cri-o://c11aaf3fec02583298058b1bab0d56c335f1b17f97e32174a1c075be4b5f5565" gracePeriod=30 Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.979460 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.979843 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b53ad8d-06e2-4511-b33a-2ff6c6209861" containerName="sg-core" containerID="cri-o://1d9705dc7dd730c89e59d77d48a0ddb1fa2e1afb386ee1a13032d2f8e31394c4" gracePeriod=30 Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.979886 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b53ad8d-06e2-4511-b33a-2ff6c6209861" containerName="proxy-httpd" containerID="cri-o://be26f30c9ddcbbe0e06cfc7ef5cb52ed3999f1d5da0091c11c86f16ac0a664cf" gracePeriod=30 Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.987974 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" event={"ID":"c82952f3-6193-4d1d-8d66-c1e9624cf937","Type":"ContainerDied","Data":"7f7d340fe5d6dbbb79d628b34ad53e1f48d4d8310f5d0469dad57af9f85172d4"} Dec 03 11:30:35 crc kubenswrapper[4702]: I1203 11:30:35.988082 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5b7b9cbdb8-zhtqs" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.010499 4702 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.011497 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdvcn\" (UniqueName: \"kubernetes.io/projected/f2c41a10-02a8-438d-a25c-18e9caf9e467-kube-api-access-cdvcn\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.011709 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.041309 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-config" (OuterVolumeSpecName: "config") pod "f2c41a10-02a8-438d-a25c-18e9caf9e467" (UID: "f2c41a10-02a8-438d-a25c-18e9caf9e467"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.065274 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f2c41a10-02a8-438d-a25c-18e9caf9e467" (UID: "f2c41a10-02a8-438d-a25c-18e9caf9e467"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.117440 4702 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.117497 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f2c41a10-02a8-438d-a25c-18e9caf9e467-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.230373 4702 scope.go:117] "RemoveContainer" containerID="c28f58b16fd4d0ede2deb30f8756514b98f859a958031c1f46703601cf373222" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.249370 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5b7b9cbdb8-zhtqs"] Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.263649 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5b7b9cbdb8-zhtqs"] Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.264055 4702 scope.go:117] "RemoveContainer" containerID="ac38f93f54d90523cb6c7873246b6cf76b3022742dfc987aa365094414358170" Dec 03 11:30:36 crc kubenswrapper[4702]: E1203 11:30:36.264491 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac38f93f54d90523cb6c7873246b6cf76b3022742dfc987aa365094414358170\": container with ID starting with ac38f93f54d90523cb6c7873246b6cf76b3022742dfc987aa365094414358170 not found: ID does not exist" containerID="ac38f93f54d90523cb6c7873246b6cf76b3022742dfc987aa365094414358170" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.264535 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac38f93f54d90523cb6c7873246b6cf76b3022742dfc987aa365094414358170"} err="failed to get container status \"ac38f93f54d90523cb6c7873246b6cf76b3022742dfc987aa365094414358170\": rpc error: code = NotFound desc = could not find container \"ac38f93f54d90523cb6c7873246b6cf76b3022742dfc987aa365094414358170\": container with ID starting with ac38f93f54d90523cb6c7873246b6cf76b3022742dfc987aa365094414358170 not found: ID does not exist" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.264570 4702 scope.go:117] "RemoveContainer" containerID="c28f58b16fd4d0ede2deb30f8756514b98f859a958031c1f46703601cf373222" Dec 03 11:30:36 crc kubenswrapper[4702]: E1203 11:30:36.265059 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c28f58b16fd4d0ede2deb30f8756514b98f859a958031c1f46703601cf373222\": container with ID starting with c28f58b16fd4d0ede2deb30f8756514b98f859a958031c1f46703601cf373222 not found: ID does not exist" containerID="c28f58b16fd4d0ede2deb30f8756514b98f859a958031c1f46703601cf373222" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.265082 4702 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c28f58b16fd4d0ede2deb30f8756514b98f859a958031c1f46703601cf373222"} err="failed to get container status \"c28f58b16fd4d0ede2deb30f8756514b98f859a958031c1f46703601cf373222\": rpc error: code = NotFound desc = could not find container \"c28f58b16fd4d0ede2deb30f8756514b98f859a958031c1f46703601cf373222\": container with ID starting with c28f58b16fd4d0ede2deb30f8756514b98f859a958031c1f46703601cf373222 not found: ID does not exist" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.265096 4702 scope.go:117] "RemoveContainer" containerID="f7fdbb297c180e900c4327579dff7f41cee5b9ecf43d664721835cefad8ac2c5" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.298877 4702 scope.go:117] "RemoveContainer" containerID="4fe6b1487be4a5eba9dca6dee794022a1a9cc9fde36dcae09d158e87e5f7bc03" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.317748 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5cb45ddb68-kntb7"] Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.330799 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5cb45ddb68-kntb7"] Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.698825 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 11:30:36 crc kubenswrapper[4702]: E1203 11:30:36.699401 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20bf2147-401b-457b-ad27-3c893be5fa2c" containerName="cinder-db-sync" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.699419 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="20bf2147-401b-457b-ad27-3c893be5fa2c" containerName="cinder-db-sync" Dec 03 11:30:36 crc kubenswrapper[4702]: E1203 11:30:36.699434 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98ff257a-9df9-4798-a184-af395ffa6b2e" containerName="init" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.699440 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="98ff257a-9df9-4798-a184-af395ffa6b2e" containerName="init" Dec 03 11:30:36 crc kubenswrapper[4702]: E1203 11:30:36.699450 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2c41a10-02a8-438d-a25c-18e9caf9e467" containerName="neutron-api" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.699457 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2c41a10-02a8-438d-a25c-18e9caf9e467" containerName="neutron-api" Dec 03 11:30:36 crc kubenswrapper[4702]: E1203 11:30:36.699477 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2c41a10-02a8-438d-a25c-18e9caf9e467" containerName="neutron-httpd" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.699483 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2c41a10-02a8-438d-a25c-18e9caf9e467" containerName="neutron-httpd" Dec 03 11:30:36 crc kubenswrapper[4702]: E1203 11:30:36.699492 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c60c306-2c56-44e4-8482-e5a72eccd765" containerName="heat-db-sync" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.699498 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c60c306-2c56-44e4-8482-e5a72eccd765" containerName="heat-db-sync" Dec 03 11:30:36 crc kubenswrapper[4702]: E1203 11:30:36.699509 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98ff257a-9df9-4798-a184-af395ffa6b2e" containerName="dnsmasq-dns" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.699514 4702 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="98ff257a-9df9-4798-a184-af395ffa6b2e" containerName="dnsmasq-dns" Dec 03 11:30:36 crc kubenswrapper[4702]: E1203 11:30:36.699540 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c82952f3-6193-4d1d-8d66-c1e9624cf937" containerName="barbican-api" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.699546 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="c82952f3-6193-4d1d-8d66-c1e9624cf937" containerName="barbican-api" Dec 03 11:30:36 crc kubenswrapper[4702]: E1203 11:30:36.699588 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c82952f3-6193-4d1d-8d66-c1e9624cf937" containerName="barbican-api-log" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.699593 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="c82952f3-6193-4d1d-8d66-c1e9624cf937" containerName="barbican-api-log" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.699843 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2c41a10-02a8-438d-a25c-18e9caf9e467" containerName="neutron-api" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.699860 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="20bf2147-401b-457b-ad27-3c893be5fa2c" containerName="cinder-db-sync" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.699910 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2c41a10-02a8-438d-a25c-18e9caf9e467" containerName="neutron-httpd" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.699920 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="c82952f3-6193-4d1d-8d66-c1e9624cf937" containerName="barbican-api-log" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.699941 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c60c306-2c56-44e4-8482-e5a72eccd765" containerName="heat-db-sync" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.699949 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="c82952f3-6193-4d1d-8d66-c1e9624cf937" containerName="barbican-api" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.699959 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="98ff257a-9df9-4798-a184-af395ffa6b2e" containerName="dnsmasq-dns" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.701262 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.708034 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.711277 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-gt7kh" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.717212 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.717469 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.723841 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.745212 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.745357 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.745399 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64j9t\" (UniqueName: \"kubernetes.io/projected/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-kube-api-access-64j9t\") pod \"cinder-scheduler-0\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.745457 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.745605 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-scripts\") pod \"cinder-scheduler-0\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.745637 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-config-data\") pod \"cinder-scheduler-0\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.783693 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-kmfbx"] Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.787642 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.847806 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwdfz\" (UniqueName: \"kubernetes.io/projected/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-kube-api-access-kwdfz\") pod \"dnsmasq-dns-5c9776ccc5-kmfbx\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.848243 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-kmfbx\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.848354 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.848406 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-kmfbx\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.848449 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.848482 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64j9t\" (UniqueName: \"kubernetes.io/projected/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-kube-api-access-64j9t\") pod \"cinder-scheduler-0\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.848529 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.848585 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-kmfbx\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.848640 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-kmfbx\" (UID: 
\"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.848676 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-scripts\") pod \"cinder-scheduler-0\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.848710 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-config-data\") pod \"cinder-scheduler-0\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.848737 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-config\") pod \"dnsmasq-dns-5c9776ccc5-kmfbx\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.851049 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.851210 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-kmfbx"] Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.856534 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.856801 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.857045 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.866073 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-scripts\") pod \"cinder-scheduler-0\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.870052 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-config-data\") pod \"cinder-scheduler-0\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.872360 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.875688 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-combined-ca-bundle\") pod 
\"cinder-scheduler-0\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.884024 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64j9t\" (UniqueName: \"kubernetes.io/projected/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-kube-api-access-64j9t\") pod \"cinder-scheduler-0\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.949925 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-kmfbx\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.950026 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-config\") pod \"dnsmasq-dns-5c9776ccc5-kmfbx\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.950151 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwdfz\" (UniqueName: \"kubernetes.io/projected/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-kube-api-access-kwdfz\") pod \"dnsmasq-dns-5c9776ccc5-kmfbx\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.950222 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-kmfbx\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.950342 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-kmfbx\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.950426 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-kmfbx\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.951588 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-kmfbx\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.952636 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-kmfbx\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " 
pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.952735 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-config\") pod \"dnsmasq-dns-5c9776ccc5-kmfbx\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.953014 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-kmfbx\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.953868 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-kmfbx\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.956498 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c82952f3-6193-4d1d-8d66-c1e9624cf937" path="/var/lib/kubelet/pods/c82952f3-6193-4d1d-8d66-c1e9624cf937/volumes" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.959310 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2c41a10-02a8-438d-a25c-18e9caf9e467" path="/var/lib/kubelet/pods/f2c41a10-02a8-438d-a25c-18e9caf9e467/volumes" Dec 03 11:30:36 crc kubenswrapper[4702]: I1203 11:30:36.978630 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwdfz\" (UniqueName: \"kubernetes.io/projected/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-kube-api-access-kwdfz\") pod \"dnsmasq-dns-5c9776ccc5-kmfbx\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.006370 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.009046 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.012949 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.022635 4702 generic.go:334] "Generic (PLEG): container finished" podID="4b53ad8d-06e2-4511-b33a-2ff6c6209861" containerID="be26f30c9ddcbbe0e06cfc7ef5cb52ed3999f1d5da0091c11c86f16ac0a664cf" exitCode=0 Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.022701 4702 generic.go:334] "Generic (PLEG): container finished" podID="4b53ad8d-06e2-4511-b33a-2ff6c6209861" containerID="1d9705dc7dd730c89e59d77d48a0ddb1fa2e1afb386ee1a13032d2f8e31394c4" exitCode=2 Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.022776 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b53ad8d-06e2-4511-b33a-2ff6c6209861","Type":"ContainerDied","Data":"be26f30c9ddcbbe0e06cfc7ef5cb52ed3999f1d5da0091c11c86f16ac0a664cf"} Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.022811 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b53ad8d-06e2-4511-b33a-2ff6c6209861","Type":"ContainerDied","Data":"1d9705dc7dd730c89e59d77d48a0ddb1fa2e1afb386ee1a13032d2f8e31394c4"} Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.034408 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.038386 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-gt7kh" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.047695 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.052735 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.052817 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-config-data\") pod \"cinder-api-0\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.052937 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/911b822c-88c9-476e-a204-1d43d017ecc3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.052967 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjlkc\" (UniqueName: \"kubernetes.io/projected/911b822c-88c9-476e-a204-1d43d017ecc3-kube-api-access-vjlkc\") pod \"cinder-api-0\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.052997 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-config-data-custom\") pod \"cinder-api-0\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.053068 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/911b822c-88c9-476e-a204-1d43d017ecc3-logs\") pod \"cinder-api-0\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.053195 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-scripts\") pod \"cinder-api-0\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.135890 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.156872 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/911b822c-88c9-476e-a204-1d43d017ecc3-logs\") pod \"cinder-api-0\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.157031 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-scripts\") pod \"cinder-api-0\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.157109 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.157142 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-config-data\") pod \"cinder-api-0\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.157249 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/911b822c-88c9-476e-a204-1d43d017ecc3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.157288 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjlkc\" (UniqueName: \"kubernetes.io/projected/911b822c-88c9-476e-a204-1d43d017ecc3-kube-api-access-vjlkc\") pod \"cinder-api-0\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.157313 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-config-data-custom\") pod \"cinder-api-0\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " 
pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.158580 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/911b822c-88c9-476e-a204-1d43d017ecc3-logs\") pod \"cinder-api-0\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.159709 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/911b822c-88c9-476e-a204-1d43d017ecc3-etc-machine-id\") pod \"cinder-api-0\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.162224 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-config-data-custom\") pod \"cinder-api-0\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.170241 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-scripts\") pod \"cinder-api-0\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.170797 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.172998 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-config-data\") pod \"cinder-api-0\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.181562 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjlkc\" (UniqueName: \"kubernetes.io/projected/911b822c-88c9-476e-a204-1d43d017ecc3-kube-api-access-vjlkc\") pod \"cinder-api-0\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.352114 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 11:30:37 crc kubenswrapper[4702]: W1203 11:30:37.880727 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6c7eaad_7121_41a5_b4b6_ea5e079ff1c5.slice/crio-9fddd6a921c141ab679a9e86abc0e17e535bb64a23e2d3297bebdfd0eb06753b WatchSource:0}: Error finding container 9fddd6a921c141ab679a9e86abc0e17e535bb64a23e2d3297bebdfd0eb06753b: Status 404 returned error can't find the container with id 9fddd6a921c141ab679a9e86abc0e17e535bb64a23e2d3297bebdfd0eb06753b Dec 03 11:30:37 crc kubenswrapper[4702]: W1203 11:30:37.883175 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod479db84e_8b1b_4bcb_8711_a5a51ca22eb1.slice/crio-d30bca3a7863e8b6ad0b942c2c454ded3aaf6f77f7e5d8cf235d537434ab8b98 WatchSource:0}: Error finding container d30bca3a7863e8b6ad0b942c2c454ded3aaf6f77f7e5d8cf235d537434ab8b98: Status 404 returned error can't find the container with id d30bca3a7863e8b6ad0b942c2c454ded3aaf6f77f7e5d8cf235d537434ab8b98 Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.884785 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-kmfbx"] Dec 03 11:30:37 crc kubenswrapper[4702]: I1203 11:30:37.895497 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 11:30:38 crc kubenswrapper[4702]: I1203 11:30:38.118076 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" event={"ID":"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5","Type":"ContainerStarted","Data":"9fddd6a921c141ab679a9e86abc0e17e535bb64a23e2d3297bebdfd0eb06753b"} Dec 03 11:30:38 crc kubenswrapper[4702]: I1203 11:30:38.123239 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"479db84e-8b1b-4bcb-8711-a5a51ca22eb1","Type":"ContainerStarted","Data":"d30bca3a7863e8b6ad0b942c2c454ded3aaf6f77f7e5d8cf235d537434ab8b98"} Dec 03 11:30:38 crc kubenswrapper[4702]: I1203 11:30:38.150972 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 11:30:39 crc kubenswrapper[4702]: I1203 11:30:39.154372 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"911b822c-88c9-476e-a204-1d43d017ecc3","Type":"ContainerStarted","Data":"17f116fa740fa5cad6997c67450e17e33b9f8dea177c41e5f35d10125e9d1cd8"} Dec 03 11:30:39 crc kubenswrapper[4702]: I1203 11:30:39.155083 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"911b822c-88c9-476e-a204-1d43d017ecc3","Type":"ContainerStarted","Data":"58ae58cb2e7f2789a7e6165ca524234417d727993309213b877c47ad2e6be067"} Dec 03 11:30:39 crc kubenswrapper[4702]: I1203 11:30:39.156845 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" event={"ID":"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5","Type":"ContainerDied","Data":"d341171d8d95e0543b60ebce4ab006cf17df2ddb717cb5de82de7daa8cd256f9"} Dec 03 11:30:39 crc kubenswrapper[4702]: I1203 11:30:39.156838 4702 generic.go:334] "Generic (PLEG): container finished" podID="a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5" containerID="d341171d8d95e0543b60ebce4ab006cf17df2ddb717cb5de82de7daa8cd256f9" exitCode=0 Dec 03 11:30:39 crc kubenswrapper[4702]: I1203 11:30:39.904019 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 11:30:40 
crc kubenswrapper[4702]: I1203 11:30:40.173948 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" event={"ID":"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5","Type":"ContainerStarted","Data":"c309e395ef4019bf0f57f59b85129ecab952875892c0541a9d9208601fddf934"} Dec 03 11:30:40 crc kubenswrapper[4702]: I1203 11:30:40.174075 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:30:40 crc kubenswrapper[4702]: I1203 11:30:40.179604 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"911b822c-88c9-476e-a204-1d43d017ecc3","Type":"ContainerStarted","Data":"7f19f9da0d7ea7e00edd4db415190a1363148cb0e070cd7d844f10e4ba802603"} Dec 03 11:30:40 crc kubenswrapper[4702]: I1203 11:30:40.179696 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 03 11:30:40 crc kubenswrapper[4702]: I1203 11:30:40.200033 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"479db84e-8b1b-4bcb-8711-a5a51ca22eb1","Type":"ContainerStarted","Data":"fb5ae7c56fd334233f4fc58e0ec3338cfba7244ca7d32a8687111414888d7ea6"} Dec 03 11:30:40 crc kubenswrapper[4702]: I1203 11:30:40.241541 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" podStartSLOduration=4.241507657 podStartE2EDuration="4.241507657s" podCreationTimestamp="2025-12-03 11:30:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:30:40.219930466 +0000 UTC m=+1624.055858940" watchObservedRunningTime="2025-12-03 11:30:40.241507657 +0000 UTC m=+1624.077436121" Dec 03 11:30:40 crc kubenswrapper[4702]: I1203 11:30:40.283927 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.283896978 podStartE2EDuration="4.283896978s" podCreationTimestamp="2025-12-03 11:30:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:30:40.261474523 +0000 UTC m=+1624.097402987" watchObservedRunningTime="2025-12-03 11:30:40.283896978 +0000 UTC m=+1624.119825442" Dec 03 11:30:41 crc kubenswrapper[4702]: I1203 11:30:41.231456 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"479db84e-8b1b-4bcb-8711-a5a51ca22eb1","Type":"ContainerStarted","Data":"42955700812f9623178ccbc92caf41631a43361aa5a01a4991220b92ab294eca"} Dec 03 11:30:41 crc kubenswrapper[4702]: I1203 11:30:41.234739 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="911b822c-88c9-476e-a204-1d43d017ecc3" containerName="cinder-api-log" containerID="cri-o://17f116fa740fa5cad6997c67450e17e33b9f8dea177c41e5f35d10125e9d1cd8" gracePeriod=30 Dec 03 11:30:41 crc kubenswrapper[4702]: I1203 11:30:41.235254 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="911b822c-88c9-476e-a204-1d43d017ecc3" containerName="cinder-api" containerID="cri-o://7f19f9da0d7ea7e00edd4db415190a1363148cb0e070cd7d844f10e4ba802603" gracePeriod=30 Dec 03 11:30:41 crc kubenswrapper[4702]: I1203 11:30:41.266817 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" 
podStartSLOduration=4.305700755 podStartE2EDuration="5.266784047s" podCreationTimestamp="2025-12-03 11:30:36 +0000 UTC" firstStartedPulling="2025-12-03 11:30:37.885298299 +0000 UTC m=+1621.721226763" lastFinishedPulling="2025-12-03 11:30:38.846381581 +0000 UTC m=+1622.682310055" observedRunningTime="2025-12-03 11:30:41.259944393 +0000 UTC m=+1625.095872867" watchObservedRunningTime="2025-12-03 11:30:41.266784047 +0000 UTC m=+1625.102712511" Dec 03 11:30:41 crc kubenswrapper[4702]: I1203 11:30:41.946426 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.031572 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-scripts\") pod \"911b822c-88c9-476e-a204-1d43d017ecc3\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.031700 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjlkc\" (UniqueName: \"kubernetes.io/projected/911b822c-88c9-476e-a204-1d43d017ecc3-kube-api-access-vjlkc\") pod \"911b822c-88c9-476e-a204-1d43d017ecc3\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.031835 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/911b822c-88c9-476e-a204-1d43d017ecc3-logs\") pod \"911b822c-88c9-476e-a204-1d43d017ecc3\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.032107 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-combined-ca-bundle\") pod \"911b822c-88c9-476e-a204-1d43d017ecc3\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.032161 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/911b822c-88c9-476e-a204-1d43d017ecc3-etc-machine-id\") pod \"911b822c-88c9-476e-a204-1d43d017ecc3\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.032208 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-config-data-custom\") pod \"911b822c-88c9-476e-a204-1d43d017ecc3\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.032252 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-config-data\") pod \"911b822c-88c9-476e-a204-1d43d017ecc3\" (UID: \"911b822c-88c9-476e-a204-1d43d017ecc3\") " Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.034932 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/911b822c-88c9-476e-a204-1d43d017ecc3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "911b822c-88c9-476e-a204-1d43d017ecc3" (UID: "911b822c-88c9-476e-a204-1d43d017ecc3"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.035622 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/911b822c-88c9-476e-a204-1d43d017ecc3-logs" (OuterVolumeSpecName: "logs") pod "911b822c-88c9-476e-a204-1d43d017ecc3" (UID: "911b822c-88c9-476e-a204-1d43d017ecc3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.040168 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/911b822c-88c9-476e-a204-1d43d017ecc3-kube-api-access-vjlkc" (OuterVolumeSpecName: "kube-api-access-vjlkc") pod "911b822c-88c9-476e-a204-1d43d017ecc3" (UID: "911b822c-88c9-476e-a204-1d43d017ecc3"). InnerVolumeSpecName "kube-api-access-vjlkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.040994 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "911b822c-88c9-476e-a204-1d43d017ecc3" (UID: "911b822c-88c9-476e-a204-1d43d017ecc3"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.041118 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-scripts" (OuterVolumeSpecName: "scripts") pod "911b822c-88c9-476e-a204-1d43d017ecc3" (UID: "911b822c-88c9-476e-a204-1d43d017ecc3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.048126 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.070547 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "911b822c-88c9-476e-a204-1d43d017ecc3" (UID: "911b822c-88c9-476e-a204-1d43d017ecc3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.113874 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-config-data" (OuterVolumeSpecName: "config-data") pod "911b822c-88c9-476e-a204-1d43d017ecc3" (UID: "911b822c-88c9-476e-a204-1d43d017ecc3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.135185 4702 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.137664 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.137876 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.139001 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjlkc\" (UniqueName: \"kubernetes.io/projected/911b822c-88c9-476e-a204-1d43d017ecc3-kube-api-access-vjlkc\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.139124 4702 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/911b822c-88c9-476e-a204-1d43d017ecc3-logs\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.139215 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/911b822c-88c9-476e-a204-1d43d017ecc3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.141329 4702 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/911b822c-88c9-476e-a204-1d43d017ecc3-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.246297 4702 generic.go:334] "Generic (PLEG): container finished" podID="911b822c-88c9-476e-a204-1d43d017ecc3" containerID="7f19f9da0d7ea7e00edd4db415190a1363148cb0e070cd7d844f10e4ba802603" exitCode=0 Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.246357 4702 generic.go:334] "Generic (PLEG): container finished" podID="911b822c-88c9-476e-a204-1d43d017ecc3" containerID="17f116fa740fa5cad6997c67450e17e33b9f8dea177c41e5f35d10125e9d1cd8" exitCode=143 Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.246446 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.246519 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"911b822c-88c9-476e-a204-1d43d017ecc3","Type":"ContainerDied","Data":"7f19f9da0d7ea7e00edd4db415190a1363148cb0e070cd7d844f10e4ba802603"} Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.246580 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"911b822c-88c9-476e-a204-1d43d017ecc3","Type":"ContainerDied","Data":"17f116fa740fa5cad6997c67450e17e33b9f8dea177c41e5f35d10125e9d1cd8"} Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.246604 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"911b822c-88c9-476e-a204-1d43d017ecc3","Type":"ContainerDied","Data":"58ae58cb2e7f2789a7e6165ca524234417d727993309213b877c47ad2e6be067"} Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.246630 4702 scope.go:117] "RemoveContainer" containerID="7f19f9da0d7ea7e00edd4db415190a1363148cb0e070cd7d844f10e4ba802603" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.284716 4702 scope.go:117] "RemoveContainer" containerID="17f116fa740fa5cad6997c67450e17e33b9f8dea177c41e5f35d10125e9d1cd8" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.320072 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.323684 4702 scope.go:117] "RemoveContainer" containerID="7f19f9da0d7ea7e00edd4db415190a1363148cb0e070cd7d844f10e4ba802603" Dec 03 11:30:42 crc kubenswrapper[4702]: E1203 11:30:42.324395 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f19f9da0d7ea7e00edd4db415190a1363148cb0e070cd7d844f10e4ba802603\": container with ID starting with 7f19f9da0d7ea7e00edd4db415190a1363148cb0e070cd7d844f10e4ba802603 not found: ID does not exist" containerID="7f19f9da0d7ea7e00edd4db415190a1363148cb0e070cd7d844f10e4ba802603" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.324431 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f19f9da0d7ea7e00edd4db415190a1363148cb0e070cd7d844f10e4ba802603"} err="failed to get container status \"7f19f9da0d7ea7e00edd4db415190a1363148cb0e070cd7d844f10e4ba802603\": rpc error: code = NotFound desc = could not find container \"7f19f9da0d7ea7e00edd4db415190a1363148cb0e070cd7d844f10e4ba802603\": container with ID starting with 7f19f9da0d7ea7e00edd4db415190a1363148cb0e070cd7d844f10e4ba802603 not found: ID does not exist" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.324456 4702 scope.go:117] "RemoveContainer" containerID="17f116fa740fa5cad6997c67450e17e33b9f8dea177c41e5f35d10125e9d1cd8" Dec 03 11:30:42 crc kubenswrapper[4702]: E1203 11:30:42.326367 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17f116fa740fa5cad6997c67450e17e33b9f8dea177c41e5f35d10125e9d1cd8\": container with ID starting with 17f116fa740fa5cad6997c67450e17e33b9f8dea177c41e5f35d10125e9d1cd8 not found: ID does not exist" containerID="17f116fa740fa5cad6997c67450e17e33b9f8dea177c41e5f35d10125e9d1cd8" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.326429 4702 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"17f116fa740fa5cad6997c67450e17e33b9f8dea177c41e5f35d10125e9d1cd8"} err="failed to get container status \"17f116fa740fa5cad6997c67450e17e33b9f8dea177c41e5f35d10125e9d1cd8\": rpc error: code = NotFound desc = could not find container \"17f116fa740fa5cad6997c67450e17e33b9f8dea177c41e5f35d10125e9d1cd8\": container with ID starting with 17f116fa740fa5cad6997c67450e17e33b9f8dea177c41e5f35d10125e9d1cd8 not found: ID does not exist" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.326459 4702 scope.go:117] "RemoveContainer" containerID="7f19f9da0d7ea7e00edd4db415190a1363148cb0e070cd7d844f10e4ba802603" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.327858 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f19f9da0d7ea7e00edd4db415190a1363148cb0e070cd7d844f10e4ba802603"} err="failed to get container status \"7f19f9da0d7ea7e00edd4db415190a1363148cb0e070cd7d844f10e4ba802603\": rpc error: code = NotFound desc = could not find container \"7f19f9da0d7ea7e00edd4db415190a1363148cb0e070cd7d844f10e4ba802603\": container with ID starting with 7f19f9da0d7ea7e00edd4db415190a1363148cb0e070cd7d844f10e4ba802603 not found: ID does not exist" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.327890 4702 scope.go:117] "RemoveContainer" containerID="17f116fa740fa5cad6997c67450e17e33b9f8dea177c41e5f35d10125e9d1cd8" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.328602 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17f116fa740fa5cad6997c67450e17e33b9f8dea177c41e5f35d10125e9d1cd8"} err="failed to get container status \"17f116fa740fa5cad6997c67450e17e33b9f8dea177c41e5f35d10125e9d1cd8\": rpc error: code = NotFound desc = could not find container \"17f116fa740fa5cad6997c67450e17e33b9f8dea177c41e5f35d10125e9d1cd8\": container with ID starting with 17f116fa740fa5cad6997c67450e17e33b9f8dea177c41e5f35d10125e9d1cd8 not found: ID does not exist" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.348915 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.364055 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 03 11:30:42 crc kubenswrapper[4702]: E1203 11:30:42.364851 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="911b822c-88c9-476e-a204-1d43d017ecc3" containerName="cinder-api" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.364876 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="911b822c-88c9-476e-a204-1d43d017ecc3" containerName="cinder-api" Dec 03 11:30:42 crc kubenswrapper[4702]: E1203 11:30:42.364995 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="911b822c-88c9-476e-a204-1d43d017ecc3" containerName="cinder-api-log" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.365014 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="911b822c-88c9-476e-a204-1d43d017ecc3" containerName="cinder-api-log" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.365291 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="911b822c-88c9-476e-a204-1d43d017ecc3" containerName="cinder-api-log" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.365317 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="911b822c-88c9-476e-a204-1d43d017ecc3" containerName="cinder-api" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.366976 4702 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.369502 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.370812 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.370937 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.372152 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.449138 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.449259 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-config-data-custom\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.449309 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2hhh\" (UniqueName: \"kubernetes.io/projected/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-kube-api-access-j2hhh\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.449337 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-logs\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.449406 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-config-data\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.449644 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.449699 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.449852 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.449929 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-scripts\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.552239 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2hhh\" (UniqueName: \"kubernetes.io/projected/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-kube-api-access-j2hhh\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.552290 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-logs\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.552353 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-config-data\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.552403 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.552420 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.552456 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.552491 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-scripts\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.552620 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.552675 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-config-data-custom\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.553219 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.553670 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-logs\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.556316 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.556646 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.556814 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-scripts\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.557892 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-config-data\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.558103 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-config-data-custom\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.568936 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2hhh\" (UniqueName: \"kubernetes.io/projected/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-kube-api-access-j2hhh\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.580142 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e07dc54-499c-470d-9e1b-4775b3ec0ba6-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8e07dc54-499c-470d-9e1b-4775b3ec0ba6\") " pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.746147 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 11:30:42 crc kubenswrapper[4702]: I1203 11:30:42.949644 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="911b822c-88c9-476e-a204-1d43d017ecc3" path="/var/lib/kubelet/pods/911b822c-88c9-476e-a204-1d43d017ecc3/volumes" Dec 03 11:30:43 crc kubenswrapper[4702]: I1203 11:30:43.219588 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 11:30:43 crc kubenswrapper[4702]: W1203 11:30:43.225371 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e07dc54_499c_470d_9e1b_4775b3ec0ba6.slice/crio-c4c97bb855f508637bf9f7872af379024137654f25917021af17f5c7ca7c3f0e WatchSource:0}: Error finding container c4c97bb855f508637bf9f7872af379024137654f25917021af17f5c7ca7c3f0e: Status 404 returned error can't find the container with id c4c97bb855f508637bf9f7872af379024137654f25917021af17f5c7ca7c3f0e Dec 03 11:30:43 crc kubenswrapper[4702]: I1203 11:30:43.281505 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8e07dc54-499c-470d-9e1b-4775b3ec0ba6","Type":"ContainerStarted","Data":"c4c97bb855f508637bf9f7872af379024137654f25917021af17f5c7ca7c3f0e"} Dec 03 11:30:44 crc kubenswrapper[4702]: E1203 11:30:44.267251 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b53ad8d_06e2_4511_b33a_2ff6c6209861.slice/crio-conmon-c11aaf3fec02583298058b1bab0d56c335f1b17f97e32174a1c075be4b5f5565.scope\": RecentStats: unable to find data in memory cache]" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.308112 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.308953 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8e07dc54-499c-470d-9e1b-4775b3ec0ba6","Type":"ContainerStarted","Data":"4d9a7f033ccc496f0433381e86ce03c612c4561b20c32f7f4e1db904ee7ad418"} Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.313455 4702 generic.go:334] "Generic (PLEG): container finished" podID="4b53ad8d-06e2-4511-b33a-2ff6c6209861" containerID="c11aaf3fec02583298058b1bab0d56c335f1b17f97e32174a1c075be4b5f5565" exitCode=0 Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.313499 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b53ad8d-06e2-4511-b33a-2ff6c6209861","Type":"ContainerDied","Data":"c11aaf3fec02583298058b1bab0d56c335f1b17f97e32174a1c075be4b5f5565"} Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.313535 4702 scope.go:117] "RemoveContainer" containerID="be26f30c9ddcbbe0e06cfc7ef5cb52ed3999f1d5da0091c11c86f16ac0a664cf" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.313959 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.352561 4702 scope.go:117] "RemoveContainer" containerID="1d9705dc7dd730c89e59d77d48a0ddb1fa2e1afb386ee1a13032d2f8e31394c4" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.402924 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b53ad8d-06e2-4511-b33a-2ff6c6209861-run-httpd\") pod \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.403029 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b53ad8d-06e2-4511-b33a-2ff6c6209861-log-httpd\") pod \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.403110 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-sg-core-conf-yaml\") pod \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.403191 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msk6c\" (UniqueName: \"kubernetes.io/projected/4b53ad8d-06e2-4511-b33a-2ff6c6209861-kube-api-access-msk6c\") pod \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.403355 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-config-data\") pod \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.403404 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-combined-ca-bundle\") pod \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.403516 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b53ad8d-06e2-4511-b33a-2ff6c6209861-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4b53ad8d-06e2-4511-b33a-2ff6c6209861" (UID: "4b53ad8d-06e2-4511-b33a-2ff6c6209861"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.403608 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-scripts\") pod \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\" (UID: \"4b53ad8d-06e2-4511-b33a-2ff6c6209861\") " Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.404261 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b53ad8d-06e2-4511-b33a-2ff6c6209861-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4b53ad8d-06e2-4511-b33a-2ff6c6209861" (UID: "4b53ad8d-06e2-4511-b33a-2ff6c6209861"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.405315 4702 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b53ad8d-06e2-4511-b33a-2ff6c6209861-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.405346 4702 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b53ad8d-06e2-4511-b33a-2ff6c6209861-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.421452 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b53ad8d-06e2-4511-b33a-2ff6c6209861-kube-api-access-msk6c" (OuterVolumeSpecName: "kube-api-access-msk6c") pod "4b53ad8d-06e2-4511-b33a-2ff6c6209861" (UID: "4b53ad8d-06e2-4511-b33a-2ff6c6209861"). InnerVolumeSpecName "kube-api-access-msk6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.422071 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-scripts" (OuterVolumeSpecName: "scripts") pod "4b53ad8d-06e2-4511-b33a-2ff6c6209861" (UID: "4b53ad8d-06e2-4511-b33a-2ff6c6209861"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.441326 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4b53ad8d-06e2-4511-b33a-2ff6c6209861" (UID: "4b53ad8d-06e2-4511-b33a-2ff6c6209861"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.477979 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b53ad8d-06e2-4511-b33a-2ff6c6209861" (UID: "4b53ad8d-06e2-4511-b33a-2ff6c6209861"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.507717 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.507785 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.507800 4702 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.507813 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msk6c\" (UniqueName: \"kubernetes.io/projected/4b53ad8d-06e2-4511-b33a-2ff6c6209861-kube-api-access-msk6c\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.511154 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-config-data" (OuterVolumeSpecName: "config-data") pod "4b53ad8d-06e2-4511-b33a-2ff6c6209861" (UID: "4b53ad8d-06e2-4511-b33a-2ff6c6209861"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.610622 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b53ad8d-06e2-4511-b33a-2ff6c6209861-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.653024 4702 scope.go:117] "RemoveContainer" containerID="c11aaf3fec02583298058b1bab0d56c335f1b17f97e32174a1c075be4b5f5565" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.726966 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.742140 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.759324 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:30:44 crc kubenswrapper[4702]: E1203 11:30:44.759997 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b53ad8d-06e2-4511-b33a-2ff6c6209861" containerName="proxy-httpd" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.760020 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b53ad8d-06e2-4511-b33a-2ff6c6209861" containerName="proxy-httpd" Dec 03 11:30:44 crc kubenswrapper[4702]: E1203 11:30:44.760043 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b53ad8d-06e2-4511-b33a-2ff6c6209861" containerName="ceilometer-notification-agent" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.760050 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b53ad8d-06e2-4511-b33a-2ff6c6209861" containerName="ceilometer-notification-agent" Dec 03 11:30:44 crc kubenswrapper[4702]: E1203 11:30:44.760071 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b53ad8d-06e2-4511-b33a-2ff6c6209861" containerName="sg-core" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.760078 4702 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="4b53ad8d-06e2-4511-b33a-2ff6c6209861" containerName="sg-core" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.760306 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b53ad8d-06e2-4511-b33a-2ff6c6209861" containerName="sg-core" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.760325 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b53ad8d-06e2-4511-b33a-2ff6c6209861" containerName="ceilometer-notification-agent" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.760364 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b53ad8d-06e2-4511-b33a-2ff6c6209861" containerName="proxy-httpd" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.763067 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.766496 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.766519 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.776868 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.919795 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd27c847-d92c-4adb-bd1a-d445ae5bd182-run-httpd\") pod \"ceilometer-0\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " pod="openstack/ceilometer-0" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.919919 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " pod="openstack/ceilometer-0" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.920173 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-scripts\") pod \"ceilometer-0\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " pod="openstack/ceilometer-0" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.920444 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6kf7\" (UniqueName: \"kubernetes.io/projected/dd27c847-d92c-4adb-bd1a-d445ae5bd182-kube-api-access-w6kf7\") pod \"ceilometer-0\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " pod="openstack/ceilometer-0" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.920539 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd27c847-d92c-4adb-bd1a-d445ae5bd182-log-httpd\") pod \"ceilometer-0\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " pod="openstack/ceilometer-0" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.920656 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " 
pod="openstack/ceilometer-0" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.920800 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-config-data\") pod \"ceilometer-0\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " pod="openstack/ceilometer-0" Dec 03 11:30:44 crc kubenswrapper[4702]: I1203 11:30:44.955284 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b53ad8d-06e2-4511-b33a-2ff6c6209861" path="/var/lib/kubelet/pods/4b53ad8d-06e2-4511-b33a-2ff6c6209861/volumes" Dec 03 11:30:45 crc kubenswrapper[4702]: I1203 11:30:45.269359 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " pod="openstack/ceilometer-0" Dec 03 11:30:45 crc kubenswrapper[4702]: I1203 11:30:45.269453 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-config-data\") pod \"ceilometer-0\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " pod="openstack/ceilometer-0" Dec 03 11:30:45 crc kubenswrapper[4702]: I1203 11:30:45.269512 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd27c847-d92c-4adb-bd1a-d445ae5bd182-run-httpd\") pod \"ceilometer-0\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " pod="openstack/ceilometer-0" Dec 03 11:30:45 crc kubenswrapper[4702]: I1203 11:30:45.269620 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " pod="openstack/ceilometer-0" Dec 03 11:30:45 crc kubenswrapper[4702]: I1203 11:30:45.269678 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-scripts\") pod \"ceilometer-0\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " pod="openstack/ceilometer-0" Dec 03 11:30:45 crc kubenswrapper[4702]: I1203 11:30:45.269743 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6kf7\" (UniqueName: \"kubernetes.io/projected/dd27c847-d92c-4adb-bd1a-d445ae5bd182-kube-api-access-w6kf7\") pod \"ceilometer-0\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " pod="openstack/ceilometer-0" Dec 03 11:30:45 crc kubenswrapper[4702]: I1203 11:30:45.273898 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd27c847-d92c-4adb-bd1a-d445ae5bd182-run-httpd\") pod \"ceilometer-0\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " pod="openstack/ceilometer-0" Dec 03 11:30:45 crc kubenswrapper[4702]: I1203 11:30:45.273938 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd27c847-d92c-4adb-bd1a-d445ae5bd182-log-httpd\") pod \"ceilometer-0\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " pod="openstack/ceilometer-0" Dec 03 11:30:45 crc kubenswrapper[4702]: I1203 11:30:45.274725 4702 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd27c847-d92c-4adb-bd1a-d445ae5bd182-log-httpd\") pod \"ceilometer-0\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " pod="openstack/ceilometer-0" Dec 03 11:30:45 crc kubenswrapper[4702]: I1203 11:30:45.293711 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " pod="openstack/ceilometer-0" Dec 03 11:30:45 crc kubenswrapper[4702]: I1203 11:30:45.295126 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-config-data\") pod \"ceilometer-0\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " pod="openstack/ceilometer-0" Dec 03 11:30:45 crc kubenswrapper[4702]: I1203 11:30:45.299596 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-scripts\") pod \"ceilometer-0\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " pod="openstack/ceilometer-0" Dec 03 11:30:45 crc kubenswrapper[4702]: I1203 11:30:45.308073 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " pod="openstack/ceilometer-0" Dec 03 11:30:45 crc kubenswrapper[4702]: I1203 11:30:45.335380 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6kf7\" (UniqueName: \"kubernetes.io/projected/dd27c847-d92c-4adb-bd1a-d445ae5bd182-kube-api-access-w6kf7\") pod \"ceilometer-0\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " pod="openstack/ceilometer-0" Dec 03 11:30:45 crc kubenswrapper[4702]: I1203 11:30:45.349654 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8e07dc54-499c-470d-9e1b-4775b3ec0ba6","Type":"ContainerStarted","Data":"9259bc3e57bb260ac1eb6ce9815a9e27fb9b95d3b9411da6f99715d42e4bf6f9"} Dec 03 11:30:45 crc kubenswrapper[4702]: I1203 11:30:45.350157 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 03 11:30:45 crc kubenswrapper[4702]: I1203 11:30:45.385363 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.385334592 podStartE2EDuration="3.385334592s" podCreationTimestamp="2025-12-03 11:30:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:30:45.371724955 +0000 UTC m=+1629.207653429" watchObservedRunningTime="2025-12-03 11:30:45.385334592 +0000 UTC m=+1629.221263056" Dec 03 11:30:45 crc kubenswrapper[4702]: I1203 11:30:45.406289 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:30:45 crc kubenswrapper[4702]: I1203 11:30:45.945626 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:30:46 crc kubenswrapper[4702]: I1203 11:30:46.328614 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-694848776d-v64gd" Dec 03 11:30:46 crc kubenswrapper[4702]: I1203 11:30:46.366424 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd27c847-d92c-4adb-bd1a-d445ae5bd182","Type":"ContainerStarted","Data":"4fee63618193cf31971e2d68b64d5cb9e10ac18d57e977cef600e48b797a4495"} Dec 03 11:30:47 crc kubenswrapper[4702]: I1203 11:30:47.137965 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:30:47 crc kubenswrapper[4702]: I1203 11:30:47.212246 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-9kvwl"] Dec 03 11:30:47 crc kubenswrapper[4702]: I1203 11:30:47.212554 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" podUID="3377c36f-51de-4fca-9fe9-a0763ab93e36" containerName="dnsmasq-dns" containerID="cri-o://0c22faf159ca7a88632dcfcd06a173cc46e5212117dfb17fee0bad7e0259968a" gracePeriod=10 Dec 03 11:30:47 crc kubenswrapper[4702]: I1203 11:30:47.614456 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 11:30:47 crc kubenswrapper[4702]: I1203 11:30:47.722514 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd27c847-d92c-4adb-bd1a-d445ae5bd182","Type":"ContainerStarted","Data":"1136264585a2d16f4270d48cb3a9e7d82fd90f47ebde7d44b2ff6929d1ef7724"} Dec 03 11:30:47 crc kubenswrapper[4702]: I1203 11:30:47.724562 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.366831 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.536019 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-dns-swift-storage-0\") pod \"3377c36f-51de-4fca-9fe9-a0763ab93e36\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.536212 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-ovsdbserver-sb\") pod \"3377c36f-51de-4fca-9fe9-a0763ab93e36\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.536290 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-config\") pod \"3377c36f-51de-4fca-9fe9-a0763ab93e36\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.536476 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-dns-svc\") pod \"3377c36f-51de-4fca-9fe9-a0763ab93e36\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.536566 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-ovsdbserver-nb\") pod \"3377c36f-51de-4fca-9fe9-a0763ab93e36\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.537115 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8db7\" (UniqueName: \"kubernetes.io/projected/3377c36f-51de-4fca-9fe9-a0763ab93e36-kube-api-access-v8db7\") pod \"3377c36f-51de-4fca-9fe9-a0763ab93e36\" (UID: \"3377c36f-51de-4fca-9fe9-a0763ab93e36\") " Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.736132 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3377c36f-51de-4fca-9fe9-a0763ab93e36-kube-api-access-v8db7" (OuterVolumeSpecName: "kube-api-access-v8db7") pod "3377c36f-51de-4fca-9fe9-a0763ab93e36" (UID: "3377c36f-51de-4fca-9fe9-a0763ab93e36"). InnerVolumeSpecName "kube-api-access-v8db7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.756629 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8db7\" (UniqueName: \"kubernetes.io/projected/3377c36f-51de-4fca-9fe9-a0763ab93e36-kube-api-access-v8db7\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.855831 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 03 11:30:48 crc kubenswrapper[4702]: E1203 11:30:48.856387 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3377c36f-51de-4fca-9fe9-a0763ab93e36" containerName="dnsmasq-dns" Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.856408 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="3377c36f-51de-4fca-9fe9-a0763ab93e36" containerName="dnsmasq-dns" Dec 03 11:30:48 crc kubenswrapper[4702]: E1203 11:30:48.856438 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3377c36f-51de-4fca-9fe9-a0763ab93e36" containerName="init" Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.856447 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="3377c36f-51de-4fca-9fe9-a0763ab93e36" containerName="init" Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.856725 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="3377c36f-51de-4fca-9fe9-a0763ab93e36" containerName="dnsmasq-dns" Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.857620 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.863661 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-7hfs8" Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.863957 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.864136 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.881038 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3377c36f-51de-4fca-9fe9-a0763ab93e36" (UID: "3377c36f-51de-4fca-9fe9-a0763ab93e36"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.898474 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.940099 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3377c36f-51de-4fca-9fe9-a0763ab93e36" (UID: "3377c36f-51de-4fca-9fe9-a0763ab93e36"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.959194 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3377c36f-51de-4fca-9fe9-a0763ab93e36" (UID: "3377c36f-51de-4fca-9fe9-a0763ab93e36"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.967239 4702 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.967272 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.967286 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.971499 4702 generic.go:334] "Generic (PLEG): container finished" podID="3377c36f-51de-4fca-9fe9-a0763ab93e36" containerID="0c22faf159ca7a88632dcfcd06a173cc46e5212117dfb17fee0bad7e0259968a" exitCode=0 Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.971880 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="479db84e-8b1b-4bcb-8711-a5a51ca22eb1" containerName="cinder-scheduler" containerID="cri-o://fb5ae7c56fd334233f4fc58e0ec3338cfba7244ca7d32a8687111414888d7ea6" gracePeriod=30 Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.972271 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.972405 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-config" (OuterVolumeSpecName: "config") pod "3377c36f-51de-4fca-9fe9-a0763ab93e36" (UID: "3377c36f-51de-4fca-9fe9-a0763ab93e36"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.972895 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="479db84e-8b1b-4bcb-8711-a5a51ca22eb1" containerName="probe" containerID="cri-o://42955700812f9623178ccbc92caf41631a43361aa5a01a4991220b92ab294eca" gracePeriod=30 Dec 03 11:30:48 crc kubenswrapper[4702]: I1203 11:30:48.997338 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3377c36f-51de-4fca-9fe9-a0763ab93e36" (UID: "3377c36f-51de-4fca-9fe9-a0763ab93e36"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.069190 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1b30e6f6-6da6-48ea-8e02-873d566d7719-openstack-config-secret\") pod \"openstackclient\" (UID: \"1b30e6f6-6da6-48ea-8e02-873d566d7719\") " pod="openstack/openstackclient" Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.069295 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1b30e6f6-6da6-48ea-8e02-873d566d7719-openstack-config\") pod \"openstackclient\" (UID: \"1b30e6f6-6da6-48ea-8e02-873d566d7719\") " pod="openstack/openstackclient" Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.069352 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b30e6f6-6da6-48ea-8e02-873d566d7719-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1b30e6f6-6da6-48ea-8e02-873d566d7719\") " pod="openstack/openstackclient" Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.069447 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ppjz\" (UniqueName: \"kubernetes.io/projected/1b30e6f6-6da6-48ea-8e02-873d566d7719-kube-api-access-7ppjz\") pod \"openstackclient\" (UID: \"1b30e6f6-6da6-48ea-8e02-873d566d7719\") " pod="openstack/openstackclient" Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.074949 4702 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.075008 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3377c36f-51de-4fca-9fe9-a0763ab93e36-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.100437 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" event={"ID":"3377c36f-51de-4fca-9fe9-a0763ab93e36","Type":"ContainerDied","Data":"0c22faf159ca7a88632dcfcd06a173cc46e5212117dfb17fee0bad7e0259968a"} Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.100509 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-9kvwl" event={"ID":"3377c36f-51de-4fca-9fe9-a0763ab93e36","Type":"ContainerDied","Data":"f1a33d317ecf161c450206573e2c42216d321109c4a4ab58a6948ad5d00808d8"} Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.100544 4702 scope.go:117] "RemoveContainer" containerID="0c22faf159ca7a88632dcfcd06a173cc46e5212117dfb17fee0bad7e0259968a" Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.137509 4702 scope.go:117] "RemoveContainer" containerID="865162d8ade09fdf3815c744ee0430dc924ef59c711fee789ce094c3cd6d4bf3" Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.170190 4702 scope.go:117] "RemoveContainer" containerID="0c22faf159ca7a88632dcfcd06a173cc46e5212117dfb17fee0bad7e0259968a" Dec 03 11:30:49 crc kubenswrapper[4702]: E1203 11:30:49.170904 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0c22faf159ca7a88632dcfcd06a173cc46e5212117dfb17fee0bad7e0259968a\": container with ID starting with 0c22faf159ca7a88632dcfcd06a173cc46e5212117dfb17fee0bad7e0259968a not found: ID does not exist" containerID="0c22faf159ca7a88632dcfcd06a173cc46e5212117dfb17fee0bad7e0259968a" Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.170970 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c22faf159ca7a88632dcfcd06a173cc46e5212117dfb17fee0bad7e0259968a"} err="failed to get container status \"0c22faf159ca7a88632dcfcd06a173cc46e5212117dfb17fee0bad7e0259968a\": rpc error: code = NotFound desc = could not find container \"0c22faf159ca7a88632dcfcd06a173cc46e5212117dfb17fee0bad7e0259968a\": container with ID starting with 0c22faf159ca7a88632dcfcd06a173cc46e5212117dfb17fee0bad7e0259968a not found: ID does not exist" Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.171012 4702 scope.go:117] "RemoveContainer" containerID="865162d8ade09fdf3815c744ee0430dc924ef59c711fee789ce094c3cd6d4bf3" Dec 03 11:30:49 crc kubenswrapper[4702]: E1203 11:30:49.171623 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"865162d8ade09fdf3815c744ee0430dc924ef59c711fee789ce094c3cd6d4bf3\": container with ID starting with 865162d8ade09fdf3815c744ee0430dc924ef59c711fee789ce094c3cd6d4bf3 not found: ID does not exist" containerID="865162d8ade09fdf3815c744ee0430dc924ef59c711fee789ce094c3cd6d4bf3" Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.171704 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"865162d8ade09fdf3815c744ee0430dc924ef59c711fee789ce094c3cd6d4bf3"} err="failed to get container status \"865162d8ade09fdf3815c744ee0430dc924ef59c711fee789ce094c3cd6d4bf3\": rpc error: code = NotFound desc = could not find container \"865162d8ade09fdf3815c744ee0430dc924ef59c711fee789ce094c3cd6d4bf3\": container with ID starting with 865162d8ade09fdf3815c744ee0430dc924ef59c711fee789ce094c3cd6d4bf3 not found: ID does not exist" Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.177570 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1b30e6f6-6da6-48ea-8e02-873d566d7719-openstack-config-secret\") pod \"openstackclient\" (UID: \"1b30e6f6-6da6-48ea-8e02-873d566d7719\") " pod="openstack/openstackclient" Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.177662 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1b30e6f6-6da6-48ea-8e02-873d566d7719-openstack-config\") pod \"openstackclient\" (UID: \"1b30e6f6-6da6-48ea-8e02-873d566d7719\") " pod="openstack/openstackclient" Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.177715 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b30e6f6-6da6-48ea-8e02-873d566d7719-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1b30e6f6-6da6-48ea-8e02-873d566d7719\") " pod="openstack/openstackclient" Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.177814 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ppjz\" (UniqueName: \"kubernetes.io/projected/1b30e6f6-6da6-48ea-8e02-873d566d7719-kube-api-access-7ppjz\") pod \"openstackclient\" (UID: 
\"1b30e6f6-6da6-48ea-8e02-873d566d7719\") " pod="openstack/openstackclient" Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.178739 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1b30e6f6-6da6-48ea-8e02-873d566d7719-openstack-config\") pod \"openstackclient\" (UID: \"1b30e6f6-6da6-48ea-8e02-873d566d7719\") " pod="openstack/openstackclient" Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.183470 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1b30e6f6-6da6-48ea-8e02-873d566d7719-openstack-config-secret\") pod \"openstackclient\" (UID: \"1b30e6f6-6da6-48ea-8e02-873d566d7719\") " pod="openstack/openstackclient" Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.183656 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b30e6f6-6da6-48ea-8e02-873d566d7719-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1b30e6f6-6da6-48ea-8e02-873d566d7719\") " pod="openstack/openstackclient" Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.197454 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ppjz\" (UniqueName: \"kubernetes.io/projected/1b30e6f6-6da6-48ea-8e02-873d566d7719-kube-api-access-7ppjz\") pod \"openstackclient\" (UID: \"1b30e6f6-6da6-48ea-8e02-873d566d7719\") " pod="openstack/openstackclient" Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.333889 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-9kvwl"] Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.607507 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 11:30:49 crc kubenswrapper[4702]: I1203 11:30:49.611558 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-9kvwl"] Dec 03 11:30:50 crc kubenswrapper[4702]: I1203 11:30:49.998532 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd27c847-d92c-4adb-bd1a-d445ae5bd182","Type":"ContainerStarted","Data":"2efed19eb2629e378bd74dcc562721354e3d42023cae3aba86d187d5ea0b56dd"} Dec 03 11:30:50 crc kubenswrapper[4702]: I1203 11:30:49.998867 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd27c847-d92c-4adb-bd1a-d445ae5bd182","Type":"ContainerStarted","Data":"c0c822db59095d950975d679fd1651fd6b7ad558b8f7ee5fe87a39760e0f965f"} Dec 03 11:30:50 crc kubenswrapper[4702]: I1203 11:30:50.223437 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 11:30:50 crc kubenswrapper[4702]: W1203 11:30:50.226077 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b30e6f6_6da6_48ea_8e02_873d566d7719.slice/crio-17349b11a675935190eb05c9c29a122ac403e93732450315730d42c5abbf9d61 WatchSource:0}: Error finding container 17349b11a675935190eb05c9c29a122ac403e93732450315730d42c5abbf9d61: Status 404 returned error can't find the container with id 17349b11a675935190eb05c9c29a122ac403e93732450315730d42c5abbf9d61 Dec 03 11:30:50 crc kubenswrapper[4702]: I1203 11:30:50.942203 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3377c36f-51de-4fca-9fe9-a0763ab93e36" path="/var/lib/kubelet/pods/3377c36f-51de-4fca-9fe9-a0763ab93e36/volumes" Dec 03 11:30:51 crc kubenswrapper[4702]: I1203 11:30:51.019975 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1b30e6f6-6da6-48ea-8e02-873d566d7719","Type":"ContainerStarted","Data":"17349b11a675935190eb05c9c29a122ac403e93732450315730d42c5abbf9d61"} Dec 03 11:30:51 crc kubenswrapper[4702]: I1203 11:30:51.023546 4702 generic.go:334] "Generic (PLEG): container finished" podID="479db84e-8b1b-4bcb-8711-a5a51ca22eb1" containerID="42955700812f9623178ccbc92caf41631a43361aa5a01a4991220b92ab294eca" exitCode=0 Dec 03 11:30:51 crc kubenswrapper[4702]: I1203 11:30:51.023585 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"479db84e-8b1b-4bcb-8711-a5a51ca22eb1","Type":"ContainerDied","Data":"42955700812f9623178ccbc92caf41631a43361aa5a01a4991220b92ab294eca"} Dec 03 11:30:51 crc kubenswrapper[4702]: I1203 11:30:51.988353 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-7bdf6cdbff-lzm8t"] Dec 03 11:30:51 crc kubenswrapper[4702]: I1203 11:30:51.991082 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-7bdf6cdbff-lzm8t" Dec 03 11:30:51 crc kubenswrapper[4702]: I1203 11:30:51.994944 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-9s85m" Dec 03 11:30:51 crc kubenswrapper[4702]: I1203 11:30:51.995166 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 03 11:30:51 crc kubenswrapper[4702]: I1203 11:30:51.995287 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.005852 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7bdf6cdbff-lzm8t"] Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.073059 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd27c847-d92c-4adb-bd1a-d445ae5bd182","Type":"ContainerStarted","Data":"35896744cea89052cb0ba05b9992bc658a32384bd29f41f38b01194423c93106"} Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.073150 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.095807 4702 generic.go:334] "Generic (PLEG): container finished" podID="479db84e-8b1b-4bcb-8711-a5a51ca22eb1" containerID="fb5ae7c56fd334233f4fc58e0ec3338cfba7244ca7d32a8687111414888d7ea6" exitCode=0 Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.095889 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"479db84e-8b1b-4bcb-8711-a5a51ca22eb1","Type":"ContainerDied","Data":"fb5ae7c56fd334233f4fc58e0ec3338cfba7244ca7d32a8687111414888d7ea6"} Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.101256 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw4tb\" (UniqueName: \"kubernetes.io/projected/a2cf5990-eb05-4167-9d52-83186278f986-kube-api-access-cw4tb\") pod \"heat-engine-7bdf6cdbff-lzm8t\" (UID: \"a2cf5990-eb05-4167-9d52-83186278f986\") " pod="openstack/heat-engine-7bdf6cdbff-lzm8t" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.101399 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2cf5990-eb05-4167-9d52-83186278f986-config-data-custom\") pod \"heat-engine-7bdf6cdbff-lzm8t\" (UID: \"a2cf5990-eb05-4167-9d52-83186278f986\") " pod="openstack/heat-engine-7bdf6cdbff-lzm8t" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.101437 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2cf5990-eb05-4167-9d52-83186278f986-combined-ca-bundle\") pod \"heat-engine-7bdf6cdbff-lzm8t\" (UID: \"a2cf5990-eb05-4167-9d52-83186278f986\") " pod="openstack/heat-engine-7bdf6cdbff-lzm8t" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.101491 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2cf5990-eb05-4167-9d52-83186278f986-config-data\") pod \"heat-engine-7bdf6cdbff-lzm8t\" (UID: \"a2cf5990-eb05-4167-9d52-83186278f986\") " pod="openstack/heat-engine-7bdf6cdbff-lzm8t" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.198156 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.8263576759999998 podStartE2EDuration="8.198134089s" podCreationTimestamp="2025-12-03 11:30:44 +0000 UTC" firstStartedPulling="2025-12-03 11:30:46.045406333 +0000 UTC m=+1629.881334797" lastFinishedPulling="2025-12-03 11:30:51.417182746 +0000 UTC m=+1635.253111210" observedRunningTime="2025-12-03 11:30:52.1054603 +0000 UTC m=+1635.941388764" watchObservedRunningTime="2025-12-03 11:30:52.198134089 +0000 UTC m=+1636.034062553" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.204046 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-76f9dbfbb9-sz9d7"] Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.206067 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.209260 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw4tb\" (UniqueName: \"kubernetes.io/projected/a2cf5990-eb05-4167-9d52-83186278f986-kube-api-access-cw4tb\") pod \"heat-engine-7bdf6cdbff-lzm8t\" (UID: \"a2cf5990-eb05-4167-9d52-83186278f986\") " pod="openstack/heat-engine-7bdf6cdbff-lzm8t" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.209553 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2cf5990-eb05-4167-9d52-83186278f986-config-data-custom\") pod \"heat-engine-7bdf6cdbff-lzm8t\" (UID: \"a2cf5990-eb05-4167-9d52-83186278f986\") " pod="openstack/heat-engine-7bdf6cdbff-lzm8t" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.209587 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2cf5990-eb05-4167-9d52-83186278f986-combined-ca-bundle\") pod \"heat-engine-7bdf6cdbff-lzm8t\" (UID: \"a2cf5990-eb05-4167-9d52-83186278f986\") " pod="openstack/heat-engine-7bdf6cdbff-lzm8t" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.209644 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2cf5990-eb05-4167-9d52-83186278f986-config-data\") pod \"heat-engine-7bdf6cdbff-lzm8t\" (UID: \"a2cf5990-eb05-4167-9d52-83186278f986\") " pod="openstack/heat-engine-7bdf6cdbff-lzm8t" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.214637 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-d6rbj"] Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.214788 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.217122 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.223000 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2cf5990-eb05-4167-9d52-83186278f986-config-data\") pod \"heat-engine-7bdf6cdbff-lzm8t\" (UID: \"a2cf5990-eb05-4167-9d52-83186278f986\") " pod="openstack/heat-engine-7bdf6cdbff-lzm8t" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.226935 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2cf5990-eb05-4167-9d52-83186278f986-combined-ca-bundle\") pod \"heat-engine-7bdf6cdbff-lzm8t\" (UID: \"a2cf5990-eb05-4167-9d52-83186278f986\") " pod="openstack/heat-engine-7bdf6cdbff-lzm8t" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.231173 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-76f9dbfbb9-sz9d7"] Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.234439 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2cf5990-eb05-4167-9d52-83186278f986-config-data-custom\") pod \"heat-engine-7bdf6cdbff-lzm8t\" (UID: \"a2cf5990-eb05-4167-9d52-83186278f986\") " pod="openstack/heat-engine-7bdf6cdbff-lzm8t" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.255457 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-d6rbj"] Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.272729 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw4tb\" (UniqueName: \"kubernetes.io/projected/a2cf5990-eb05-4167-9d52-83186278f986-kube-api-access-cw4tb\") pod \"heat-engine-7bdf6cdbff-lzm8t\" (UID: \"a2cf5990-eb05-4167-9d52-83186278f986\") " pod="openstack/heat-engine-7bdf6cdbff-lzm8t" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.310532 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-67bbbdff4c-p4d6b"] Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.312416 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-67bbbdff4c-p4d6b" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.315198 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-config\") pod \"dnsmasq-dns-7756b9d78c-d6rbj\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.315296 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-d6rbj\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.315318 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-config-data-custom\") pod \"heat-cfnapi-76f9dbfbb9-sz9d7\" (UID: \"d2dd9093-8cd4-4627-9019-8f1ec48a3b88\") " pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.315351 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-d6rbj\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.315396 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-combined-ca-bundle\") pod \"heat-cfnapi-76f9dbfbb9-sz9d7\" (UID: \"d2dd9093-8cd4-4627-9019-8f1ec48a3b88\") " pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.315424 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-d6rbj\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.315493 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-config-data\") pod \"heat-cfnapi-76f9dbfbb9-sz9d7\" (UID: \"d2dd9093-8cd4-4627-9019-8f1ec48a3b88\") " pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.315545 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-d6rbj\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.315588 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrqz4\" (UniqueName: 
\"kubernetes.io/projected/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-kube-api-access-wrqz4\") pod \"heat-cfnapi-76f9dbfbb9-sz9d7\" (UID: \"d2dd9093-8cd4-4627-9019-8f1ec48a3b88\") " pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.315606 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72z4m\" (UniqueName: \"kubernetes.io/projected/ab0f582e-799a-44ad-8529-6d0fe71490c2-kube-api-access-72z4m\") pod \"dnsmasq-dns-7756b9d78c-d6rbj\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.328420 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7bdf6cdbff-lzm8t" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.332790 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.342156 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-67bbbdff4c-p4d6b"] Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.483728 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0babb7be-7f54-4458-b44a-707bfd530e40-config-data-custom\") pod \"heat-api-67bbbdff4c-p4d6b\" (UID: \"0babb7be-7f54-4458-b44a-707bfd530e40\") " pod="openstack/heat-api-67bbbdff4c-p4d6b" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.483875 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-config\") pod \"dnsmasq-dns-7756b9d78c-d6rbj\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.483991 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-d6rbj\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.484022 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-config-data-custom\") pod \"heat-cfnapi-76f9dbfbb9-sz9d7\" (UID: \"d2dd9093-8cd4-4627-9019-8f1ec48a3b88\") " pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.485524 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-config\") pod \"dnsmasq-dns-7756b9d78c-d6rbj\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.486110 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-d6rbj\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.486175 4702 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-d6rbj\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.486256 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-d6rbj\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.486462 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-combined-ca-bundle\") pod \"heat-cfnapi-76f9dbfbb9-sz9d7\" (UID: \"d2dd9093-8cd4-4627-9019-8f1ec48a3b88\") " pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.486538 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-d6rbj\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.486600 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0babb7be-7f54-4458-b44a-707bfd530e40-combined-ca-bundle\") pod \"heat-api-67bbbdff4c-p4d6b\" (UID: \"0babb7be-7f54-4458-b44a-707bfd530e40\") " pod="openstack/heat-api-67bbbdff4c-p4d6b" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.486720 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwfcw\" (UniqueName: \"kubernetes.io/projected/0babb7be-7f54-4458-b44a-707bfd530e40-kube-api-access-fwfcw\") pod \"heat-api-67bbbdff4c-p4d6b\" (UID: \"0babb7be-7f54-4458-b44a-707bfd530e40\") " pod="openstack/heat-api-67bbbdff4c-p4d6b" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.486930 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-config-data\") pod \"heat-cfnapi-76f9dbfbb9-sz9d7\" (UID: \"d2dd9093-8cd4-4627-9019-8f1ec48a3b88\") " pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.487000 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-d6rbj\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.487133 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrqz4\" (UniqueName: \"kubernetes.io/projected/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-kube-api-access-wrqz4\") pod \"heat-cfnapi-76f9dbfbb9-sz9d7\" (UID: \"d2dd9093-8cd4-4627-9019-8f1ec48a3b88\") " pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.487196 4702 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0babb7be-7f54-4458-b44a-707bfd530e40-config-data\") pod \"heat-api-67bbbdff4c-p4d6b\" (UID: \"0babb7be-7f54-4458-b44a-707bfd530e40\") " pod="openstack/heat-api-67bbbdff4c-p4d6b" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.487250 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72z4m\" (UniqueName: \"kubernetes.io/projected/ab0f582e-799a-44ad-8529-6d0fe71490c2-kube-api-access-72z4m\") pod \"dnsmasq-dns-7756b9d78c-d6rbj\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.487928 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-d6rbj\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.488675 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-d6rbj\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.491499 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-config-data-custom\") pod \"heat-cfnapi-76f9dbfbb9-sz9d7\" (UID: \"d2dd9093-8cd4-4627-9019-8f1ec48a3b88\") " pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.493712 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-combined-ca-bundle\") pod \"heat-cfnapi-76f9dbfbb9-sz9d7\" (UID: \"d2dd9093-8cd4-4627-9019-8f1ec48a3b88\") " pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.506745 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-config-data\") pod \"heat-cfnapi-76f9dbfbb9-sz9d7\" (UID: \"d2dd9093-8cd4-4627-9019-8f1ec48a3b88\") " pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.513091 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrqz4\" (UniqueName: \"kubernetes.io/projected/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-kube-api-access-wrqz4\") pod \"heat-cfnapi-76f9dbfbb9-sz9d7\" (UID: \"d2dd9093-8cd4-4627-9019-8f1ec48a3b88\") " pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.513093 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72z4m\" (UniqueName: \"kubernetes.io/projected/ab0f582e-799a-44ad-8529-6d0fe71490c2-kube-api-access-72z4m\") pod \"dnsmasq-dns-7756b9d78c-d6rbj\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.589368 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0babb7be-7f54-4458-b44a-707bfd530e40-combined-ca-bundle\") pod \"heat-api-67bbbdff4c-p4d6b\" (UID: \"0babb7be-7f54-4458-b44a-707bfd530e40\") " pod="openstack/heat-api-67bbbdff4c-p4d6b" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.589477 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwfcw\" (UniqueName: \"kubernetes.io/projected/0babb7be-7f54-4458-b44a-707bfd530e40-kube-api-access-fwfcw\") pod \"heat-api-67bbbdff4c-p4d6b\" (UID: \"0babb7be-7f54-4458-b44a-707bfd530e40\") " pod="openstack/heat-api-67bbbdff4c-p4d6b" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.589561 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0babb7be-7f54-4458-b44a-707bfd530e40-config-data\") pod \"heat-api-67bbbdff4c-p4d6b\" (UID: \"0babb7be-7f54-4458-b44a-707bfd530e40\") " pod="openstack/heat-api-67bbbdff4c-p4d6b" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.589647 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0babb7be-7f54-4458-b44a-707bfd530e40-config-data-custom\") pod \"heat-api-67bbbdff4c-p4d6b\" (UID: \"0babb7be-7f54-4458-b44a-707bfd530e40\") " pod="openstack/heat-api-67bbbdff4c-p4d6b" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.595034 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0babb7be-7f54-4458-b44a-707bfd530e40-config-data-custom\") pod \"heat-api-67bbbdff4c-p4d6b\" (UID: \"0babb7be-7f54-4458-b44a-707bfd530e40\") " pod="openstack/heat-api-67bbbdff4c-p4d6b" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.601806 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0babb7be-7f54-4458-b44a-707bfd530e40-combined-ca-bundle\") pod \"heat-api-67bbbdff4c-p4d6b\" (UID: \"0babb7be-7f54-4458-b44a-707bfd530e40\") " pod="openstack/heat-api-67bbbdff4c-p4d6b" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.601948 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0babb7be-7f54-4458-b44a-707bfd530e40-config-data\") pod \"heat-api-67bbbdff4c-p4d6b\" (UID: \"0babb7be-7f54-4458-b44a-707bfd530e40\") " pod="openstack/heat-api-67bbbdff4c-p4d6b" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.627225 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwfcw\" (UniqueName: \"kubernetes.io/projected/0babb7be-7f54-4458-b44a-707bfd530e40-kube-api-access-fwfcw\") pod \"heat-api-67bbbdff4c-p4d6b\" (UID: \"0babb7be-7f54-4458-b44a-707bfd530e40\") " pod="openstack/heat-api-67bbbdff4c-p4d6b" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.632335 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.633069 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.659559 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-67bbbdff4c-p4d6b" Dec 03 11:30:52 crc kubenswrapper[4702]: I1203 11:30:52.859210 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.015209 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-config-data\") pod \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.015408 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-config-data-custom\") pod \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.015726 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-scripts\") pod \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.015867 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-combined-ca-bundle\") pod \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.015893 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-etc-machine-id\") pod \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.015915 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64j9t\" (UniqueName: \"kubernetes.io/projected/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-kube-api-access-64j9t\") pod \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\" (UID: \"479db84e-8b1b-4bcb-8711-a5a51ca22eb1\") " Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.016638 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "479db84e-8b1b-4bcb-8711-a5a51ca22eb1" (UID: "479db84e-8b1b-4bcb-8711-a5a51ca22eb1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.024211 4702 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.033557 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-scripts" (OuterVolumeSpecName: "scripts") pod "479db84e-8b1b-4bcb-8711-a5a51ca22eb1" (UID: "479db84e-8b1b-4bcb-8711-a5a51ca22eb1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.041113 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-kube-api-access-64j9t" (OuterVolumeSpecName: "kube-api-access-64j9t") pod "479db84e-8b1b-4bcb-8711-a5a51ca22eb1" (UID: "479db84e-8b1b-4bcb-8711-a5a51ca22eb1"). InnerVolumeSpecName "kube-api-access-64j9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.042975 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "479db84e-8b1b-4bcb-8711-a5a51ca22eb1" (UID: "479db84e-8b1b-4bcb-8711-a5a51ca22eb1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.128316 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.130855 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64j9t\" (UniqueName: \"kubernetes.io/projected/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-kube-api-access-64j9t\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.131173 4702 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.199262 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.200694 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"479db84e-8b1b-4bcb-8711-a5a51ca22eb1","Type":"ContainerDied","Data":"d30bca3a7863e8b6ad0b942c2c454ded3aaf6f77f7e5d8cf235d537434ab8b98"} Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.200789 4702 scope.go:117] "RemoveContainer" containerID="42955700812f9623178ccbc92caf41631a43361aa5a01a4991220b92ab294eca" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.252238 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "479db84e-8b1b-4bcb-8711-a5a51ca22eb1" (UID: "479db84e-8b1b-4bcb-8711-a5a51ca22eb1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.253531 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7bdf6cdbff-lzm8t"] Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.328285 4702 scope.go:117] "RemoveContainer" containerID="fb5ae7c56fd334233f4fc58e0ec3338cfba7244ca7d32a8687111414888d7ea6" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.343143 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.384935 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-config-data" (OuterVolumeSpecName: "config-data") pod "479db84e-8b1b-4bcb-8711-a5a51ca22eb1" (UID: "479db84e-8b1b-4bcb-8711-a5a51ca22eb1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.447029 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/479db84e-8b1b-4bcb-8711-a5a51ca22eb1-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.706924 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.719184 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.731538 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 11:30:53 crc kubenswrapper[4702]: E1203 11:30:53.732062 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="479db84e-8b1b-4bcb-8711-a5a51ca22eb1" containerName="cinder-scheduler" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.732075 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="479db84e-8b1b-4bcb-8711-a5a51ca22eb1" containerName="cinder-scheduler" Dec 03 11:30:53 crc kubenswrapper[4702]: E1203 11:30:53.732125 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="479db84e-8b1b-4bcb-8711-a5a51ca22eb1" containerName="probe" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.732130 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="479db84e-8b1b-4bcb-8711-a5a51ca22eb1" containerName="probe" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.732341 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="479db84e-8b1b-4bcb-8711-a5a51ca22eb1" containerName="probe" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.732357 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="479db84e-8b1b-4bcb-8711-a5a51ca22eb1" containerName="cinder-scheduler" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.734430 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.746061 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.801764 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.861647 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzbc9\" (UniqueName: \"kubernetes.io/projected/58148f49-2721-4a0a-a5e0-38a2aa23522b-kube-api-access-fzbc9\") pod \"cinder-scheduler-0\" (UID: \"58148f49-2721-4a0a-a5e0-38a2aa23522b\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.862147 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58148f49-2721-4a0a-a5e0-38a2aa23522b-scripts\") pod \"cinder-scheduler-0\" (UID: \"58148f49-2721-4a0a-a5e0-38a2aa23522b\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.862278 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58148f49-2721-4a0a-a5e0-38a2aa23522b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"58148f49-2721-4a0a-a5e0-38a2aa23522b\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.862477 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58148f49-2721-4a0a-a5e0-38a2aa23522b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"58148f49-2721-4a0a-a5e0-38a2aa23522b\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.862696 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58148f49-2721-4a0a-a5e0-38a2aa23522b-config-data\") pod \"cinder-scheduler-0\" (UID: \"58148f49-2721-4a0a-a5e0-38a2aa23522b\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.862912 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58148f49-2721-4a0a-a5e0-38a2aa23522b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"58148f49-2721-4a0a-a5e0-38a2aa23522b\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.941853 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-76f9dbfbb9-sz9d7"] Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.970060 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58148f49-2721-4a0a-a5e0-38a2aa23522b-scripts\") pod \"cinder-scheduler-0\" (UID: \"58148f49-2721-4a0a-a5e0-38a2aa23522b\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.970132 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58148f49-2721-4a0a-a5e0-38a2aa23522b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"58148f49-2721-4a0a-a5e0-38a2aa23522b\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.970202 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58148f49-2721-4a0a-a5e0-38a2aa23522b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"58148f49-2721-4a0a-a5e0-38a2aa23522b\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.970272 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58148f49-2721-4a0a-a5e0-38a2aa23522b-config-data\") pod \"cinder-scheduler-0\" (UID: \"58148f49-2721-4a0a-a5e0-38a2aa23522b\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.970327 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58148f49-2721-4a0a-a5e0-38a2aa23522b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"58148f49-2721-4a0a-a5e0-38a2aa23522b\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.970382 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzbc9\" (UniqueName: \"kubernetes.io/projected/58148f49-2721-4a0a-a5e0-38a2aa23522b-kube-api-access-fzbc9\") pod \"cinder-scheduler-0\" (UID: \"58148f49-2721-4a0a-a5e0-38a2aa23522b\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.984919 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58148f49-2721-4a0a-a5e0-38a2aa23522b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"58148f49-2721-4a0a-a5e0-38a2aa23522b\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:53 crc kubenswrapper[4702]: I1203 11:30:53.987860 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-d6rbj"] Dec 03 11:30:54 crc kubenswrapper[4702]: I1203 11:30:54.114343 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-67bbbdff4c-p4d6b"] Dec 03 11:30:54 crc kubenswrapper[4702]: I1203 11:30:54.117884 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58148f49-2721-4a0a-a5e0-38a2aa23522b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"58148f49-2721-4a0a-a5e0-38a2aa23522b\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:54 crc kubenswrapper[4702]: I1203 11:30:54.118308 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58148f49-2721-4a0a-a5e0-38a2aa23522b-scripts\") pod \"cinder-scheduler-0\" (UID: \"58148f49-2721-4a0a-a5e0-38a2aa23522b\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:54 crc kubenswrapper[4702]: I1203 11:30:54.121682 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58148f49-2721-4a0a-a5e0-38a2aa23522b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"58148f49-2721-4a0a-a5e0-38a2aa23522b\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:54 crc kubenswrapper[4702]: I1203 11:30:54.124193 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/58148f49-2721-4a0a-a5e0-38a2aa23522b-config-data\") pod \"cinder-scheduler-0\" (UID: \"58148f49-2721-4a0a-a5e0-38a2aa23522b\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:54 crc kubenswrapper[4702]: I1203 11:30:54.130446 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzbc9\" (UniqueName: \"kubernetes.io/projected/58148f49-2721-4a0a-a5e0-38a2aa23522b-kube-api-access-fzbc9\") pod \"cinder-scheduler-0\" (UID: \"58148f49-2721-4a0a-a5e0-38a2aa23522b\") " pod="openstack/cinder-scheduler-0" Dec 03 11:30:54 crc kubenswrapper[4702]: I1203 11:30:54.272878 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" event={"ID":"d2dd9093-8cd4-4627-9019-8f1ec48a3b88","Type":"ContainerStarted","Data":"fe0fa362e2faedcdbce67517b1804fc4f64e1a010103de4d0332da748504f0b6"} Dec 03 11:30:54 crc kubenswrapper[4702]: I1203 11:30:54.322162 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" event={"ID":"ab0f582e-799a-44ad-8529-6d0fe71490c2","Type":"ContainerStarted","Data":"6fb273780c501b5fa03deab78b9e2f3ddb4323078efb54dde935848679f4b17f"} Dec 03 11:30:54 crc kubenswrapper[4702]: I1203 11:30:54.325972 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7bdf6cdbff-lzm8t" event={"ID":"a2cf5990-eb05-4167-9d52-83186278f986","Type":"ContainerStarted","Data":"966cb54ffe2da9fb902297d512e09e2773f3ef4704a46031923cc407ef5eba0a"} Dec 03 11:30:54 crc kubenswrapper[4702]: I1203 11:30:54.328672 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-67bbbdff4c-p4d6b" event={"ID":"0babb7be-7f54-4458-b44a-707bfd530e40","Type":"ContainerStarted","Data":"7eb5994fde7c2c67465d96a6a5d3b43e7da0409c78c751a91206344600cf83a7"} Dec 03 11:30:54 crc kubenswrapper[4702]: I1203 11:30:54.361724 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 11:30:54 crc kubenswrapper[4702]: I1203 11:30:54.959413 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="479db84e-8b1b-4bcb-8711-a5a51ca22eb1" path="/var/lib/kubelet/pods/479db84e-8b1b-4bcb-8711-a5a51ca22eb1/volumes" Dec 03 11:30:55 crc kubenswrapper[4702]: I1203 11:30:55.172783 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 11:30:55 crc kubenswrapper[4702]: I1203 11:30:55.344538 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7bdf6cdbff-lzm8t" event={"ID":"a2cf5990-eb05-4167-9d52-83186278f986","Type":"ContainerStarted","Data":"899fa405f3f2ffc578c63afaae042c6941dbfc515c00fbea8f061bb095bdc6fa"} Dec 03 11:30:55 crc kubenswrapper[4702]: I1203 11:30:55.346174 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-7bdf6cdbff-lzm8t" Dec 03 11:30:55 crc kubenswrapper[4702]: I1203 11:30:55.349873 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"58148f49-2721-4a0a-a5e0-38a2aa23522b","Type":"ContainerStarted","Data":"09bed655da8b57a7355c7422765fb0a1cb7d0ed11f6f4bc871ac3c7bad0717f3"} Dec 03 11:30:55 crc kubenswrapper[4702]: I1203 11:30:55.352148 4702 generic.go:334] "Generic (PLEG): container finished" podID="ab0f582e-799a-44ad-8529-6d0fe71490c2" containerID="a8d69032f95d07a117914af342e611aebf5695f17b6460e6d5e7b2a9372378f4" exitCode=0 Dec 03 11:30:55 crc kubenswrapper[4702]: I1203 11:30:55.352208 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" event={"ID":"ab0f582e-799a-44ad-8529-6d0fe71490c2","Type":"ContainerDied","Data":"a8d69032f95d07a117914af342e611aebf5695f17b6460e6d5e7b2a9372378f4"} Dec 03 11:30:55 crc kubenswrapper[4702]: I1203 11:30:55.372948 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-7bdf6cdbff-lzm8t" podStartSLOduration=4.372897656 podStartE2EDuration="4.372897656s" podCreationTimestamp="2025-12-03 11:30:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:30:55.36462404 +0000 UTC m=+1639.200552504" watchObservedRunningTime="2025-12-03 11:30:55.372897656 +0000 UTC m=+1639.208826120" Dec 03 11:30:56 crc kubenswrapper[4702]: I1203 11:30:56.369138 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" event={"ID":"ab0f582e-799a-44ad-8529-6d0fe71490c2","Type":"ContainerStarted","Data":"a086ed1a32b499a91115208ed57efcf42a850ef49dc6d9bfced49c9ed309109a"} Dec 03 11:30:56 crc kubenswrapper[4702]: I1203 11:30:56.373544 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:30:56 crc kubenswrapper[4702]: I1203 11:30:56.394809 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" podStartSLOduration=4.394779781 podStartE2EDuration="4.394779781s" podCreationTimestamp="2025-12-03 11:30:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:30:56.388953815 +0000 UTC m=+1640.224882289" watchObservedRunningTime="2025-12-03 11:30:56.394779781 +0000 UTC m=+1640.230708245" Dec 03 11:30:57 crc kubenswrapper[4702]: I1203 11:30:57.148843 4702 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 03 11:30:57 crc kubenswrapper[4702]: I1203 11:30:57.399165 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"58148f49-2721-4a0a-a5e0-38a2aa23522b","Type":"ContainerStarted","Data":"5f108ef4356c88932b0eba73bb53c53d7024bb8c067d721f3a95f847709f1121"} Dec 03 11:31:01 crc kubenswrapper[4702]: I1203 11:31:01.817858 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-58868c9476-5hnsv"] Dec 03 11:31:01 crc kubenswrapper[4702]: I1203 11:31:01.821041 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-58868c9476-5hnsv" Dec 03 11:31:01 crc kubenswrapper[4702]: I1203 11:31:01.845660 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-58868c9476-5hnsv"] Dec 03 11:31:01 crc kubenswrapper[4702]: I1203 11:31:01.873820 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5b6449bd57-h4c9q"] Dec 03 11:31:01 crc kubenswrapper[4702]: I1203 11:31:01.875730 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5b6449bd57-h4c9q" Dec 03 11:31:01 crc kubenswrapper[4702]: I1203 11:31:01.959726 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5b56468cc-dnldg"] Dec 03 11:31:01 crc kubenswrapper[4702]: I1203 11:31:01.969373 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5b56468cc-dnldg" Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.024957 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5b56468cc-dnldg"] Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.039969 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5b6449bd57-h4c9q"] Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.072362 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-combined-ca-bundle\") pod \"heat-engine-58868c9476-5hnsv\" (UID: \"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852\") " pod="openstack/heat-engine-58868c9476-5hnsv" Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.072496 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11395bc-981a-4efd-9fe3-8b0c146f375e-combined-ca-bundle\") pod \"heat-api-5b6449bd57-h4c9q\" (UID: \"c11395bc-981a-4efd-9fe3-8b0c146f375e\") " pod="openstack/heat-api-5b6449bd57-h4c9q" Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.072626 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-config-data\") pod \"heat-engine-58868c9476-5hnsv\" (UID: \"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852\") " pod="openstack/heat-engine-58868c9476-5hnsv" Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.073692 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-config-data-custom\") pod \"heat-engine-58868c9476-5hnsv\" (UID: \"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852\") " pod="openstack/heat-engine-58868c9476-5hnsv" Dec 03 11:31:02 crc 
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.073828 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c11395bc-981a-4efd-9fe3-8b0c146f375e-config-data-custom\") pod \"heat-api-5b6449bd57-h4c9q\" (UID: \"c11395bc-981a-4efd-9fe3-8b0c146f375e\") " pod="openstack/heat-api-5b6449bd57-h4c9q"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.074370 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tzwl\" (UniqueName: \"kubernetes.io/projected/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-kube-api-access-4tzwl\") pod \"heat-engine-58868c9476-5hnsv\" (UID: \"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852\") " pod="openstack/heat-engine-58868c9476-5hnsv"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.074434 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p77j\" (UniqueName: \"kubernetes.io/projected/c11395bc-981a-4efd-9fe3-8b0c146f375e-kube-api-access-5p77j\") pod \"heat-api-5b6449bd57-h4c9q\" (UID: \"c11395bc-981a-4efd-9fe3-8b0c146f375e\") " pod="openstack/heat-api-5b6449bd57-h4c9q"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.177408 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-config-data-custom\") pod \"heat-engine-58868c9476-5hnsv\" (UID: \"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852\") " pod="openstack/heat-engine-58868c9476-5hnsv"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.177484 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11395bc-981a-4efd-9fe3-8b0c146f375e-config-data\") pod \"heat-api-5b6449bd57-h4c9q\" (UID: \"c11395bc-981a-4efd-9fe3-8b0c146f375e\") " pod="openstack/heat-api-5b6449bd57-h4c9q"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.177539 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c11395bc-981a-4efd-9fe3-8b0c146f375e-config-data-custom\") pod \"heat-api-5b6449bd57-h4c9q\" (UID: \"c11395bc-981a-4efd-9fe3-8b0c146f375e\") " pod="openstack/heat-api-5b6449bd57-h4c9q"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.177579 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d921fd19-6588-42dc-9f3a-8aadb96c5996-config-data\") pod \"heat-cfnapi-5b56468cc-dnldg\" (UID: \"d921fd19-6588-42dc-9f3a-8aadb96c5996\") " pod="openstack/heat-cfnapi-5b56468cc-dnldg"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.177700 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d921fd19-6588-42dc-9f3a-8aadb96c5996-config-data-custom\") pod \"heat-cfnapi-5b56468cc-dnldg\" (UID: \"d921fd19-6588-42dc-9f3a-8aadb96c5996\") " pod="openstack/heat-cfnapi-5b56468cc-dnldg"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.177785 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tzwl\" (UniqueName: \"kubernetes.io/projected/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-kube-api-access-4tzwl\") pod \"heat-engine-58868c9476-5hnsv\" (UID: \"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852\") " pod="openstack/heat-engine-58868c9476-5hnsv"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.177815 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p77j\" (UniqueName: \"kubernetes.io/projected/c11395bc-981a-4efd-9fe3-8b0c146f375e-kube-api-access-5p77j\") pod \"heat-api-5b6449bd57-h4c9q\" (UID: \"c11395bc-981a-4efd-9fe3-8b0c146f375e\") " pod="openstack/heat-api-5b6449bd57-h4c9q"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.177882 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-combined-ca-bundle\") pod \"heat-engine-58868c9476-5hnsv\" (UID: \"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852\") " pod="openstack/heat-engine-58868c9476-5hnsv"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.177932 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtf45\" (UniqueName: \"kubernetes.io/projected/d921fd19-6588-42dc-9f3a-8aadb96c5996-kube-api-access-dtf45\") pod \"heat-cfnapi-5b56468cc-dnldg\" (UID: \"d921fd19-6588-42dc-9f3a-8aadb96c5996\") " pod="openstack/heat-cfnapi-5b56468cc-dnldg"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.177997 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11395bc-981a-4efd-9fe3-8b0c146f375e-combined-ca-bundle\") pod \"heat-api-5b6449bd57-h4c9q\" (UID: \"c11395bc-981a-4efd-9fe3-8b0c146f375e\") " pod="openstack/heat-api-5b6449bd57-h4c9q"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.178031 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d921fd19-6588-42dc-9f3a-8aadb96c5996-combined-ca-bundle\") pod \"heat-cfnapi-5b56468cc-dnldg\" (UID: \"d921fd19-6588-42dc-9f3a-8aadb96c5996\") " pod="openstack/heat-cfnapi-5b56468cc-dnldg"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.178167 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-config-data\") pod \"heat-engine-58868c9476-5hnsv\" (UID: \"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852\") " pod="openstack/heat-engine-58868c9476-5hnsv"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.189253 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11395bc-981a-4efd-9fe3-8b0c146f375e-config-data\") pod \"heat-api-5b6449bd57-h4c9q\" (UID: \"c11395bc-981a-4efd-9fe3-8b0c146f375e\") " pod="openstack/heat-api-5b6449bd57-h4c9q"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.189488 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-config-data-custom\") pod \"heat-engine-58868c9476-5hnsv\" (UID: \"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852\") " pod="openstack/heat-engine-58868c9476-5hnsv"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.189852 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-config-data\") pod \"heat-engine-58868c9476-5hnsv\" (UID: \"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852\") " pod="openstack/heat-engine-58868c9476-5hnsv"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.198388 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c11395bc-981a-4efd-9fe3-8b0c146f375e-config-data-custom\") pod \"heat-api-5b6449bd57-h4c9q\" (UID: \"c11395bc-981a-4efd-9fe3-8b0c146f375e\") " pod="openstack/heat-api-5b6449bd57-h4c9q"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.207191 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-combined-ca-bundle\") pod \"heat-engine-58868c9476-5hnsv\" (UID: \"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852\") " pod="openstack/heat-engine-58868c9476-5hnsv"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.207252 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p77j\" (UniqueName: \"kubernetes.io/projected/c11395bc-981a-4efd-9fe3-8b0c146f375e-kube-api-access-5p77j\") pod \"heat-api-5b6449bd57-h4c9q\" (UID: \"c11395bc-981a-4efd-9fe3-8b0c146f375e\") " pod="openstack/heat-api-5b6449bd57-h4c9q"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.208748 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11395bc-981a-4efd-9fe3-8b0c146f375e-combined-ca-bundle\") pod \"heat-api-5b6449bd57-h4c9q\" (UID: \"c11395bc-981a-4efd-9fe3-8b0c146f375e\") " pod="openstack/heat-api-5b6449bd57-h4c9q"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.210485 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tzwl\" (UniqueName: \"kubernetes.io/projected/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-kube-api-access-4tzwl\") pod \"heat-engine-58868c9476-5hnsv\" (UID: \"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852\") " pod="openstack/heat-engine-58868c9476-5hnsv"
Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.274547 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5b6449bd57-h4c9q"
Need to start a new one" pod="openstack/heat-api-5b6449bd57-h4c9q" Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.280159 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d921fd19-6588-42dc-9f3a-8aadb96c5996-config-data\") pod \"heat-cfnapi-5b56468cc-dnldg\" (UID: \"d921fd19-6588-42dc-9f3a-8aadb96c5996\") " pod="openstack/heat-cfnapi-5b56468cc-dnldg" Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.280226 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d921fd19-6588-42dc-9f3a-8aadb96c5996-config-data-custom\") pod \"heat-cfnapi-5b56468cc-dnldg\" (UID: \"d921fd19-6588-42dc-9f3a-8aadb96c5996\") " pod="openstack/heat-cfnapi-5b56468cc-dnldg" Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.280326 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtf45\" (UniqueName: \"kubernetes.io/projected/d921fd19-6588-42dc-9f3a-8aadb96c5996-kube-api-access-dtf45\") pod \"heat-cfnapi-5b56468cc-dnldg\" (UID: \"d921fd19-6588-42dc-9f3a-8aadb96c5996\") " pod="openstack/heat-cfnapi-5b56468cc-dnldg" Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.280362 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d921fd19-6588-42dc-9f3a-8aadb96c5996-combined-ca-bundle\") pod \"heat-cfnapi-5b56468cc-dnldg\" (UID: \"d921fd19-6588-42dc-9f3a-8aadb96c5996\") " pod="openstack/heat-cfnapi-5b56468cc-dnldg" Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.286820 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d921fd19-6588-42dc-9f3a-8aadb96c5996-combined-ca-bundle\") pod \"heat-cfnapi-5b56468cc-dnldg\" (UID: \"d921fd19-6588-42dc-9f3a-8aadb96c5996\") " pod="openstack/heat-cfnapi-5b56468cc-dnldg" Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.286922 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d921fd19-6588-42dc-9f3a-8aadb96c5996-config-data-custom\") pod \"heat-cfnapi-5b56468cc-dnldg\" (UID: \"d921fd19-6588-42dc-9f3a-8aadb96c5996\") " pod="openstack/heat-cfnapi-5b56468cc-dnldg" Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.287956 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d921fd19-6588-42dc-9f3a-8aadb96c5996-config-data\") pod \"heat-cfnapi-5b56468cc-dnldg\" (UID: \"d921fd19-6588-42dc-9f3a-8aadb96c5996\") " pod="openstack/heat-cfnapi-5b56468cc-dnldg" Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.364438 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtf45\" (UniqueName: \"kubernetes.io/projected/d921fd19-6588-42dc-9f3a-8aadb96c5996-kube-api-access-dtf45\") pod \"heat-cfnapi-5b56468cc-dnldg\" (UID: \"d921fd19-6588-42dc-9f3a-8aadb96c5996\") " pod="openstack/heat-cfnapi-5b56468cc-dnldg" Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.388614 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5b56468cc-dnldg" Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.467463 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-58868c9476-5hnsv" Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.636276 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.732886 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-kmfbx"] Dec 03 11:31:02 crc kubenswrapper[4702]: I1203 11:31:02.733190 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" podUID="a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5" containerName="dnsmasq-dns" containerID="cri-o://c309e395ef4019bf0f57f59b85129ecab952875892c0541a9d9208601fddf934" gracePeriod=10 Dec 03 11:31:03 crc kubenswrapper[4702]: I1203 11:31:03.680003 4702 generic.go:334] "Generic (PLEG): container finished" podID="a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5" containerID="c309e395ef4019bf0f57f59b85129ecab952875892c0541a9d9208601fddf934" exitCode=0 Dec 03 11:31:03 crc kubenswrapper[4702]: I1203 11:31:03.680141 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" event={"ID":"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5","Type":"ContainerDied","Data":"c309e395ef4019bf0f57f59b85129ecab952875892c0541a9d9208601fddf934"} Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.630958 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-868d57787f-bntsv"] Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.639918 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.670787 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.673483 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.673594 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.682086 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-868d57787f-bntsv"] Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.724448 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"58148f49-2721-4a0a-a5e0-38a2aa23522b","Type":"ContainerStarted","Data":"dc72c90c2b5a5b40510681d5e8f343edf5612866e06e1f4daa390d6424053841"} Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.742635 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57179545-ef9c-460f-9ef6-219c895dc9fa-config-data\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.742833 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/57179545-ef9c-460f-9ef6-219c895dc9fa-etc-swift\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.742931 4702 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57179545-ef9c-460f-9ef6-219c895dc9fa-log-httpd\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.742988 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57179545-ef9c-460f-9ef6-219c895dc9fa-internal-tls-certs\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.743111 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57179545-ef9c-460f-9ef6-219c895dc9fa-combined-ca-bundle\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.743197 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57179545-ef9c-460f-9ef6-219c895dc9fa-public-tls-certs\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.743477 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xplbz\" (UniqueName: \"kubernetes.io/projected/57179545-ef9c-460f-9ef6-219c895dc9fa-kube-api-access-xplbz\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.743521 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57179545-ef9c-460f-9ef6-219c895dc9fa-run-httpd\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.808985 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=11.808959689 podStartE2EDuration="11.808959689s" podCreationTimestamp="2025-12-03 11:30:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:31:04.752845011 +0000 UTC m=+1648.588773495" watchObservedRunningTime="2025-12-03 11:31:04.808959689 +0000 UTC m=+1648.644888143" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.846320 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xplbz\" (UniqueName: \"kubernetes.io/projected/57179545-ef9c-460f-9ef6-219c895dc9fa-kube-api-access-xplbz\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.846450 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/57179545-ef9c-460f-9ef6-219c895dc9fa-run-httpd\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.846863 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57179545-ef9c-460f-9ef6-219c895dc9fa-config-data\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.847001 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/57179545-ef9c-460f-9ef6-219c895dc9fa-etc-swift\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.847094 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57179545-ef9c-460f-9ef6-219c895dc9fa-log-httpd\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.847160 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57179545-ef9c-460f-9ef6-219c895dc9fa-internal-tls-certs\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.847270 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57179545-ef9c-460f-9ef6-219c895dc9fa-combined-ca-bundle\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.847370 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57179545-ef9c-460f-9ef6-219c895dc9fa-public-tls-certs\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.849421 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57179545-ef9c-460f-9ef6-219c895dc9fa-run-httpd\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.850210 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57179545-ef9c-460f-9ef6-219c895dc9fa-log-httpd\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.863499 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/57179545-ef9c-460f-9ef6-219c895dc9fa-public-tls-certs\") pod \"swift-proxy-868d57787f-bntsv\" 
(UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.864468 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/57179545-ef9c-460f-9ef6-219c895dc9fa-internal-tls-certs\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.865963 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57179545-ef9c-460f-9ef6-219c895dc9fa-config-data\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.869420 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/57179545-ef9c-460f-9ef6-219c895dc9fa-etc-swift\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.892884 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xplbz\" (UniqueName: \"kubernetes.io/projected/57179545-ef9c-460f-9ef6-219c895dc9fa-kube-api-access-xplbz\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:04 crc kubenswrapper[4702]: I1203 11:31:04.932709 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57179545-ef9c-460f-9ef6-219c895dc9fa-combined-ca-bundle\") pod \"swift-proxy-868d57787f-bntsv\" (UID: \"57179545-ef9c-460f-9ef6-219c895dc9fa\") " pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.012443 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.215828 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-67bbbdff4c-p4d6b"] Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.250590 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-76f9dbfbb9-sz9d7"] Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.309701 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5df49db6bf-cnm46"] Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.311684 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.319069 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.319315 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.360504 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5df49db6bf-cnm46"] Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.370579 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-config-data-custom\") pod \"heat-api-5df49db6bf-cnm46\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.370774 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-config-data\") pod \"heat-api-5df49db6bf-cnm46\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.370824 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-internal-tls-certs\") pod \"heat-api-5df49db6bf-cnm46\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.370881 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-combined-ca-bundle\") pod \"heat-api-5df49db6bf-cnm46\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.370994 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-public-tls-certs\") pod \"heat-api-5df49db6bf-cnm46\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.371031 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r866x\" (UniqueName: \"kubernetes.io/projected/6fad49cd-d636-43cc-84f7-7c8e0774a93a-kube-api-access-r866x\") pod \"heat-api-5df49db6bf-cnm46\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.393245 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-76f7956d4d-b967j"] Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.395364 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.400316 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.400520 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.406622 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-76f7956d4d-b967j"] Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.473894 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-combined-ca-bundle\") pod \"heat-cfnapi-76f7956d4d-b967j\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.474067 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-public-tls-certs\") pod \"heat-api-5df49db6bf-cnm46\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.474116 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r866x\" (UniqueName: \"kubernetes.io/projected/6fad49cd-d636-43cc-84f7-7c8e0774a93a-kube-api-access-r866x\") pod \"heat-api-5df49db6bf-cnm46\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.474216 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-internal-tls-certs\") pod \"heat-cfnapi-76f7956d4d-b967j\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.474360 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-config-data-custom\") pod \"heat-api-5df49db6bf-cnm46\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.474392 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s57z7\" (UniqueName: \"kubernetes.io/projected/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-kube-api-access-s57z7\") pod \"heat-cfnapi-76f7956d4d-b967j\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.474531 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-config-data-custom\") pod \"heat-cfnapi-76f7956d4d-b967j\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.474580 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-config-data\") pod \"heat-api-5df49db6bf-cnm46\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.474611 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-config-data\") pod \"heat-cfnapi-76f7956d4d-b967j\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.474660 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-internal-tls-certs\") pod \"heat-api-5df49db6bf-cnm46\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.474880 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-combined-ca-bundle\") pod \"heat-api-5df49db6bf-cnm46\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.475476 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-public-tls-certs\") pod \"heat-cfnapi-76f7956d4d-b967j\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.492713 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-public-tls-certs\") pod \"heat-api-5df49db6bf-cnm46\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.494813 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-config-data-custom\") pod \"heat-api-5df49db6bf-cnm46\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.495564 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-combined-ca-bundle\") pod \"heat-api-5df49db6bf-cnm46\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.495920 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-config-data\") pod \"heat-api-5df49db6bf-cnm46\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.496458 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-internal-tls-certs\") pod \"heat-api-5df49db6bf-cnm46\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.503823 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r866x\" (UniqueName: \"kubernetes.io/projected/6fad49cd-d636-43cc-84f7-7c8e0774a93a-kube-api-access-r866x\") pod \"heat-api-5df49db6bf-cnm46\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.578155 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-public-tls-certs\") pod \"heat-cfnapi-76f7956d4d-b967j\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.578237 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-combined-ca-bundle\") pod \"heat-cfnapi-76f7956d4d-b967j\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.578367 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-internal-tls-certs\") pod \"heat-cfnapi-76f7956d4d-b967j\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.578456 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s57z7\" (UniqueName: \"kubernetes.io/projected/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-kube-api-access-s57z7\") pod \"heat-cfnapi-76f7956d4d-b967j\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.578895 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-config-data-custom\") pod \"heat-cfnapi-76f7956d4d-b967j\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.578936 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-config-data\") pod \"heat-cfnapi-76f7956d4d-b967j\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.582667 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-public-tls-certs\") pod \"heat-cfnapi-76f7956d4d-b967j\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.585779 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-combined-ca-bundle\") pod \"heat-cfnapi-76f7956d4d-b967j\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.586926 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-internal-tls-certs\") pod \"heat-cfnapi-76f7956d4d-b967j\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.590600 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-config-data-custom\") pod \"heat-cfnapi-76f7956d4d-b967j\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.590693 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-config-data\") pod \"heat-cfnapi-76f7956d4d-b967j\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.607742 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s57z7\" (UniqueName: \"kubernetes.io/projected/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-kube-api-access-s57z7\") pod \"heat-cfnapi-76f7956d4d-b967j\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:31:05 crc kubenswrapper[4702]: I1203 11:31:05.642228 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:31:06 crc kubenswrapper[4702]: I1203 11:31:06.019039 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:31:07 crc kubenswrapper[4702]: I1203 11:31:07.137702 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" podUID="a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.204:5353: connect: connection refused" Dec 03 11:31:09 crc kubenswrapper[4702]: I1203 11:31:09.364256 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 03 11:31:09 crc kubenswrapper[4702]: I1203 11:31:09.651782 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 11:31:09 crc kubenswrapper[4702]: I1203 11:31:09.769001 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:31:09 crc kubenswrapper[4702]: I1203 11:31:09.769899 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd27c847-d92c-4adb-bd1a-d445ae5bd182" containerName="ceilometer-central-agent" containerID="cri-o://1136264585a2d16f4270d48cb3a9e7d82fd90f47ebde7d44b2ff6929d1ef7724" gracePeriod=30 Dec 03 11:31:09 crc kubenswrapper[4702]: I1203 11:31:09.770124 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd27c847-d92c-4adb-bd1a-d445ae5bd182" containerName="proxy-httpd" containerID="cri-o://35896744cea89052cb0ba05b9992bc658a32384bd29f41f38b01194423c93106" gracePeriod=30 Dec 03 11:31:09 crc kubenswrapper[4702]: I1203 11:31:09.769940 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd27c847-d92c-4adb-bd1a-d445ae5bd182" containerName="ceilometer-notification-agent" containerID="cri-o://2efed19eb2629e378bd74dcc562721354e3d42023cae3aba86d187d5ea0b56dd" gracePeriod=30 Dec 03 11:31:09 crc kubenswrapper[4702]: I1203 11:31:09.769912 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd27c847-d92c-4adb-bd1a-d445ae5bd182" containerName="sg-core" containerID="cri-o://c0c822db59095d950975d679fd1651fd6b7ad558b8f7ee5fe87a39760e0f965f" gracePeriod=30 Dec 03 11:31:09 crc kubenswrapper[4702]: I1203 11:31:09.780745 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 11:31:10 crc kubenswrapper[4702]: I1203 11:31:10.396878 4702 generic.go:334] "Generic (PLEG): container finished" podID="dd27c847-d92c-4adb-bd1a-d445ae5bd182" containerID="35896744cea89052cb0ba05b9992bc658a32384bd29f41f38b01194423c93106" exitCode=0 Dec 03 11:31:10 crc kubenswrapper[4702]: I1203 11:31:10.397224 4702 generic.go:334] "Generic (PLEG): container finished" podID="dd27c847-d92c-4adb-bd1a-d445ae5bd182" containerID="c0c822db59095d950975d679fd1651fd6b7ad558b8f7ee5fe87a39760e0f965f" exitCode=2 Dec 03 11:31:10 crc kubenswrapper[4702]: I1203 11:31:10.397236 4702 generic.go:334] "Generic (PLEG): container finished" podID="dd27c847-d92c-4adb-bd1a-d445ae5bd182" containerID="1136264585a2d16f4270d48cb3a9e7d82fd90f47ebde7d44b2ff6929d1ef7724" exitCode=0 Dec 03 11:31:10 crc kubenswrapper[4702]: I1203 11:31:10.396968 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd27c847-d92c-4adb-bd1a-d445ae5bd182","Type":"ContainerDied","Data":"35896744cea89052cb0ba05b9992bc658a32384bd29f41f38b01194423c93106"} Dec 03 11:31:10 crc 
kubenswrapper[4702]: I1203 11:31:10.397341 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd27c847-d92c-4adb-bd1a-d445ae5bd182","Type":"ContainerDied","Data":"c0c822db59095d950975d679fd1651fd6b7ad558b8f7ee5fe87a39760e0f965f"} Dec 03 11:31:10 crc kubenswrapper[4702]: I1203 11:31:10.397365 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd27c847-d92c-4adb-bd1a-d445ae5bd182","Type":"ContainerDied","Data":"1136264585a2d16f4270d48cb3a9e7d82fd90f47ebde7d44b2ff6929d1ef7724"} Dec 03 11:31:11 crc kubenswrapper[4702]: I1203 11:31:11.428671 4702 generic.go:334] "Generic (PLEG): container finished" podID="dd27c847-d92c-4adb-bd1a-d445ae5bd182" containerID="2efed19eb2629e378bd74dcc562721354e3d42023cae3aba86d187d5ea0b56dd" exitCode=0 Dec 03 11:31:11 crc kubenswrapper[4702]: I1203 11:31:11.428797 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd27c847-d92c-4adb-bd1a-d445ae5bd182","Type":"ContainerDied","Data":"2efed19eb2629e378bd74dcc562721354e3d42023cae3aba86d187d5ea0b56dd"} Dec 03 11:31:12 crc kubenswrapper[4702]: I1203 11:31:12.137164 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" podUID="a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.204:5353: connect: connection refused" Dec 03 11:31:12 crc kubenswrapper[4702]: I1203 11:31:12.399911 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-7bdf6cdbff-lzm8t" Dec 03 11:31:13 crc kubenswrapper[4702]: E1203 11:31:13.167458 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified" Dec 03 11:31:13 crc kubenswrapper[4702]: E1203 11:31:13.168085 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
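The "Probe failed" entries are TCP readiness probes: the kubelet dials 10.217.0.204:5353 and gets connection refused because the dnsmasq-dns container already died at 11:31:03 (the sg-core container above likewise finished with exitCode=2, a genuine error exit, while its siblings exited 0). Functionally such a probe is just a TCP connect; an equivalent sketch, with the address taken from the log and the 1 s timeout my own assumption:

    import socket

    def tcp_ready(ip: str, port: int, timeout_s: float = 1.0) -> bool:
        # Mirrors a tcpSocket readiness probe: success == the connect completes.
        try:
            with socket.create_connection((ip, port), timeout=timeout_s):
                return True
        except OSError:  # ConnectionRefusedError, timeout, host unreachable...
            return False

    print(tcp_ready("10.217.0.204", 5353))  # False while the pod is down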
Dec 03 11:31:13 crc kubenswrapper[4702]: E1203 11:31:13.168085 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openstackclient,Image:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,Command:[/bin/sleep],Args:[infinity],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n649h56fh98h8hf5h577h5fh65fh676hf6h5cbh5bch599h69h667h585hb9h66h566h85hf4hcfh597hd6h596hf7hdbh59dh68ch5f9hf6hd6q,ValueFrom:nil,},EnvVar{Name:OS_CLOUD,Value:default,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_CA_CERT,Value:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_HOST,Value:metric-storage-prometheus.openstack.svc,ValueFrom:nil,},EnvVar{Name:PROMETHEUS_PORT,Value:9090,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openstack-config,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/.config/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/home/cloud-admin/cloudrc,SubPath:cloudrc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7ppjz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42401,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42401,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstackclient_openstack(1b30e6f6-6da6-48ea-8e02-873d566d7719): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 03 11:31:13 crc kubenswrapper[4702]: E1203 11:31:13.172011 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstackclient" podUID="1b30e6f6-6da6-48ea-8e02-873d566d7719"
Dec 03 11:31:13 crc kubenswrapper[4702]: E1203 11:31:13.482584 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstackclient\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\"\"" pod="openstack/openstackclient" podUID="1b30e6f6-6da6-48ea-8e02-873d566d7719"
Dec 03 11:31:13 crc kubenswrapper[4702]: I1203 11:31:13.954801 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5b6449bd57-h4c9q"]
Dec 03 11:31:13 crc kubenswrapper[4702]: I1203 11:31:13.984292 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5b56468cc-dnldg"]
Dec 03 11:31:14 crc kubenswrapper[4702]: W1203 11:31:14.051508 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd921fd19_6588_42dc_9f3a_8aadb96c5996.slice/crio-de7f62e53a2398a3d617a293302542bef4ef5e59d1f96edb59a93c74e8b671e7 WatchSource:0}: Error finding container de7f62e53a2398a3d617a293302542bef4ef5e59d1f96edb59a93c74e8b671e7: Status 404 returned error can't find the container with id de7f62e53a2398a3d617a293302542bef4ef5e59d1f96edb59a93c74e8b671e7
Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.271183 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-868d57787f-bntsv"]
Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.449564 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx"
Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.457879 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.594789 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-dns-swift-storage-0\") pod \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") "
Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.606624 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" event={"ID":"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5","Type":"ContainerDied","Data":"9fddd6a921c141ab679a9e86abc0e17e535bb64a23e2d3297bebdfd0eb06753b"}
Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.620710 4702 scope.go:117] "RemoveContainer" containerID="c309e395ef4019bf0f57f59b85129ecab952875892c0541a9d9208601fddf934"
Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.606711 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx"
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-kmfbx" Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.622368 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5b6449bd57-h4c9q" event={"ID":"c11395bc-981a-4efd-9fe3-8b0c146f375e","Type":"ContainerStarted","Data":"c8e5b4be2b92df0ef2630249a5cd4d2d7e6c728398b28fa8068cee400e588fcc"} Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.627137 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-config\") pod \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.627594 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-ovsdbserver-nb\") pod \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.627769 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6kf7\" (UniqueName: \"kubernetes.io/projected/dd27c847-d92c-4adb-bd1a-d445ae5bd182-kube-api-access-w6kf7\") pod \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.627930 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-dns-svc\") pod \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.628064 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwdfz\" (UniqueName: \"kubernetes.io/projected/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-kube-api-access-kwdfz\") pod \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.628221 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-combined-ca-bundle\") pod \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.628327 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd27c847-d92c-4adb-bd1a-d445ae5bd182-run-httpd\") pod \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.628440 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-config-data\") pod \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.628527 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd27c847-d92c-4adb-bd1a-d445ae5bd182-log-httpd\") pod \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " Dec 03 
11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.628654 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-scripts\") pod \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.628904 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-ovsdbserver-sb\") pod \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\" (UID: \"a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5\") " Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.629039 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-sg-core-conf-yaml\") pod \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\" (UID: \"dd27c847-d92c-4adb-bd1a-d445ae5bd182\") " Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.635407 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd27c847-d92c-4adb-bd1a-d445ae5bd182-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dd27c847-d92c-4adb-bd1a-d445ae5bd182" (UID: "dd27c847-d92c-4adb-bd1a-d445ae5bd182"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.640143 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd27c847-d92c-4adb-bd1a-d445ae5bd182-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dd27c847-d92c-4adb-bd1a-d445ae5bd182" (UID: "dd27c847-d92c-4adb-bd1a-d445ae5bd182"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.650174 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-67bbbdff4c-p4d6b" event={"ID":"0babb7be-7f54-4458-b44a-707bfd530e40","Type":"ContainerStarted","Data":"5a726c16dfdd16020fe752214fa8a86d2405848c57001073dc67ff2799b8ffb9"} Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.650421 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-67bbbdff4c-p4d6b" podUID="0babb7be-7f54-4458-b44a-707bfd530e40" containerName="heat-api" containerID="cri-o://5a726c16dfdd16020fe752214fa8a86d2405848c57001073dc67ff2799b8ffb9" gracePeriod=60 Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.650790 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-67bbbdff4c-p4d6b" Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.692835 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-58868c9476-5hnsv"] Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.703978 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-67bbbdff4c-p4d6b" podStartSLOduration=4.151926774 podStartE2EDuration="22.703941905s" podCreationTimestamp="2025-12-03 11:30:52 +0000 UTC" firstStartedPulling="2025-12-03 11:30:54.214084879 +0000 UTC m=+1638.050013343" lastFinishedPulling="2025-12-03 11:31:12.76610001 +0000 UTC m=+1656.602028474" observedRunningTime="2025-12-03 11:31:14.685327955 +0000 UTC m=+1658.521256429" watchObservedRunningTime="2025-12-03 11:31:14.703941905 +0000 UTC m=+1658.539870369" Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.704120 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.704024 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd27c847-d92c-4adb-bd1a-d445ae5bd182","Type":"ContainerDied","Data":"4fee63618193cf31971e2d68b64d5cb9e10ac18d57e977cef600e48b797a4495"} Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.719202 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-868d57787f-bntsv" event={"ID":"57179545-ef9c-460f-9ef6-219c895dc9fa","Type":"ContainerStarted","Data":"a6687ad2a329c80aabf3e67344ecd5ee43c75242f103365f4cc0dabc496eb454"} Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.733039 4702 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd27c847-d92c-4adb-bd1a-d445ae5bd182-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.733084 4702 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd27c847-d92c-4adb-bd1a-d445ae5bd182-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.738479 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-scripts" (OuterVolumeSpecName: "scripts") pod "dd27c847-d92c-4adb-bd1a-d445ae5bd182" (UID: "dd27c847-d92c-4adb-bd1a-d445ae5bd182"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.738869 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-kube-api-access-kwdfz" (OuterVolumeSpecName: "kube-api-access-kwdfz") pod "a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5" (UID: "a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5"). InnerVolumeSpecName "kube-api-access-kwdfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.739049 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5b56468cc-dnldg" event={"ID":"d921fd19-6588-42dc-9f3a-8aadb96c5996","Type":"ContainerStarted","Data":"de7f62e53a2398a3d617a293302542bef4ef5e59d1f96edb59a93c74e8b671e7"} Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.748495 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" event={"ID":"d2dd9093-8cd4-4627-9019-8f1ec48a3b88","Type":"ContainerStarted","Data":"ed8192353a8e24d51a5100f448a7dc9fafc56909816663b856e0e043d99d18bd"} Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.749014 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" podUID="d2dd9093-8cd4-4627-9019-8f1ec48a3b88" containerName="heat-cfnapi" containerID="cri-o://ed8192353a8e24d51a5100f448a7dc9fafc56909816663b856e0e043d99d18bd" gracePeriod=60 Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.749717 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.754842 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd27c847-d92c-4adb-bd1a-d445ae5bd182-kube-api-access-w6kf7" (OuterVolumeSpecName: "kube-api-access-w6kf7") pod "dd27c847-d92c-4adb-bd1a-d445ae5bd182" (UID: "dd27c847-d92c-4adb-bd1a-d445ae5bd182"). InnerVolumeSpecName "kube-api-access-w6kf7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.758620 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5df49db6bf-cnm46"] Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.797343 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" podStartSLOduration=4.216712399 podStartE2EDuration="22.797314455s" podCreationTimestamp="2025-12-03 11:30:52 +0000 UTC" firstStartedPulling="2025-12-03 11:30:54.187818011 +0000 UTC m=+1638.023746475" lastFinishedPulling="2025-12-03 11:31:12.768420067 +0000 UTC m=+1656.604348531" observedRunningTime="2025-12-03 11:31:14.778346694 +0000 UTC m=+1658.614275178" watchObservedRunningTime="2025-12-03 11:31:14.797314455 +0000 UTC m=+1658.633242919" Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.797840 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-76f7956d4d-b967j"] Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.837713 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6kf7\" (UniqueName: \"kubernetes.io/projected/dd27c847-d92c-4adb-bd1a-d445ae5bd182-kube-api-access-w6kf7\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.837770 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwdfz\" (UniqueName: \"kubernetes.io/projected/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-kube-api-access-kwdfz\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.837783 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.900026 4702 scope.go:117] "RemoveContainer" containerID="d341171d8d95e0543b60ebce4ab006cf17df2ddb717cb5de82de7daa8cd256f9" Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.976929 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-config" (OuterVolumeSpecName: "config") pod "a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5" (UID: "a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:31:14 crc kubenswrapper[4702]: I1203 11:31:14.994557 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5" (UID: "a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.042071 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.042379 4702 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.051144 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dd27c847-d92c-4adb-bd1a-d445ae5bd182" (UID: "dd27c847-d92c-4adb-bd1a-d445ae5bd182"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.064556 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5" (UID: "a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.099406 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd27c847-d92c-4adb-bd1a-d445ae5bd182" (UID: "dd27c847-d92c-4adb-bd1a-d445ae5bd182"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.104445 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5" (UID: "a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.109364 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5" (UID: "a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.144351 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.144390 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.144399 4702 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.144408 4702 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.144416 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.183507 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-config-data" (OuterVolumeSpecName: "config-data") pod "dd27c847-d92c-4adb-bd1a-d445ae5bd182" (UID: "dd27c847-d92c-4adb-bd1a-d445ae5bd182"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.246432 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd27c847-d92c-4adb-bd1a-d445ae5bd182-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.295147 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-kmfbx"] Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.325525 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-kmfbx"] Dec 03 11:31:15 crc kubenswrapper[4702]: E1203 11:31:15.355029 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0babb7be_7f54_4458_b44a_707bfd530e40.slice/crio-5a726c16dfdd16020fe752214fa8a86d2405848c57001073dc67ff2799b8ffb9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6c7eaad_7121_41a5_b4b6_ea5e079ff1c5.slice\": RecentStats: unable to find data in memory cache]" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.380655 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.405486 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.428205 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:31:15 crc kubenswrapper[4702]: E1203 11:31:15.428979 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd27c847-d92c-4adb-bd1a-d445ae5bd182" containerName="ceilometer-notification-agent" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.429007 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd27c847-d92c-4adb-bd1a-d445ae5bd182" containerName="ceilometer-notification-agent" Dec 03 11:31:15 crc kubenswrapper[4702]: E1203 11:31:15.429050 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5" containerName="dnsmasq-dns" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.429060 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5" containerName="dnsmasq-dns" Dec 03 11:31:15 crc kubenswrapper[4702]: E1203 11:31:15.429080 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd27c847-d92c-4adb-bd1a-d445ae5bd182" containerName="ceilometer-central-agent" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.429088 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd27c847-d92c-4adb-bd1a-d445ae5bd182" containerName="ceilometer-central-agent" Dec 03 11:31:15 crc kubenswrapper[4702]: E1203 11:31:15.429119 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5" containerName="init" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.429127 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5" containerName="init" Dec 03 11:31:15 crc kubenswrapper[4702]: E1203 11:31:15.429139 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd27c847-d92c-4adb-bd1a-d445ae5bd182" containerName="proxy-httpd" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.429146 4702 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="dd27c847-d92c-4adb-bd1a-d445ae5bd182" containerName="proxy-httpd" Dec 03 11:31:15 crc kubenswrapper[4702]: E1203 11:31:15.429179 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd27c847-d92c-4adb-bd1a-d445ae5bd182" containerName="sg-core" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.429187 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd27c847-d92c-4adb-bd1a-d445ae5bd182" containerName="sg-core" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.429514 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5" containerName="dnsmasq-dns" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.429538 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd27c847-d92c-4adb-bd1a-d445ae5bd182" containerName="sg-core" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.429555 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd27c847-d92c-4adb-bd1a-d445ae5bd182" containerName="ceilometer-central-agent" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.429575 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd27c847-d92c-4adb-bd1a-d445ae5bd182" containerName="proxy-httpd" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.429591 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd27c847-d92c-4adb-bd1a-d445ae5bd182" containerName="ceilometer-notification-agent" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.432479 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.444337 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.444439 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.444595 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.463217 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-scripts\") pod \"ceilometer-0\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") " pod="openstack/ceilometer-0" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.463344 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") " pod="openstack/ceilometer-0" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.463390 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") " pod="openstack/ceilometer-0" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.463462 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2893bcd3-157b-47f2-92c2-40b61ad8e125-log-httpd\") pod 
\"ceilometer-0\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") " pod="openstack/ceilometer-0" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.463514 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2893bcd3-157b-47f2-92c2-40b61ad8e125-run-httpd\") pod \"ceilometer-0\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") " pod="openstack/ceilometer-0" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.463572 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-config-data\") pod \"ceilometer-0\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") " pod="openstack/ceilometer-0" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.463631 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9vcc\" (UniqueName: \"kubernetes.io/projected/2893bcd3-157b-47f2-92c2-40b61ad8e125-kube-api-access-x9vcc\") pod \"ceilometer-0\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") " pod="openstack/ceilometer-0" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.537427 4702 scope.go:117] "RemoveContainer" containerID="35896744cea89052cb0ba05b9992bc658a32384bd29f41f38b01194423c93106" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.566144 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-config-data\") pod \"ceilometer-0\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") " pod="openstack/ceilometer-0" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.566234 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9vcc\" (UniqueName: \"kubernetes.io/projected/2893bcd3-157b-47f2-92c2-40b61ad8e125-kube-api-access-x9vcc\") pod \"ceilometer-0\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") " pod="openstack/ceilometer-0" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.566343 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-scripts\") pod \"ceilometer-0\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") " pod="openstack/ceilometer-0" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.566443 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") " pod="openstack/ceilometer-0" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.566483 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") " pod="openstack/ceilometer-0" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.566565 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2893bcd3-157b-47f2-92c2-40b61ad8e125-log-httpd\") pod \"ceilometer-0\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") " pod="openstack/ceilometer-0" Dec 03 11:31:15 crc 
kubenswrapper[4702]: I1203 11:31:15.566628 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2893bcd3-157b-47f2-92c2-40b61ad8e125-run-httpd\") pod \"ceilometer-0\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") " pod="openstack/ceilometer-0" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.567236 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2893bcd3-157b-47f2-92c2-40b61ad8e125-run-httpd\") pod \"ceilometer-0\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") " pod="openstack/ceilometer-0" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.569470 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2893bcd3-157b-47f2-92c2-40b61ad8e125-log-httpd\") pod \"ceilometer-0\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") " pod="openstack/ceilometer-0" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.572342 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") " pod="openstack/ceilometer-0" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.572653 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-scripts\") pod \"ceilometer-0\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") " pod="openstack/ceilometer-0" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.574380 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") " pod="openstack/ceilometer-0" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.582455 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-config-data\") pod \"ceilometer-0\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") " pod="openstack/ceilometer-0" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.603239 4702 scope.go:117] "RemoveContainer" containerID="c0c822db59095d950975d679fd1651fd6b7ad558b8f7ee5fe87a39760e0f965f" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.605548 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9vcc\" (UniqueName: \"kubernetes.io/projected/2893bcd3-157b-47f2-92c2-40b61ad8e125-kube-api-access-x9vcc\") pod \"ceilometer-0\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") " pod="openstack/ceilometer-0" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.643147 4702 scope.go:117] "RemoveContainer" containerID="2efed19eb2629e378bd74dcc562721354e3d42023cae3aba86d187d5ea0b56dd" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.670187 4702 scope.go:117] "RemoveContainer" containerID="1136264585a2d16f4270d48cb3a9e7d82fd90f47ebde7d44b2ff6929d1ef7724" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.772275 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.776194 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5b6449bd57-h4c9q" event={"ID":"c11395bc-981a-4efd-9fe3-8b0c146f375e","Type":"ContainerStarted","Data":"e5a9100d2921a8e9494516757bf2f727566cf10e161f3d23bd914428e175bfbe"} Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.776253 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5b6449bd57-h4c9q" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.781021 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5b56468cc-dnldg" event={"ID":"d921fd19-6588-42dc-9f3a-8aadb96c5996","Type":"ContainerStarted","Data":"43e4067590228d6c4c544fa359b10de23e7e6d6ea28a34191965fcd7b8890b94"} Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.782269 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5b56468cc-dnldg" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.786544 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5df49db6bf-cnm46" event={"ID":"6fad49cd-d636-43cc-84f7-7c8e0774a93a","Type":"ContainerStarted","Data":"0f0e478ec360aa48bfda13635430903340e00a91028f0bddc950839af3f38432"} Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.796916 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5b6449bd57-h4c9q" podStartSLOduration=14.796887295 podStartE2EDuration="14.796887295s" podCreationTimestamp="2025-12-03 11:31:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:31:15.794833497 +0000 UTC m=+1659.630761981" watchObservedRunningTime="2025-12-03 11:31:15.796887295 +0000 UTC m=+1659.632815759" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.804945 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76f7956d4d-b967j" event={"ID":"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa","Type":"ContainerStarted","Data":"8cfdac583e30b25446aebedf4e8958b8a59d51bfa8357b88f1f55772ce4fe64a"} Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.828048 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5b56468cc-dnldg" podStartSLOduration=14.828019082 podStartE2EDuration="14.828019082s" podCreationTimestamp="2025-12-03 11:31:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:31:15.821320781 +0000 UTC m=+1659.657249255" watchObservedRunningTime="2025-12-03 11:31:15.828019082 +0000 UTC m=+1659.663947546" Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.832200 4702 generic.go:334] "Generic (PLEG): container finished" podID="d2dd9093-8cd4-4627-9019-8f1ec48a3b88" containerID="ed8192353a8e24d51a5100f448a7dc9fafc56909816663b856e0e043d99d18bd" exitCode=0 Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.832333 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" event={"ID":"d2dd9093-8cd4-4627-9019-8f1ec48a3b88","Type":"ContainerDied","Data":"ed8192353a8e24d51a5100f448a7dc9fafc56909816663b856e0e043d99d18bd"} Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.838732 4702 generic.go:334] "Generic (PLEG): container finished" podID="0babb7be-7f54-4458-b44a-707bfd530e40" 
containerID="5a726c16dfdd16020fe752214fa8a86d2405848c57001073dc67ff2799b8ffb9" exitCode=0 Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.838870 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-67bbbdff4c-p4d6b" event={"ID":"0babb7be-7f54-4458-b44a-707bfd530e40","Type":"ContainerDied","Data":"5a726c16dfdd16020fe752214fa8a86d2405848c57001073dc67ff2799b8ffb9"} Dec 03 11:31:15 crc kubenswrapper[4702]: I1203 11:31:15.846825 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-58868c9476-5hnsv" event={"ID":"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852","Type":"ContainerStarted","Data":"4d657cd1f5984aa69a02a0e558e8018930811ea1a137298708a4a5da20893108"} Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.784074 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.788185 4702 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.888904 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" event={"ID":"d2dd9093-8cd4-4627-9019-8f1ec48a3b88","Type":"ContainerDied","Data":"fe0fa362e2faedcdbce67517b1804fc4f64e1a010103de4d0332da748504f0b6"} Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.888970 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe0fa362e2faedcdbce67517b1804fc4f64e1a010103de4d0332da748504f0b6" Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.897750 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.900151 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-67bbbdff4c-p4d6b" event={"ID":"0babb7be-7f54-4458-b44a-707bfd530e40","Type":"ContainerDied","Data":"7eb5994fde7c2c67465d96a6a5d3b43e7da0409c78c751a91206344600cf83a7"} Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.900223 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7eb5994fde7c2c67465d96a6a5d3b43e7da0409c78c751a91206344600cf83a7" Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.902675 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-868d57787f-bntsv" event={"ID":"57179545-ef9c-460f-9ef6-219c895dc9fa","Type":"ContainerStarted","Data":"576d02c0e977fa357380e8dd7daaf04faf38d5fc4c3a762342d4ecb0b798729f"} Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.905315 4702 generic.go:334] "Generic (PLEG): container finished" podID="d921fd19-6588-42dc-9f3a-8aadb96c5996" containerID="43e4067590228d6c4c544fa359b10de23e7e6d6ea28a34191965fcd7b8890b94" exitCode=1 Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.905437 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5b56468cc-dnldg" event={"ID":"d921fd19-6588-42dc-9f3a-8aadb96c5996","Type":"ContainerDied","Data":"43e4067590228d6c4c544fa359b10de23e7e6d6ea28a34191965fcd7b8890b94"} Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.906211 4702 scope.go:117] "RemoveContainer" containerID="43e4067590228d6c4c544fa359b10de23e7e6d6ea28a34191965fcd7b8890b94" Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.910211 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76f7956d4d-b967j" 
event={"ID":"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa","Type":"ContainerStarted","Data":"78575d1b4cdcb62d54fbf3e36fbe1e08f77e1f1c00c756579b087203c7eeb1c3"} Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.911554 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.924461 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5df49db6bf-cnm46" event={"ID":"6fad49cd-d636-43cc-84f7-7c8e0774a93a","Type":"ContainerStarted","Data":"beb14e6e68c81110a37a5577e4ccedc4cc03bc33e7b8ba69d46f559f82443abe"} Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.924650 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.934375 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-67bbbdff4c-p4d6b" Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.950002 4702 generic.go:334] "Generic (PLEG): container finished" podID="c11395bc-981a-4efd-9fe3-8b0c146f375e" containerID="e5a9100d2921a8e9494516757bf2f727566cf10e161f3d23bd914428e175bfbe" exitCode=1 Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.951276 4702 scope.go:117] "RemoveContainer" containerID="e5a9100d2921a8e9494516757bf2f727566cf10e161f3d23bd914428e175bfbe" Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.951838 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5" path="/var/lib/kubelet/pods/a6c7eaad-7121-41a5-b4b6-ea5e079ff1c5/volumes" Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.954247 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd27c847-d92c-4adb-bd1a-d445ae5bd182" path="/var/lib/kubelet/pods/dd27c847-d92c-4adb-bd1a-d445ae5bd182/volumes" Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.960697 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-76f7956d4d-b967j" podStartSLOduration=11.960668872 podStartE2EDuration="11.960668872s" podCreationTimestamp="2025-12-03 11:31:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:31:16.957241985 +0000 UTC m=+1660.793170459" watchObservedRunningTime="2025-12-03 11:31:16.960668872 +0000 UTC m=+1660.796597346" Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.973368 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2893bcd3-157b-47f2-92c2-40b61ad8e125","Type":"ContainerStarted","Data":"12347b7e4700df6e9fa03e76a542876a9cfa983247dcd333f63d4403a38c5c11"} Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.973424 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5b6449bd57-h4c9q" event={"ID":"c11395bc-981a-4efd-9fe3-8b0c146f375e","Type":"ContainerDied","Data":"e5a9100d2921a8e9494516757bf2f727566cf10e161f3d23bd914428e175bfbe"} Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.990028 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-58868c9476-5hnsv" event={"ID":"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852","Type":"ContainerStarted","Data":"8d835acefa04010698685dc1464e59c5c6c1f4fe38644940ad96df853d62db55"} Dec 03 11:31:16 crc kubenswrapper[4702]: I1203 11:31:16.990383 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/heat-engine-58868c9476-5hnsv" Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.087510 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0babb7be-7f54-4458-b44a-707bfd530e40-config-data-custom\") pod \"0babb7be-7f54-4458-b44a-707bfd530e40\" (UID: \"0babb7be-7f54-4458-b44a-707bfd530e40\") " Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.087600 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-config-data\") pod \"d2dd9093-8cd4-4627-9019-8f1ec48a3b88\" (UID: \"d2dd9093-8cd4-4627-9019-8f1ec48a3b88\") " Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.087633 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwfcw\" (UniqueName: \"kubernetes.io/projected/0babb7be-7f54-4458-b44a-707bfd530e40-kube-api-access-fwfcw\") pod \"0babb7be-7f54-4458-b44a-707bfd530e40\" (UID: \"0babb7be-7f54-4458-b44a-707bfd530e40\") " Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.087684 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0babb7be-7f54-4458-b44a-707bfd530e40-config-data\") pod \"0babb7be-7f54-4458-b44a-707bfd530e40\" (UID: \"0babb7be-7f54-4458-b44a-707bfd530e40\") " Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.087717 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrqz4\" (UniqueName: \"kubernetes.io/projected/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-kube-api-access-wrqz4\") pod \"d2dd9093-8cd4-4627-9019-8f1ec48a3b88\" (UID: \"d2dd9093-8cd4-4627-9019-8f1ec48a3b88\") " Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.087751 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0babb7be-7f54-4458-b44a-707bfd530e40-combined-ca-bundle\") pod \"0babb7be-7f54-4458-b44a-707bfd530e40\" (UID: \"0babb7be-7f54-4458-b44a-707bfd530e40\") " Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.087842 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-combined-ca-bundle\") pod \"d2dd9093-8cd4-4627-9019-8f1ec48a3b88\" (UID: \"d2dd9093-8cd4-4627-9019-8f1ec48a3b88\") " Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.087937 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-config-data-custom\") pod \"d2dd9093-8cd4-4627-9019-8f1ec48a3b88\" (UID: \"d2dd9093-8cd4-4627-9019-8f1ec48a3b88\") " Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.130283 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5df49db6bf-cnm46" podStartSLOduration=12.130248512 podStartE2EDuration="12.130248512s" podCreationTimestamp="2025-12-03 11:31:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:31:17.10346977 +0000 UTC m=+1660.939398234" watchObservedRunningTime="2025-12-03 11:31:17.130248512 +0000 UTC m=+1660.966176976" Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.152108 4702 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0babb7be-7f54-4458-b44a-707bfd530e40-kube-api-access-fwfcw" (OuterVolumeSpecName: "kube-api-access-fwfcw") pod "0babb7be-7f54-4458-b44a-707bfd530e40" (UID: "0babb7be-7f54-4458-b44a-707bfd530e40"). InnerVolumeSpecName "kube-api-access-fwfcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.152305 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0babb7be-7f54-4458-b44a-707bfd530e40-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0babb7be-7f54-4458-b44a-707bfd530e40" (UID: "0babb7be-7f54-4458-b44a-707bfd530e40"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.167262 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-kube-api-access-wrqz4" (OuterVolumeSpecName: "kube-api-access-wrqz4") pod "d2dd9093-8cd4-4627-9019-8f1ec48a3b88" (UID: "d2dd9093-8cd4-4627-9019-8f1ec48a3b88"). InnerVolumeSpecName "kube-api-access-wrqz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.169448 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-58868c9476-5hnsv" podStartSLOduration=16.169419558 podStartE2EDuration="16.169419558s" podCreationTimestamp="2025-12-03 11:31:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:31:17.136179911 +0000 UTC m=+1660.972108375" watchObservedRunningTime="2025-12-03 11:31:17.169419558 +0000 UTC m=+1661.005348022" Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.169920 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d2dd9093-8cd4-4627-9019-8f1ec48a3b88" (UID: "d2dd9093-8cd4-4627-9019-8f1ec48a3b88"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.218737 4702 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.219112 4702 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0babb7be-7f54-4458-b44a-707bfd530e40-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.219128 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwfcw\" (UniqueName: \"kubernetes.io/projected/0babb7be-7f54-4458-b44a-707bfd530e40-kube-api-access-fwfcw\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.219187 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrqz4\" (UniqueName: \"kubernetes.io/projected/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-kube-api-access-wrqz4\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.238930 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2dd9093-8cd4-4627-9019-8f1ec48a3b88" (UID: "d2dd9093-8cd4-4627-9019-8f1ec48a3b88"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.255163 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0babb7be-7f54-4458-b44a-707bfd530e40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0babb7be-7f54-4458-b44a-707bfd530e40" (UID: "0babb7be-7f54-4458-b44a-707bfd530e40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.277107 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-5b6449bd57-h4c9q" Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.289851 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0babb7be-7f54-4458-b44a-707bfd530e40-config-data" (OuterVolumeSpecName: "config-data") pod "0babb7be-7f54-4458-b44a-707bfd530e40" (UID: "0babb7be-7f54-4458-b44a-707bfd530e40"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.312938 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-config-data" (OuterVolumeSpecName: "config-data") pod "d2dd9093-8cd4-4627-9019-8f1ec48a3b88" (UID: "d2dd9093-8cd4-4627-9019-8f1ec48a3b88"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.321513 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.321565 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0babb7be-7f54-4458-b44a-707bfd530e40-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.321581 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0babb7be-7f54-4458-b44a-707bfd530e40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.321598 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2dd9093-8cd4-4627-9019-8f1ec48a3b88-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:17 crc kubenswrapper[4702]: I1203 11:31:17.389518 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-5b56468cc-dnldg" Dec 03 11:31:18 crc kubenswrapper[4702]: I1203 11:31:18.031723 4702 generic.go:334] "Generic (PLEG): container finished" podID="c11395bc-981a-4efd-9fe3-8b0c146f375e" containerID="2a5212ddaf73d3c3dfcc91bfbecb0a2453ae8ea86173460dbe17a65dd34861c3" exitCode=1 Dec 03 11:31:18 crc kubenswrapper[4702]: I1203 11:31:18.032194 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5b6449bd57-h4c9q" event={"ID":"c11395bc-981a-4efd-9fe3-8b0c146f375e","Type":"ContainerDied","Data":"2a5212ddaf73d3c3dfcc91bfbecb0a2453ae8ea86173460dbe17a65dd34861c3"} Dec 03 11:31:18 crc kubenswrapper[4702]: I1203 11:31:18.032272 4702 scope.go:117] "RemoveContainer" containerID="e5a9100d2921a8e9494516757bf2f727566cf10e161f3d23bd914428e175bfbe" Dec 03 11:31:18 crc kubenswrapper[4702]: I1203 11:31:18.037509 4702 scope.go:117] "RemoveContainer" containerID="2a5212ddaf73d3c3dfcc91bfbecb0a2453ae8ea86173460dbe17a65dd34861c3" Dec 03 11:31:18 crc kubenswrapper[4702]: E1203 11:31:18.039519 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5b6449bd57-h4c9q_openstack(c11395bc-981a-4efd-9fe3-8b0c146f375e)\"" pod="openstack/heat-api-5b6449bd57-h4c9q" podUID="c11395bc-981a-4efd-9fe3-8b0c146f375e" Dec 03 11:31:18 crc kubenswrapper[4702]: I1203 11:31:18.066649 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-868d57787f-bntsv" event={"ID":"57179545-ef9c-460f-9ef6-219c895dc9fa","Type":"ContainerStarted","Data":"1065868f7909f5ae3d4cea35342c5df588bb8575da1e4d60ee35f281393c5874"} Dec 03 11:31:18 crc kubenswrapper[4702]: I1203 11:31:18.068264 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:18 crc kubenswrapper[4702]: I1203 11:31:18.068302 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-868d57787f-bntsv" Dec 03 11:31:18 crc kubenswrapper[4702]: I1203 11:31:18.100647 4702 generic.go:334] "Generic (PLEG): container finished" podID="d921fd19-6588-42dc-9f3a-8aadb96c5996" containerID="8631b44361897ed1e7ddc389dd830642c8f76ebcc23c4626aa39695e491831e6" 
Dec 03 11:31:18 crc kubenswrapper[4702]: I1203 11:31:18.100787 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5b56468cc-dnldg" event={"ID":"d921fd19-6588-42dc-9f3a-8aadb96c5996","Type":"ContainerDied","Data":"8631b44361897ed1e7ddc389dd830642c8f76ebcc23c4626aa39695e491831e6"} Dec 03 11:31:18 crc kubenswrapper[4702]: I1203 11:31:18.101789 4702 scope.go:117] "RemoveContainer" containerID="8631b44361897ed1e7ddc389dd830642c8f76ebcc23c4626aa39695e491831e6" Dec 03 11:31:18 crc kubenswrapper[4702]: E1203 11:31:18.102145 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5b56468cc-dnldg_openstack(d921fd19-6588-42dc-9f3a-8aadb96c5996)\"" pod="openstack/heat-cfnapi-5b56468cc-dnldg" podUID="d921fd19-6588-42dc-9f3a-8aadb96c5996" Dec 03 11:31:18 crc kubenswrapper[4702]: I1203 11:31:18.117489 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-76f9dbfbb9-sz9d7" Dec 03 11:31:18 crc kubenswrapper[4702]: I1203 11:31:18.118560 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2893bcd3-157b-47f2-92c2-40b61ad8e125","Type":"ContainerStarted","Data":"6ce965086fb8779f68e410329ec3e3f06a154a00370bb4ac998ede452c085b07"} Dec 03 11:31:18 crc kubenswrapper[4702]: I1203 11:31:18.120117 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-67bbbdff4c-p4d6b" Dec 03 11:31:18 crc kubenswrapper[4702]: I1203 11:31:18.123931 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-868d57787f-bntsv" podStartSLOduration=14.123901695 podStartE2EDuration="14.123901695s" podCreationTimestamp="2025-12-03 11:31:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:31:18.100390945 +0000 UTC m=+1661.936319429" watchObservedRunningTime="2025-12-03 11:31:18.123901695 +0000 UTC m=+1661.959830159" Dec 03 11:31:18 crc kubenswrapper[4702]: I1203 11:31:18.157124 4702 scope.go:117] "RemoveContainer" containerID="43e4067590228d6c4c544fa359b10de23e7e6d6ea28a34191965fcd7b8890b94" Dec 03 11:31:18 crc kubenswrapper[4702]: I1203 11:31:18.252859 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-67bbbdff4c-p4d6b"] Dec 03 11:31:18 crc kubenswrapper[4702]: I1203 11:31:18.266575 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-67bbbdff4c-p4d6b"] Dec 03 11:31:18 crc kubenswrapper[4702]: I1203 11:31:18.281876 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-76f9dbfbb9-sz9d7"] Dec 03 11:31:18 crc kubenswrapper[4702]: I1203 11:31:18.343186 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-76f9dbfbb9-sz9d7"] Dec 03 11:31:18 crc kubenswrapper[4702]: I1203 11:31:18.980375 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0babb7be-7f54-4458-b44a-707bfd530e40" path="/var/lib/kubelet/pods/0babb7be-7f54-4458-b44a-707bfd530e40/volumes" Dec 03 11:31:18 crc kubenswrapper[4702]: I1203 11:31:18.981264 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2dd9093-8cd4-4627-9019-8f1ec48a3b88" path="/var/lib/kubelet/pods/d2dd9093-8cd4-4627-9019-8f1ec48a3b88/volumes"
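The pod_startup_latency_tracker.go:104 entries carry two figures: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that same span minus the image-pull window (lastFinishedPulling minus firstStartedPulling). Zero-value timestamps ("0001-01-01 00:00:00 +0000 UTC"), as in the swift-proxy-868d57787f-bntsv entry just above, mean no image had to be pulled, so the two figures coincide. The arithmetic can be checked against the heat-cfnapi-76f9dbfbb9-sz9d7 entry earlier in the log (22.797314455s minus an 18.580602056s pull window gives 4.216712399s); a small sketch reproducing it:

package main

import (
	"fmt"
	"time"
)

// layout matches Go's default time.Time formatting used by these log fields.
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Values copied from the heat-cfnapi-76f9dbfbb9-sz9d7 startup entry above.
	created := mustParse("2025-12-03 11:30:52 +0000 UTC")
	firstPull := mustParse("2025-12-03 11:30:54.187818011 +0000 UTC")
	lastPull := mustParse("2025-12-03 11:31:12.768420067 +0000 UTC")
	observed := mustParse("2025-12-03 11:31:14.797314455 +0000 UTC")

	e2e := observed.Sub(created)
	slo := e2e - lastPull.Sub(firstPull) // E2E minus the image-pull window

	fmt.Println("podStartE2EDuration:", e2e) // 22.797314455s
	fmt.Println("podStartSLOduration:", slo) // 4.216712399s
}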
Dec 03 11:31:19 crc kubenswrapper[4702]: I1203 11:31:19.130413 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2893bcd3-157b-47f2-92c2-40b61ad8e125","Type":"ContainerStarted","Data":"7ca73c0787416d8a5bb6feb5d6a9524f7504c77473beb27f5e150ed8f5a78ed5"} Dec 03 11:31:19 crc kubenswrapper[4702]: I1203 11:31:19.133718 4702 scope.go:117] "RemoveContainer" containerID="2a5212ddaf73d3c3dfcc91bfbecb0a2453ae8ea86173460dbe17a65dd34861c3" Dec 03 11:31:19 crc kubenswrapper[4702]: E1203 11:31:19.134199 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5b6449bd57-h4c9q_openstack(c11395bc-981a-4efd-9fe3-8b0c146f375e)\"" pod="openstack/heat-api-5b6449bd57-h4c9q" podUID="c11395bc-981a-4efd-9fe3-8b0c146f375e" Dec 03 11:31:19 crc kubenswrapper[4702]: I1203 11:31:19.137968 4702 scope.go:117] "RemoveContainer" containerID="8631b44361897ed1e7ddc389dd830642c8f76ebcc23c4626aa39695e491831e6" Dec 03 11:31:19 crc kubenswrapper[4702]: E1203 11:31:19.138233 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5b56468cc-dnldg_openstack(d921fd19-6588-42dc-9f3a-8aadb96c5996)\"" pod="openstack/heat-cfnapi-5b56468cc-dnldg" podUID="d921fd19-6588-42dc-9f3a-8aadb96c5996" Dec 03 11:31:22 crc kubenswrapper[4702]: I1203 11:31:22.277117 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5b6449bd57-h4c9q" Dec 03 11:31:22 crc kubenswrapper[4702]: I1203 11:31:22.277563 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-5b6449bd57-h4c9q" Dec 03 11:31:22 crc kubenswrapper[4702]: I1203 11:31:22.278462 4702 scope.go:117] "RemoveContainer" containerID="2a5212ddaf73d3c3dfcc91bfbecb0a2453ae8ea86173460dbe17a65dd34861c3" Dec 03 11:31:22 crc kubenswrapper[4702]: E1203 11:31:22.278777 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5b6449bd57-h4c9q_openstack(c11395bc-981a-4efd-9fe3-8b0c146f375e)\"" pod="openstack/heat-api-5b6449bd57-h4c9q" podUID="c11395bc-981a-4efd-9fe3-8b0c146f375e" Dec 03 11:31:22 crc kubenswrapper[4702]: I1203 11:31:22.292891 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2893bcd3-157b-47f2-92c2-40b61ad8e125","Type":"ContainerStarted","Data":"ae8d6f6a79a182dd1cdf4cf4d345ef1e98c11b771ead7f30d6cccc76f6815a7d"} Dec 03 11:31:22 crc kubenswrapper[4702]: I1203 11:31:22.391917 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5b56468cc-dnldg" Dec 03 11:31:22 crc kubenswrapper[4702]: I1203 11:31:22.393131 4702 scope.go:117] "RemoveContainer" containerID="8631b44361897ed1e7ddc389dd830642c8f76ebcc23c4626aa39695e491831e6" Dec 03 11:31:22 crc kubenswrapper[4702]: E1203 11:31:22.393530 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5b56468cc-dnldg_openstack(d921fd19-6588-42dc-9f3a-8aadb96c5996)\"" pod="openstack/heat-cfnapi-5b56468cc-dnldg" podUID="d921fd19-6588-42dc-9f3a-8aadb96c5996" Dec 03 11:31:22 crc kubenswrapper[4702]: I1203 11:31:22.394870 4702
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-5b56468cc-dnldg"
Dec 03 11:31:23 crc kubenswrapper[4702]: I1203 11:31:23.317972 4702 scope.go:117] "RemoveContainer" containerID="8631b44361897ed1e7ddc389dd830642c8f76ebcc23c4626aa39695e491831e6"
Dec 03 11:31:23 crc kubenswrapper[4702]: E1203 11:31:23.318385 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5b56468cc-dnldg_openstack(d921fd19-6588-42dc-9f3a-8aadb96c5996)\"" pod="openstack/heat-cfnapi-5b56468cc-dnldg" podUID="d921fd19-6588-42dc-9f3a-8aadb96c5996"
Dec 03 11:31:24 crc kubenswrapper[4702]: I1203 11:31:24.315552 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-76f7956d4d-b967j"
Dec 03 11:31:24 crc kubenswrapper[4702]: I1203 11:31:24.332170 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2893bcd3-157b-47f2-92c2-40b61ad8e125","Type":"ContainerStarted","Data":"d08197bea1627cfb1b1301a190dae14c15e95680afa6c9bd740e4cfe27d49cd2"}
Dec 03 11:31:24 crc kubenswrapper[4702]: I1203 11:31:24.332427 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 03 11:31:24 crc kubenswrapper[4702]: I1203 11:31:24.360282 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-5df49db6bf-cnm46"
Dec 03 11:31:24 crc kubenswrapper[4702]: I1203 11:31:24.397985 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5b56468cc-dnldg"]
Dec 03 11:31:24 crc kubenswrapper[4702]: I1203 11:31:24.402553 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.901416858 podStartE2EDuration="9.402529797s" podCreationTimestamp="2025-12-03 11:31:15 +0000 UTC" firstStartedPulling="2025-12-03 11:31:16.787875412 +0000 UTC m=+1660.623803876" lastFinishedPulling="2025-12-03 11:31:23.288988351 +0000 UTC m=+1667.124916815" observedRunningTime="2025-12-03 11:31:24.378856612 +0000 UTC m=+1668.214785086" watchObservedRunningTime="2025-12-03 11:31:24.402529797 +0000 UTC m=+1668.238458271"
Dec 03 11:31:24 crc kubenswrapper[4702]: I1203 11:31:24.507845 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5b6449bd57-h4c9q"]
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.054612 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-868d57787f-bntsv"
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.062470 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-868d57787f-bntsv"
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.625277 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5b6449bd57-h4c9q"
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.629953 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5b56468cc-dnldg"
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.742852 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtf45\" (UniqueName: \"kubernetes.io/projected/d921fd19-6588-42dc-9f3a-8aadb96c5996-kube-api-access-dtf45\") pod \"d921fd19-6588-42dc-9f3a-8aadb96c5996\" (UID: \"d921fd19-6588-42dc-9f3a-8aadb96c5996\") "
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.743254 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d921fd19-6588-42dc-9f3a-8aadb96c5996-config-data\") pod \"d921fd19-6588-42dc-9f3a-8aadb96c5996\" (UID: \"d921fd19-6588-42dc-9f3a-8aadb96c5996\") "
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.743375 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p77j\" (UniqueName: \"kubernetes.io/projected/c11395bc-981a-4efd-9fe3-8b0c146f375e-kube-api-access-5p77j\") pod \"c11395bc-981a-4efd-9fe3-8b0c146f375e\" (UID: \"c11395bc-981a-4efd-9fe3-8b0c146f375e\") "
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.743409 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c11395bc-981a-4efd-9fe3-8b0c146f375e-config-data-custom\") pod \"c11395bc-981a-4efd-9fe3-8b0c146f375e\" (UID: \"c11395bc-981a-4efd-9fe3-8b0c146f375e\") "
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.743592 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11395bc-981a-4efd-9fe3-8b0c146f375e-config-data\") pod \"c11395bc-981a-4efd-9fe3-8b0c146f375e\" (UID: \"c11395bc-981a-4efd-9fe3-8b0c146f375e\") "
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.743651 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11395bc-981a-4efd-9fe3-8b0c146f375e-combined-ca-bundle\") pod \"c11395bc-981a-4efd-9fe3-8b0c146f375e\" (UID: \"c11395bc-981a-4efd-9fe3-8b0c146f375e\") "
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.743692 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d921fd19-6588-42dc-9f3a-8aadb96c5996-combined-ca-bundle\") pod \"d921fd19-6588-42dc-9f3a-8aadb96c5996\" (UID: \"d921fd19-6588-42dc-9f3a-8aadb96c5996\") "
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.743810 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d921fd19-6588-42dc-9f3a-8aadb96c5996-config-data-custom\") pod \"d921fd19-6588-42dc-9f3a-8aadb96c5996\" (UID: \"d921fd19-6588-42dc-9f3a-8aadb96c5996\") "
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.757027 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c11395bc-981a-4efd-9fe3-8b0c146f375e-kube-api-access-5p77j" (OuterVolumeSpecName: "kube-api-access-5p77j") pod "c11395bc-981a-4efd-9fe3-8b0c146f375e" (UID: "c11395bc-981a-4efd-9fe3-8b0c146f375e"). InnerVolumeSpecName "kube-api-access-5p77j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.756752 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d921fd19-6588-42dc-9f3a-8aadb96c5996-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d921fd19-6588-42dc-9f3a-8aadb96c5996" (UID: "d921fd19-6588-42dc-9f3a-8aadb96c5996"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.758784 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c11395bc-981a-4efd-9fe3-8b0c146f375e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c11395bc-981a-4efd-9fe3-8b0c146f375e" (UID: "c11395bc-981a-4efd-9fe3-8b0c146f375e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.759587 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d921fd19-6588-42dc-9f3a-8aadb96c5996-kube-api-access-dtf45" (OuterVolumeSpecName: "kube-api-access-dtf45") pod "d921fd19-6588-42dc-9f3a-8aadb96c5996" (UID: "d921fd19-6588-42dc-9f3a-8aadb96c5996"). InnerVolumeSpecName "kube-api-access-dtf45". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.813266 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d921fd19-6588-42dc-9f3a-8aadb96c5996-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d921fd19-6588-42dc-9f3a-8aadb96c5996" (UID: "d921fd19-6588-42dc-9f3a-8aadb96c5996"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.820985 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c11395bc-981a-4efd-9fe3-8b0c146f375e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c11395bc-981a-4efd-9fe3-8b0c146f375e" (UID: "c11395bc-981a-4efd-9fe3-8b0c146f375e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.888814 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c11395bc-981a-4efd-9fe3-8b0c146f375e-config-data" (OuterVolumeSpecName: "config-data") pod "c11395bc-981a-4efd-9fe3-8b0c146f375e" (UID: "c11395bc-981a-4efd-9fe3-8b0c146f375e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.889554 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11395bc-981a-4efd-9fe3-8b0c146f375e-config-data\") pod \"c11395bc-981a-4efd-9fe3-8b0c146f375e\" (UID: \"c11395bc-981a-4efd-9fe3-8b0c146f375e\") "
Dec 03 11:31:25 crc kubenswrapper[4702]: W1203 11:31:25.890018 4702 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/c11395bc-981a-4efd-9fe3-8b0c146f375e/volumes/kubernetes.io~secret/config-data
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.890133 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c11395bc-981a-4efd-9fe3-8b0c146f375e-config-data" (OuterVolumeSpecName: "config-data") pod "c11395bc-981a-4efd-9fe3-8b0c146f375e" (UID: "c11395bc-981a-4efd-9fe3-8b0c146f375e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.893149 4702 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d921fd19-6588-42dc-9f3a-8aadb96c5996-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.893189 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtf45\" (UniqueName: \"kubernetes.io/projected/d921fd19-6588-42dc-9f3a-8aadb96c5996-kube-api-access-dtf45\") on node \"crc\" DevicePath \"\""
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.893202 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p77j\" (UniqueName: \"kubernetes.io/projected/c11395bc-981a-4efd-9fe3-8b0c146f375e-kube-api-access-5p77j\") on node \"crc\" DevicePath \"\""
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.893211 4702 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c11395bc-981a-4efd-9fe3-8b0c146f375e-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.893259 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c11395bc-981a-4efd-9fe3-8b0c146f375e-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.893271 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c11395bc-981a-4efd-9fe3-8b0c146f375e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.893280 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d921fd19-6588-42dc-9f3a-8aadb96c5996-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.910263 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.910322 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.947405 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d921fd19-6588-42dc-9f3a-8aadb96c5996-config-data" (OuterVolumeSpecName: "config-data") pod "d921fd19-6588-42dc-9f3a-8aadb96c5996" (UID: "d921fd19-6588-42dc-9f3a-8aadb96c5996"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:31:25 crc kubenswrapper[4702]: I1203 11:31:25.996822 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d921fd19-6588-42dc-9f3a-8aadb96c5996-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 11:31:26 crc kubenswrapper[4702]: I1203 11:31:26.699631 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5b56468cc-dnldg" event={"ID":"d921fd19-6588-42dc-9f3a-8aadb96c5996","Type":"ContainerDied","Data":"de7f62e53a2398a3d617a293302542bef4ef5e59d1f96edb59a93c74e8b671e7"}
Dec 03 11:31:26 crc kubenswrapper[4702]: I1203 11:31:26.700024 4702 scope.go:117] "RemoveContainer" containerID="8631b44361897ed1e7ddc389dd830642c8f76ebcc23c4626aa39695e491831e6"
Dec 03 11:31:26 crc kubenswrapper[4702]: I1203 11:31:26.703606 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5b56468cc-dnldg"
Dec 03 11:31:26 crc kubenswrapper[4702]: I1203 11:31:26.752073 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5b6449bd57-h4c9q" event={"ID":"c11395bc-981a-4efd-9fe3-8b0c146f375e","Type":"ContainerDied","Data":"c8e5b4be2b92df0ef2630249a5cd4d2d7e6c728398b28fa8068cee400e588fcc"}
Dec 03 11:31:26 crc kubenswrapper[4702]: I1203 11:31:26.752217 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5b6449bd57-h4c9q"
Dec 03 11:31:26 crc kubenswrapper[4702]: I1203 11:31:26.798362 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5b56468cc-dnldg"]
Dec 03 11:31:26 crc kubenswrapper[4702]: I1203 11:31:26.815648 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-5b56468cc-dnldg"]
Dec 03 11:31:26 crc kubenswrapper[4702]: I1203 11:31:26.909886 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5b6449bd57-h4c9q"]
Dec 03 11:31:26 crc kubenswrapper[4702]: I1203 11:31:26.939235 4702 scope.go:117] "RemoveContainer" containerID="2a5212ddaf73d3c3dfcc91bfbecb0a2453ae8ea86173460dbe17a65dd34861c3"
Dec 03 11:31:26 crc kubenswrapper[4702]: I1203 11:31:26.987530 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d921fd19-6588-42dc-9f3a-8aadb96c5996" path="/var/lib/kubelet/pods/d921fd19-6588-42dc-9f3a-8aadb96c5996/volumes"
Dec 03 11:31:26 crc kubenswrapper[4702]: I1203 11:31:26.994566 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5b6449bd57-h4c9q"]
Dec 03 11:31:27 crc kubenswrapper[4702]: I1203 11:31:27.823346 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1b30e6f6-6da6-48ea-8e02-873d566d7719","Type":"ContainerStarted","Data":"41a8b6dd81f24d85123084d63dcb263df60588f1603b9ed2be4ee4ea8f3ad1ad"}
Dec 03 11:31:27 crc kubenswrapper[4702]: I1203 11:31:27.876389 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.155160873 podStartE2EDuration="39.876360821s" podCreationTimestamp="2025-12-03 11:30:48 +0000 UTC" firstStartedPulling="2025-12-03 11:30:50.228478149 +0000 UTC m=+1634.064406613" lastFinishedPulling="2025-12-03 11:31:26.949678097 +0000 UTC m=+1670.785606561" observedRunningTime="2025-12-03 11:31:27.853684975 +0000 UTC m=+1671.689613439" watchObservedRunningTime="2025-12-03 11:31:27.876360821 +0000 UTC m=+1671.712289285"
Dec 03 11:31:28 crc kubenswrapper[4702]: I1203 11:31:28.944004 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c11395bc-981a-4efd-9fe3-8b0c146f375e" path="/var/lib/kubelet/pods/c11395bc-981a-4efd-9fe3-8b0c146f375e/volumes"
Dec 03 11:31:29 crc kubenswrapper[4702]: I1203 11:31:29.479017 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 11:31:29 crc kubenswrapper[4702]: I1203 11:31:29.479425 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2893bcd3-157b-47f2-92c2-40b61ad8e125" containerName="ceilometer-central-agent" containerID="cri-o://6ce965086fb8779f68e410329ec3e3f06a154a00370bb4ac998ede452c085b07" gracePeriod=30
Dec 03 11:31:29 crc kubenswrapper[4702]: I1203 11:31:29.479575 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2893bcd3-157b-47f2-92c2-40b61ad8e125" containerName="ceilometer-notification-agent" containerID="cri-o://7ca73c0787416d8a5bb6feb5d6a9524f7504c77473beb27f5e150ed8f5a78ed5" gracePeriod=30
Dec 03 11:31:29 crc kubenswrapper[4702]: I1203 11:31:29.479654 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2893bcd3-157b-47f2-92c2-40b61ad8e125" containerName="sg-core" containerID="cri-o://ae8d6f6a79a182dd1cdf4cf4d345ef1e98c11b771ead7f30d6cccc76f6815a7d" gracePeriod=30
Dec 03 11:31:29 crc kubenswrapper[4702]: I1203 11:31:29.479705 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2893bcd3-157b-47f2-92c2-40b61ad8e125" containerName="proxy-httpd" containerID="cri-o://d08197bea1627cfb1b1301a190dae14c15e95680afa6c9bd740e4cfe27d49cd2" gracePeriod=30
Dec 03 11:31:29 crc kubenswrapper[4702]: I1203 11:31:29.860374 4702 generic.go:334] "Generic (PLEG): container finished" podID="2893bcd3-157b-47f2-92c2-40b61ad8e125" containerID="ae8d6f6a79a182dd1cdf4cf4d345ef1e98c11b771ead7f30d6cccc76f6815a7d" exitCode=2
Dec 03 11:31:29 crc kubenswrapper[4702]: I1203 11:31:29.860452 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2893bcd3-157b-47f2-92c2-40b61ad8e125","Type":"ContainerDied","Data":"ae8d6f6a79a182dd1cdf4cf4d345ef1e98c11b771ead7f30d6cccc76f6815a7d"}
Dec 03 11:31:30 crc kubenswrapper[4702]: I1203 11:31:30.885362 4702 generic.go:334] "Generic (PLEG): container finished" podID="2893bcd3-157b-47f2-92c2-40b61ad8e125" containerID="d08197bea1627cfb1b1301a190dae14c15e95680afa6c9bd740e4cfe27d49cd2" exitCode=0
Dec 03 11:31:30 crc kubenswrapper[4702]: I1203 11:31:30.885708 4702 generic.go:334] "Generic (PLEG): container finished" podID="2893bcd3-157b-47f2-92c2-40b61ad8e125" containerID="7ca73c0787416d8a5bb6feb5d6a9524f7504c77473beb27f5e150ed8f5a78ed5" exitCode=0
Dec 03 11:31:30 crc kubenswrapper[4702]: I1203 11:31:30.885438 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2893bcd3-157b-47f2-92c2-40b61ad8e125","Type":"ContainerDied","Data":"d08197bea1627cfb1b1301a190dae14c15e95680afa6c9bd740e4cfe27d49cd2"}
Dec 03 11:31:30 crc kubenswrapper[4702]: I1203 11:31:30.885778 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2893bcd3-157b-47f2-92c2-40b61ad8e125","Type":"ContainerDied","Data":"7ca73c0787416d8a5bb6feb5d6a9524f7504c77473beb27f5e150ed8f5a78ed5"}
Dec 03 11:31:32 crc kubenswrapper[4702]: I1203 11:31:32.507580 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-58868c9476-5hnsv"
Dec 03 11:31:32 crc kubenswrapper[4702]: I1203 11:31:32.576189 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-7bdf6cdbff-lzm8t"]
Dec 03 11:31:32 crc kubenswrapper[4702]: I1203 11:31:32.576522 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-7bdf6cdbff-lzm8t" podUID="a2cf5990-eb05-4167-9d52-83186278f986" containerName="heat-engine" containerID="cri-o://899fa405f3f2ffc578c63afaae042c6941dbfc515c00fbea8f061bb095bdc6fa" gracePeriod=60
Dec 03 11:31:37 crc kubenswrapper[4702]: I1203 11:31:37.644884 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 03 11:31:37 crc kubenswrapper[4702]: I1203 11:31:37.647055 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ed571ff0-25b3-4f67-8403-c432700a8f49" containerName="glance-log" containerID="cri-o://8c8d28104afb66581b5417efcdb44cbb8650878e8ff071009afaab0a819f8a0a" gracePeriod=30
Dec 03 11:31:37 crc kubenswrapper[4702]: I1203 11:31:37.647107 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ed571ff0-25b3-4f67-8403-c432700a8f49" containerName="glance-httpd" containerID="cri-o://cfc85875c4a77cdd1ec4a77059a5aba6ea95607de35294a53136260dc568876a" gracePeriod=30
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.018319 4702 generic.go:334] "Generic (PLEG): container finished" podID="ed571ff0-25b3-4f67-8403-c432700a8f49" containerID="8c8d28104afb66581b5417efcdb44cbb8650878e8ff071009afaab0a819f8a0a" exitCode=143
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.018676 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed571ff0-25b3-4f67-8403-c432700a8f49","Type":"ContainerDied","Data":"8c8d28104afb66581b5417efcdb44cbb8650878e8ff071009afaab0a819f8a0a"}
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.045145 4702 generic.go:334] "Generic (PLEG): container finished" podID="2893bcd3-157b-47f2-92c2-40b61ad8e125" containerID="6ce965086fb8779f68e410329ec3e3f06a154a00370bb4ac998ede452c085b07" exitCode=0
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.045217 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2893bcd3-157b-47f2-92c2-40b61ad8e125","Type":"ContainerDied","Data":"6ce965086fb8779f68e410329ec3e3f06a154a00370bb4ac998ede452c085b07"}
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.310881 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.316609 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2893bcd3-157b-47f2-92c2-40b61ad8e125-log-httpd\") pod \"2893bcd3-157b-47f2-92c2-40b61ad8e125\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") "
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.316741 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-config-data\") pod \"2893bcd3-157b-47f2-92c2-40b61ad8e125\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") "
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.317100 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-sg-core-conf-yaml\") pod \"2893bcd3-157b-47f2-92c2-40b61ad8e125\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") "
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.317133 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-scripts\") pod \"2893bcd3-157b-47f2-92c2-40b61ad8e125\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") "
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.317152 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-combined-ca-bundle\") pod \"2893bcd3-157b-47f2-92c2-40b61ad8e125\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") "
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.317186 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9vcc\" (UniqueName: \"kubernetes.io/projected/2893bcd3-157b-47f2-92c2-40b61ad8e125-kube-api-access-x9vcc\") pod \"2893bcd3-157b-47f2-92c2-40b61ad8e125\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") "
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.317245 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2893bcd3-157b-47f2-92c2-40b61ad8e125-run-httpd\") pod \"2893bcd3-157b-47f2-92c2-40b61ad8e125\" (UID: \"2893bcd3-157b-47f2-92c2-40b61ad8e125\") "
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.318495 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2893bcd3-157b-47f2-92c2-40b61ad8e125-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2893bcd3-157b-47f2-92c2-40b61ad8e125" (UID: "2893bcd3-157b-47f2-92c2-40b61ad8e125"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.318967 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2893bcd3-157b-47f2-92c2-40b61ad8e125-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2893bcd3-157b-47f2-92c2-40b61ad8e125" (UID: "2893bcd3-157b-47f2-92c2-40b61ad8e125"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.335328 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-scripts" (OuterVolumeSpecName: "scripts") pod "2893bcd3-157b-47f2-92c2-40b61ad8e125" (UID: "2893bcd3-157b-47f2-92c2-40b61ad8e125"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.339259 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2893bcd3-157b-47f2-92c2-40b61ad8e125-kube-api-access-x9vcc" (OuterVolumeSpecName: "kube-api-access-x9vcc") pod "2893bcd3-157b-47f2-92c2-40b61ad8e125" (UID: "2893bcd3-157b-47f2-92c2-40b61ad8e125"). InnerVolumeSpecName "kube-api-access-x9vcc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.385951 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2893bcd3-157b-47f2-92c2-40b61ad8e125" (UID: "2893bcd3-157b-47f2-92c2-40b61ad8e125"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.422475 4702 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.422520 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.422540 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9vcc\" (UniqueName: \"kubernetes.io/projected/2893bcd3-157b-47f2-92c2-40b61ad8e125-kube-api-access-x9vcc\") on node \"crc\" DevicePath \"\""
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.422556 4702 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2893bcd3-157b-47f2-92c2-40b61ad8e125-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.422568 4702 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2893bcd3-157b-47f2-92c2-40b61ad8e125-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.640984 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2893bcd3-157b-47f2-92c2-40b61ad8e125" (UID: "2893bcd3-157b-47f2-92c2-40b61ad8e125"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.648875 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-config-data" (OuterVolumeSpecName: "config-data") pod "2893bcd3-157b-47f2-92c2-40b61ad8e125" (UID: "2893bcd3-157b-47f2-92c2-40b61ad8e125"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.680830 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-ht9tr"]
Dec 03 11:31:38 crc kubenswrapper[4702]: E1203 11:31:38.681562 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2893bcd3-157b-47f2-92c2-40b61ad8e125" containerName="sg-core"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.681591 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2893bcd3-157b-47f2-92c2-40b61ad8e125" containerName="sg-core"
Dec 03 11:31:38 crc kubenswrapper[4702]: E1203 11:31:38.681610 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c11395bc-981a-4efd-9fe3-8b0c146f375e" containerName="heat-api"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.681619 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="c11395bc-981a-4efd-9fe3-8b0c146f375e" containerName="heat-api"
Dec 03 11:31:38 crc kubenswrapper[4702]: E1203 11:31:38.681629 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d921fd19-6588-42dc-9f3a-8aadb96c5996" containerName="heat-cfnapi"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.681636 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="d921fd19-6588-42dc-9f3a-8aadb96c5996" containerName="heat-cfnapi"
Dec 03 11:31:38 crc kubenswrapper[4702]: E1203 11:31:38.681653 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2893bcd3-157b-47f2-92c2-40b61ad8e125" containerName="proxy-httpd"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.681661 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2893bcd3-157b-47f2-92c2-40b61ad8e125" containerName="proxy-httpd"
Dec 03 11:31:38 crc kubenswrapper[4702]: E1203 11:31:38.681676 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0babb7be-7f54-4458-b44a-707bfd530e40" containerName="heat-api"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.681682 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="0babb7be-7f54-4458-b44a-707bfd530e40" containerName="heat-api"
Dec 03 11:31:38 crc kubenswrapper[4702]: E1203 11:31:38.681695 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d921fd19-6588-42dc-9f3a-8aadb96c5996" containerName="heat-cfnapi"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.681702 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="d921fd19-6588-42dc-9f3a-8aadb96c5996" containerName="heat-cfnapi"
Dec 03 11:31:38 crc kubenswrapper[4702]: E1203 11:31:38.681717 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2dd9093-8cd4-4627-9019-8f1ec48a3b88" containerName="heat-cfnapi"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.681725 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2dd9093-8cd4-4627-9019-8f1ec48a3b88" containerName="heat-cfnapi"
Dec 03 11:31:38 crc kubenswrapper[4702]: E1203 11:31:38.681740 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2893bcd3-157b-47f2-92c2-40b61ad8e125" containerName="ceilometer-notification-agent"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.681748 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2893bcd3-157b-47f2-92c2-40b61ad8e125" containerName="ceilometer-notification-agent"
Dec 03 11:31:38 crc kubenswrapper[4702]: E1203 11:31:38.681800 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2893bcd3-157b-47f2-92c2-40b61ad8e125" containerName="ceilometer-central-agent"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.681811 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2893bcd3-157b-47f2-92c2-40b61ad8e125" containerName="ceilometer-central-agent"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.682088 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="d921fd19-6588-42dc-9f3a-8aadb96c5996" containerName="heat-cfnapi"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.682104 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="c11395bc-981a-4efd-9fe3-8b0c146f375e" containerName="heat-api"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.682125 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="2893bcd3-157b-47f2-92c2-40b61ad8e125" containerName="proxy-httpd"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.682139 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="c11395bc-981a-4efd-9fe3-8b0c146f375e" containerName="heat-api"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.682152 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="d921fd19-6588-42dc-9f3a-8aadb96c5996" containerName="heat-cfnapi"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.682175 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2dd9093-8cd4-4627-9019-8f1ec48a3b88" containerName="heat-cfnapi"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.682190 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="2893bcd3-157b-47f2-92c2-40b61ad8e125" containerName="ceilometer-notification-agent"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.682209 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="0babb7be-7f54-4458-b44a-707bfd530e40" containerName="heat-api"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.682221 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="2893bcd3-157b-47f2-92c2-40b61ad8e125" containerName="sg-core"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.682241 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="2893bcd3-157b-47f2-92c2-40b61ad8e125" containerName="ceilometer-central-agent"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.683379 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ht9tr"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.696671 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.696716 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2893bcd3-157b-47f2-92c2-40b61ad8e125-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.734073 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-ht9tr"]
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.801785 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snwmm\" (UniqueName: \"kubernetes.io/projected/45a1ae25-65af-406f-8c00-4f7b228b2876-kube-api-access-snwmm\") pod \"nova-api-db-create-ht9tr\" (UID: \"45a1ae25-65af-406f-8c00-4f7b228b2876\") " pod="openstack/nova-api-db-create-ht9tr"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.802193 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45a1ae25-65af-406f-8c00-4f7b228b2876-operator-scripts\") pod \"nova-api-db-create-ht9tr\" (UID: \"45a1ae25-65af-406f-8c00-4f7b228b2876\") " pod="openstack/nova-api-db-create-ht9tr"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.825197 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-jt55p"]
Dec 03 11:31:38 crc kubenswrapper[4702]: E1203 11:31:38.825896 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c11395bc-981a-4efd-9fe3-8b0c146f375e" containerName="heat-api"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.825912 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="c11395bc-981a-4efd-9fe3-8b0c146f375e" containerName="heat-api"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.827046 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jt55p"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.836306 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-jt55p"]
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.909162 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45a1ae25-65af-406f-8c00-4f7b228b2876-operator-scripts\") pod \"nova-api-db-create-ht9tr\" (UID: \"45a1ae25-65af-406f-8c00-4f7b228b2876\") " pod="openstack/nova-api-db-create-ht9tr"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.909462 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db35177e-78d7-4761-a3c9-cd6bfafce10b-operator-scripts\") pod \"nova-cell0-db-create-jt55p\" (UID: \"db35177e-78d7-4761-a3c9-cd6bfafce10b\") " pod="openstack/nova-cell0-db-create-jt55p"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.909550 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f59rh\" (UniqueName: \"kubernetes.io/projected/db35177e-78d7-4761-a3c9-cd6bfafce10b-kube-api-access-f59rh\") pod \"nova-cell0-db-create-jt55p\" (UID: \"db35177e-78d7-4761-a3c9-cd6bfafce10b\") " pod="openstack/nova-cell0-db-create-jt55p"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.909772 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snwmm\" (UniqueName: \"kubernetes.io/projected/45a1ae25-65af-406f-8c00-4f7b228b2876-kube-api-access-snwmm\") pod \"nova-api-db-create-ht9tr\" (UID: \"45a1ae25-65af-406f-8c00-4f7b228b2876\") " pod="openstack/nova-api-db-create-ht9tr"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.911299 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45a1ae25-65af-406f-8c00-4f7b228b2876-operator-scripts\") pod \"nova-api-db-create-ht9tr\" (UID: \"45a1ae25-65af-406f-8c00-4f7b228b2876\") " pod="openstack/nova-api-db-create-ht9tr"
Dec 03 11:31:38 crc kubenswrapper[4702]: I1203 11:31:38.976096 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snwmm\" (UniqueName: \"kubernetes.io/projected/45a1ae25-65af-406f-8c00-4f7b228b2876-kube-api-access-snwmm\") pod \"nova-api-db-create-ht9tr\" (UID: \"45a1ae25-65af-406f-8c00-4f7b228b2876\") " pod="openstack/nova-api-db-create-ht9tr"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.012155 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db35177e-78d7-4761-a3c9-cd6bfafce10b-operator-scripts\") pod \"nova-cell0-db-create-jt55p\" (UID: \"db35177e-78d7-4761-a3c9-cd6bfafce10b\") " pod="openstack/nova-cell0-db-create-jt55p"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.012225 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f59rh\" (UniqueName: \"kubernetes.io/projected/db35177e-78d7-4761-a3c9-cd6bfafce10b-kube-api-access-f59rh\") pod \"nova-cell0-db-create-jt55p\" (UID: \"db35177e-78d7-4761-a3c9-cd6bfafce10b\") " pod="openstack/nova-cell0-db-create-jt55p"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.013601 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db35177e-78d7-4761-a3c9-cd6bfafce10b-operator-scripts\") pod \"nova-cell0-db-create-jt55p\" (UID: \"db35177e-78d7-4761-a3c9-cd6bfafce10b\") " pod="openstack/nova-cell0-db-create-jt55p"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.045363 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-579b-account-create-update-28tr7"]
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.048297 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-579b-account-create-update-28tr7"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.070485 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f59rh\" (UniqueName: \"kubernetes.io/projected/db35177e-78d7-4761-a3c9-cd6bfafce10b-kube-api-access-f59rh\") pod \"nova-cell0-db-create-jt55p\" (UID: \"db35177e-78d7-4761-a3c9-cd6bfafce10b\") " pod="openstack/nova-cell0-db-create-jt55p"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.070713 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.190893 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ht9tr"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.207277 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jt55p"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.207557 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-gbrfb"]
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.217738 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-gbrfb"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.223207 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2893bcd3-157b-47f2-92c2-40b61ad8e125","Type":"ContainerDied","Data":"12347b7e4700df6e9fa03e76a542876a9cfa983247dcd333f63d4403a38c5c11"}
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.223291 4702 scope.go:117] "RemoveContainer" containerID="d08197bea1627cfb1b1301a190dae14c15e95680afa6c9bd740e4cfe27d49cd2"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.223558 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.286269 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f-operator-scripts\") pod \"nova-api-579b-account-create-update-28tr7\" (UID: \"bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f\") " pod="openstack/nova-api-579b-account-create-update-28tr7"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.286531 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8rb4\" (UniqueName: \"kubernetes.io/projected/bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f-kube-api-access-l8rb4\") pod \"nova-api-579b-account-create-update-28tr7\" (UID: \"bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f\") " pod="openstack/nova-api-579b-account-create-update-28tr7"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.305394 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-gbrfb"]
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.334354 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-579b-account-create-update-28tr7"]
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.389544 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8rb4\" (UniqueName: \"kubernetes.io/projected/bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f-kube-api-access-l8rb4\") pod \"nova-api-579b-account-create-update-28tr7\" (UID: \"bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f\") " pod="openstack/nova-api-579b-account-create-update-28tr7"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.389891 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f-operator-scripts\") pod \"nova-api-579b-account-create-update-28tr7\" (UID: \"bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f\") " pod="openstack/nova-api-579b-account-create-update-28tr7"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.389943 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbsvn\" (UniqueName: \"kubernetes.io/projected/610ca85f-6b9d-4275-852e-aa9424ce4066-kube-api-access-xbsvn\") pod \"nova-cell1-db-create-gbrfb\" (UID: \"610ca85f-6b9d-4275-852e-aa9424ce4066\") " pod="openstack/nova-cell1-db-create-gbrfb"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.389996 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/610ca85f-6b9d-4275-852e-aa9424ce4066-operator-scripts\") pod \"nova-cell1-db-create-gbrfb\" (UID: \"610ca85f-6b9d-4275-852e-aa9424ce4066\") " pod="openstack/nova-cell1-db-create-gbrfb"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.394200 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f-operator-scripts\") pod \"nova-api-579b-account-create-update-28tr7\" (UID: \"bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f\") " pod="openstack/nova-api-579b-account-create-update-28tr7"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.406844 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-bf50-account-create-update-8zqck"]
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.409177 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-bf50-account-create-update-8zqck"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.426778 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.444061 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-bf50-account-create-update-8zqck"]
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.445831 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8rb4\" (UniqueName: \"kubernetes.io/projected/bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f-kube-api-access-l8rb4\") pod \"nova-api-579b-account-create-update-28tr7\" (UID: \"bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f\") " pod="openstack/nova-api-579b-account-create-update-28tr7"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.471896 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-0687-account-create-update-zw8qt"]
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.474319 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0687-account-create-update-zw8qt"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.477558 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.492332 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbsvn\" (UniqueName: \"kubernetes.io/projected/610ca85f-6b9d-4275-852e-aa9424ce4066-kube-api-access-xbsvn\") pod \"nova-cell1-db-create-gbrfb\" (UID: \"610ca85f-6b9d-4275-852e-aa9424ce4066\") " pod="openstack/nova-cell1-db-create-gbrfb"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.492415 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/610ca85f-6b9d-4275-852e-aa9424ce4066-operator-scripts\") pod \"nova-cell1-db-create-gbrfb\" (UID: \"610ca85f-6b9d-4275-852e-aa9424ce4066\") " pod="openstack/nova-cell1-db-create-gbrfb"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.492534 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5zpd\" (UniqueName: \"kubernetes.io/projected/2616bfea-114e-4447-bed8-719c64679287-kube-api-access-z5zpd\") pod \"nova-cell0-bf50-account-create-update-8zqck\" (UID: \"2616bfea-114e-4447-bed8-719c64679287\") " pod="openstack/nova-cell0-bf50-account-create-update-8zqck"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.492566 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2616bfea-114e-4447-bed8-719c64679287-operator-scripts\") pod \"nova-cell0-bf50-account-create-update-8zqck\" (UID: \"2616bfea-114e-4447-bed8-719c64679287\") " pod="openstack/nova-cell0-bf50-account-create-update-8zqck"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.494167 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/610ca85f-6b9d-4275-852e-aa9424ce4066-operator-scripts\") pod \"nova-cell1-db-create-gbrfb\" (UID: \"610ca85f-6b9d-4275-852e-aa9424ce4066\") " pod="openstack/nova-cell1-db-create-gbrfb"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.514386 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0687-account-create-update-zw8qt"]
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.535782 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbsvn\" (UniqueName: \"kubernetes.io/projected/610ca85f-6b9d-4275-852e-aa9424ce4066-kube-api-access-xbsvn\") pod \"nova-cell1-db-create-gbrfb\" (UID: \"610ca85f-6b9d-4275-852e-aa9424ce4066\") " pod="openstack/nova-cell1-db-create-gbrfb"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.594615 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5zpd\" (UniqueName: \"kubernetes.io/projected/2616bfea-114e-4447-bed8-719c64679287-kube-api-access-z5zpd\") pod \"nova-cell0-bf50-account-create-update-8zqck\" (UID: \"2616bfea-114e-4447-bed8-719c64679287\") " pod="openstack/nova-cell0-bf50-account-create-update-8zqck"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.594667 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2616bfea-114e-4447-bed8-719c64679287-operator-scripts\") pod \"nova-cell0-bf50-account-create-update-8zqck\" (UID: \"2616bfea-114e-4447-bed8-719c64679287\") " pod="openstack/nova-cell0-bf50-account-create-update-8zqck"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.594775 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmtsd\" (UniqueName: \"kubernetes.io/projected/46c8d995-9bc9-464f-b388-aebbef2024fd-kube-api-access-pmtsd\") pod \"nova-cell1-0687-account-create-update-zw8qt\" (UID: \"46c8d995-9bc9-464f-b388-aebbef2024fd\") " pod="openstack/nova-cell1-0687-account-create-update-zw8qt"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.594890 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c8d995-9bc9-464f-b388-aebbef2024fd-operator-scripts\") pod \"nova-cell1-0687-account-create-update-zw8qt\" (UID: \"46c8d995-9bc9-464f-b388-aebbef2024fd\") " pod="openstack/nova-cell1-0687-account-create-update-zw8qt"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.595824 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2616bfea-114e-4447-bed8-719c64679287-operator-scripts\") pod \"nova-cell0-bf50-account-create-update-8zqck\" (UID: \"2616bfea-114e-4447-bed8-719c64679287\") " pod="openstack/nova-cell0-bf50-account-create-update-8zqck"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.666213 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5zpd\" (UniqueName: \"kubernetes.io/projected/2616bfea-114e-4447-bed8-719c64679287-kube-api-access-z5zpd\") pod \"nova-cell0-bf50-account-create-update-8zqck\" (UID: \"2616bfea-114e-4447-bed8-719c64679287\") " pod="openstack/nova-cell0-bf50-account-create-update-8zqck"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.697951 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmtsd\" (UniqueName: \"kubernetes.io/projected/46c8d995-9bc9-464f-b388-aebbef2024fd-kube-api-access-pmtsd\") pod \"nova-cell1-0687-account-create-update-zw8qt\" (UID: \"46c8d995-9bc9-464f-b388-aebbef2024fd\") " pod="openstack/nova-cell1-0687-account-create-update-zw8qt"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.698484 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c8d995-9bc9-464f-b388-aebbef2024fd-operator-scripts\") pod \"nova-cell1-0687-account-create-update-zw8qt\" (UID: \"46c8d995-9bc9-464f-b388-aebbef2024fd\") " pod="openstack/nova-cell1-0687-account-create-update-zw8qt"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.699965 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c8d995-9bc9-464f-b388-aebbef2024fd-operator-scripts\") pod \"nova-cell1-0687-account-create-update-zw8qt\" (UID: \"46c8d995-9bc9-464f-b388-aebbef2024fd\") " pod="openstack/nova-cell1-0687-account-create-update-zw8qt"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.725538 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmtsd\" (UniqueName: \"kubernetes.io/projected/46c8d995-9bc9-464f-b388-aebbef2024fd-kube-api-access-pmtsd\") pod \"nova-cell1-0687-account-create-update-zw8qt\" (UID: \"46c8d995-9bc9-464f-b388-aebbef2024fd\") " pod="openstack/nova-cell1-0687-account-create-update-zw8qt"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.805933 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-579b-account-create-update-28tr7"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.851022 4702 scope.go:117] "RemoveContainer" containerID="ae8d6f6a79a182dd1cdf4cf4d345ef1e98c11b771ead7f30d6cccc76f6815a7d"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.856416 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0687-account-create-update-zw8qt"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.880124 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-bf50-account-create-update-8zqck"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.888449 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.907022 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-gbrfb"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.907592 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.924082 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.929600 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.929708 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.933611 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.934029 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 03 11:31:39 crc kubenswrapper[4702]: I1203 11:31:39.970690 4702 scope.go:117] "RemoveContainer" containerID="7ca73c0787416d8a5bb6feb5d6a9524f7504c77473beb27f5e150ed8f5a78ed5"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.015314 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9baeea95-5794-4b3a-b239-580dc8391f6d-log-httpd\") pod \"ceilometer-0\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " pod="openstack/ceilometer-0"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.015671 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqxdp\" (UniqueName: \"kubernetes.io/projected/9baeea95-5794-4b3a-b239-580dc8391f6d-kube-api-access-nqxdp\") pod \"ceilometer-0\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " pod="openstack/ceilometer-0"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.015995 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " pod="openstack/ceilometer-0"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.016094 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " pod="openstack/ceilometer-0"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.016244 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-scripts\") pod \"ceilometer-0\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " pod="openstack/ceilometer-0"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.016434 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9baeea95-5794-4b3a-b239-580dc8391f6d-run-httpd\") pod \"ceilometer-0\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " pod="openstack/ceilometer-0"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.016541 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-config-data\") pod \"ceilometer-0\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " pod="openstack/ceilometer-0"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.118832 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-scripts\") pod \"ceilometer-0\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " pod="openstack/ceilometer-0"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.122319 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9baeea95-5794-4b3a-b239-580dc8391f6d-run-httpd\") pod \"ceilometer-0\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " pod="openstack/ceilometer-0"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.122529 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-config-data\") pod \"ceilometer-0\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " pod="openstack/ceilometer-0"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.122733 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9baeea95-5794-4b3a-b239-580dc8391f6d-log-httpd\") pod \"ceilometer-0\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " pod="openstack/ceilometer-0"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.122978 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqxdp\" (UniqueName: \"kubernetes.io/projected/9baeea95-5794-4b3a-b239-580dc8391f6d-kube-api-access-nqxdp\") pod \"ceilometer-0\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " pod="openstack/ceilometer-0"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.123306 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " pod="openstack/ceilometer-0"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.123342 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " pod="openstack/ceilometer-0"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.124074 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9baeea95-5794-4b3a-b239-580dc8391f6d-log-httpd\") pod \"ceilometer-0\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " pod="openstack/ceilometer-0"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.124516 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9baeea95-5794-4b3a-b239-580dc8391f6d-run-httpd\") pod \"ceilometer-0\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " pod="openstack/ceilometer-0"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.132660 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-scripts\") pod \"ceilometer-0\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " pod="openstack/ceilometer-0"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.134230 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " pod="openstack/ceilometer-0"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.134959 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " pod="openstack/ceilometer-0"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.140167 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-config-data\") pod \"ceilometer-0\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " pod="openstack/ceilometer-0"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.155973 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqxdp\" (UniqueName: \"kubernetes.io/projected/9baeea95-5794-4b3a-b239-580dc8391f6d-kube-api-access-nqxdp\") pod \"ceilometer-0\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " pod="openstack/ceilometer-0"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.180430 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-jt55p"]
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.258722 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-ht9tr"]
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.287420 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.381284 4702 scope.go:117] "RemoveContainer" containerID="6ce965086fb8779f68e410329ec3e3f06a154a00370bb4ac998ede452c085b07"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.596570 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-579b-account-create-update-28tr7"]
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.870349 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0687-account-create-update-zw8qt"]
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.952070 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2893bcd3-157b-47f2-92c2-40b61ad8e125" path="/var/lib/kubelet/pods/2893bcd3-157b-47f2-92c2-40b61ad8e125/volumes"
Dec 03 11:31:40 crc kubenswrapper[4702]: I1203 11:31:40.954195 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-gbrfb"]
Dec 03 11:31:41 crc kubenswrapper[4702]: W1203 11:31:41.001424 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46c8d995_9bc9_464f_b388_aebbef2024fd.slice/crio-5bc9c4f26e0e0ffb180afade22616694c282ed9c45e018aa00a28b8ca793be69 WatchSource:0}: Error finding container 5bc9c4f26e0e0ffb180afade22616694c282ed9c45e018aa00a28b8ca793be69: Status 404 returned error can't find the container with id 5bc9c4f26e0e0ffb180afade22616694c282ed9c45e018aa00a28b8ca793be69
Dec 03 11:31:41 crc kubenswrapper[4702]: I1203 11:31:41.133672 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-bf50-account-create-update-8zqck"]
Dec 03 11:31:41 crc kubenswrapper[4702]: I1203 11:31:41.420890 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-579b-account-create-update-28tr7" event={"ID":"bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f","Type":"ContainerStarted","Data":"88c604167570120d4ba911ebcef24496a72634d6e908a9c564411deb0b36725c"}
Dec 03 11:31:41 crc kubenswrapper[4702]: I1203 11:31:41.424912 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jt55p" event={"ID":"db35177e-78d7-4761-a3c9-cd6bfafce10b","Type":"ContainerStarted","Data":"d3c9b888696e6297348f0149041229a874bd2ac83972558dfdd75d26a342a80b"}
Dec 03 11:31:41 crc kubenswrapper[4702]: I1203 11:31:41.436819 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ht9tr" event={"ID":"45a1ae25-65af-406f-8c00-4f7b228b2876","Type":"ContainerStarted","Data":"107ab4c5ef257ec43e883f5e4db934ae5b0a7825608942734485f754d7595c3c"}
Dec 03 11:31:41 crc kubenswrapper[4702]: I1203 11:31:41.436952 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ht9tr" event={"ID":"45a1ae25-65af-406f-8c00-4f7b228b2876","Type":"ContainerStarted","Data":"9a73e741e863cdb79ddafba5d8d38511b8c7e7f39fdf808421481ccd58e14528"}
Dec 03 11:31:41 crc kubenswrapper[4702]: I1203 11:31:41.443261 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-gbrfb" event={"ID":"610ca85f-6b9d-4275-852e-aa9424ce4066","Type":"ContainerStarted","Data":"092990d9f728de92b295f484a6bc087cac43a4013954bcfa96352dbecd96f4bd"}
Dec 03 11:31:41 crc kubenswrapper[4702]: I1203 11:31:41.451546 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 11:31:41 crc kubenswrapper[4702]: I1203 11:31:41.477469 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-ht9tr" podStartSLOduration=3.477373925 podStartE2EDuration="3.477373925s" podCreationTimestamp="2025-12-03 11:31:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:31:41.463394637 +0000 UTC m=+1685.299323101" watchObservedRunningTime="2025-12-03 11:31:41.477373925 +0000 UTC m=+1685.313302399"
Dec 03 11:31:41 crc kubenswrapper[4702]: I1203 11:31:41.481341 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0687-account-create-update-zw8qt" event={"ID":"46c8d995-9bc9-464f-b388-aebbef2024fd","Type":"ContainerStarted","Data":"5bc9c4f26e0e0ffb180afade22616694c282ed9c45e018aa00a28b8ca793be69"}
Dec 03 11:31:41 crc kubenswrapper[4702]: I1203 11:31:41.500065 4702 generic.go:334] "Generic (PLEG): container finished" podID="a2cf5990-eb05-4167-9d52-83186278f986" containerID="899fa405f3f2ffc578c63afaae042c6941dbfc515c00fbea8f061bb095bdc6fa" exitCode=0
Dec 03 11:31:41 crc kubenswrapper[4702]: I1203 11:31:41.500167 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7bdf6cdbff-lzm8t" event={"ID":"a2cf5990-eb05-4167-9d52-83186278f986","Type":"ContainerDied","Data":"899fa405f3f2ffc578c63afaae042c6941dbfc515c00fbea8f061bb095bdc6fa"}
Dec 03 11:31:41 crc kubenswrapper[4702]: I1203 11:31:41.511199 4702 generic.go:334] "Generic (PLEG): container finished" podID="ed571ff0-25b3-4f67-8403-c432700a8f49" containerID="cfc85875c4a77cdd1ec4a77059a5aba6ea95607de35294a53136260dc568876a" exitCode=0
Dec 03 11:31:41 crc kubenswrapper[4702]: I1203 11:31:41.511367 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed571ff0-25b3-4f67-8403-c432700a8f49","Type":"ContainerDied","Data":"cfc85875c4a77cdd1ec4a77059a5aba6ea95607de35294a53136260dc568876a"}
Dec 03 11:31:41 crc kubenswrapper[4702]: I1203 11:31:41.514992 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bf50-account-create-update-8zqck"
event={"ID":"2616bfea-114e-4447-bed8-719c64679287","Type":"ContainerStarted","Data":"54733c3cc7c04b18abfbd2adc0624a47539898bfb00030f5828fd21daf434f4e"} Dec 03 11:31:42 crc kubenswrapper[4702]: E1203 11:31:42.345983 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 899fa405f3f2ffc578c63afaae042c6941dbfc515c00fbea8f061bb095bdc6fa is running failed: container process not found" containerID="899fa405f3f2ffc578c63afaae042c6941dbfc515c00fbea8f061bb095bdc6fa" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 03 11:31:42 crc kubenswrapper[4702]: E1203 11:31:42.347821 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 899fa405f3f2ffc578c63afaae042c6941dbfc515c00fbea8f061bb095bdc6fa is running failed: container process not found" containerID="899fa405f3f2ffc578c63afaae042c6941dbfc515c00fbea8f061bb095bdc6fa" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 03 11:31:42 crc kubenswrapper[4702]: E1203 11:31:42.348631 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 899fa405f3f2ffc578c63afaae042c6941dbfc515c00fbea8f061bb095bdc6fa is running failed: container process not found" containerID="899fa405f3f2ffc578c63afaae042c6941dbfc515c00fbea8f061bb095bdc6fa" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 03 11:31:42 crc kubenswrapper[4702]: E1203 11:31:42.348738 4702 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 899fa405f3f2ffc578c63afaae042c6941dbfc515c00fbea8f061bb095bdc6fa is running failed: container process not found" probeType="Readiness" pod="openstack/heat-engine-7bdf6cdbff-lzm8t" podUID="a2cf5990-eb05-4167-9d52-83186278f986" containerName="heat-engine" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.538783 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7bdf6cdbff-lzm8t" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.541439 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.549001 4702 generic.go:334] "Generic (PLEG): container finished" podID="610ca85f-6b9d-4275-852e-aa9424ce4066" containerID="125ad852b7afb7358079f8c98a362f30cf5ddc4e950f8cf0220fa283814cbbb2" exitCode=0 Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.549105 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-gbrfb" event={"ID":"610ca85f-6b9d-4275-852e-aa9424ce4066","Type":"ContainerDied","Data":"125ad852b7afb7358079f8c98a362f30cf5ddc4e950f8cf0220fa283814cbbb2"} Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.552051 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed571ff0-25b3-4f67-8403-c432700a8f49-logs\") pod \"ed571ff0-25b3-4f67-8403-c432700a8f49\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.552167 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cw4tb\" (UniqueName: \"kubernetes.io/projected/a2cf5990-eb05-4167-9d52-83186278f986-kube-api-access-cw4tb\") pod \"a2cf5990-eb05-4167-9d52-83186278f986\" (UID: \"a2cf5990-eb05-4167-9d52-83186278f986\") " Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.552204 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-config-data\") pod \"ed571ff0-25b3-4f67-8403-c432700a8f49\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.552241 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz74s\" (UniqueName: \"kubernetes.io/projected/ed571ff0-25b3-4f67-8403-c432700a8f49-kube-api-access-lz74s\") pod \"ed571ff0-25b3-4f67-8403-c432700a8f49\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.552271 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2cf5990-eb05-4167-9d52-83186278f986-config-data\") pod \"a2cf5990-eb05-4167-9d52-83186278f986\" (UID: \"a2cf5990-eb05-4167-9d52-83186278f986\") " Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.552341 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2cf5990-eb05-4167-9d52-83186278f986-config-data-custom\") pod \"a2cf5990-eb05-4167-9d52-83186278f986\" (UID: \"a2cf5990-eb05-4167-9d52-83186278f986\") " Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.552379 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-scripts\") pod \"ed571ff0-25b3-4f67-8403-c432700a8f49\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.552465 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2cf5990-eb05-4167-9d52-83186278f986-combined-ca-bundle\") pod \"a2cf5990-eb05-4167-9d52-83186278f986\" (UID: \"a2cf5990-eb05-4167-9d52-83186278f986\") " Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.552491 4702 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-internal-tls-certs\") pod \"ed571ff0-25b3-4f67-8403-c432700a8f49\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.552520 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-combined-ca-bundle\") pod \"ed571ff0-25b3-4f67-8403-c432700a8f49\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.552576 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ed571ff0-25b3-4f67-8403-c432700a8f49\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.552609 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed571ff0-25b3-4f67-8403-c432700a8f49-httpd-run\") pod \"ed571ff0-25b3-4f67-8403-c432700a8f49\" (UID: \"ed571ff0-25b3-4f67-8403-c432700a8f49\") " Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.552719 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed571ff0-25b3-4f67-8403-c432700a8f49-logs" (OuterVolumeSpecName: "logs") pod "ed571ff0-25b3-4f67-8403-c432700a8f49" (UID: "ed571ff0-25b3-4f67-8403-c432700a8f49"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.553180 4702 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed571ff0-25b3-4f67-8403-c432700a8f49-logs\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.553498 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed571ff0-25b3-4f67-8403-c432700a8f49-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ed571ff0-25b3-4f67-8403-c432700a8f49" (UID: "ed571ff0-25b3-4f67-8403-c432700a8f49"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.563741 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2cf5990-eb05-4167-9d52-83186278f986-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a2cf5990-eb05-4167-9d52-83186278f986" (UID: "a2cf5990-eb05-4167-9d52-83186278f986"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.564705 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "ed571ff0-25b3-4f67-8403-c432700a8f49" (UID: "ed571ff0-25b3-4f67-8403-c432700a8f49"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.564785 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2cf5990-eb05-4167-9d52-83186278f986-kube-api-access-cw4tb" (OuterVolumeSpecName: "kube-api-access-cw4tb") pod "a2cf5990-eb05-4167-9d52-83186278f986" (UID: "a2cf5990-eb05-4167-9d52-83186278f986"). InnerVolumeSpecName "kube-api-access-cw4tb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.566235 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-scripts" (OuterVolumeSpecName: "scripts") pod "ed571ff0-25b3-4f67-8403-c432700a8f49" (UID: "ed571ff0-25b3-4f67-8403-c432700a8f49"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.566519 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0687-account-create-update-zw8qt" event={"ID":"46c8d995-9bc9-464f-b388-aebbef2024fd","Type":"ContainerStarted","Data":"11330477ddf2a7c05ecc4e4365e5881a85cf6fc3b86296d279f712222fd902cf"} Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.577778 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7bdf6cdbff-lzm8t" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.577779 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7bdf6cdbff-lzm8t" event={"ID":"a2cf5990-eb05-4167-9d52-83186278f986","Type":"ContainerDied","Data":"966cb54ffe2da9fb902297d512e09e2773f3ef4704a46031923cc407ef5eba0a"} Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.577894 4702 scope.go:117] "RemoveContainer" containerID="899fa405f3f2ffc578c63afaae042c6941dbfc515c00fbea8f061bb095bdc6fa" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.594333 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed571ff0-25b3-4f67-8403-c432700a8f49-kube-api-access-lz74s" (OuterVolumeSpecName: "kube-api-access-lz74s") pod "ed571ff0-25b3-4f67-8403-c432700a8f49" (UID: "ed571ff0-25b3-4f67-8403-c432700a8f49"). InnerVolumeSpecName "kube-api-access-lz74s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.602542 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-579b-account-create-update-28tr7" event={"ID":"bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f","Type":"ContainerStarted","Data":"d9f0317e197fd7b4d06bb06090e5cf52a72b0704df67fc3269816ecef8e01ab7"} Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.613714 4702 generic.go:334] "Generic (PLEG): container finished" podID="45a1ae25-65af-406f-8c00-4f7b228b2876" containerID="107ab4c5ef257ec43e883f5e4db934ae5b0a7825608942734485f754d7595c3c" exitCode=0 Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.613831 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ht9tr" event={"ID":"45a1ae25-65af-406f-8c00-4f7b228b2876","Type":"ContainerDied","Data":"107ab4c5ef257ec43e883f5e4db934ae5b0a7825608942734485f754d7595c3c"} Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.615644 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9baeea95-5794-4b3a-b239-580dc8391f6d","Type":"ContainerStarted","Data":"716e12d4ff7fa429dd23fc15e91cbf271570f54c07bf0b39102d5695788ffaf7"} Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.618550 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2cf5990-eb05-4167-9d52-83186278f986-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2cf5990-eb05-4167-9d52-83186278f986" (UID: "a2cf5990-eb05-4167-9d52-83186278f986"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.664319 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ed571ff0-25b3-4f67-8403-c432700a8f49","Type":"ContainerDied","Data":"7bf781867430dfb84d7891e38390c6c609993ccd908d24071f43b98334163459"} Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.664644 4702 scope.go:117] "RemoveContainer" containerID="cfc85875c4a77cdd1ec4a77059a5aba6ea95607de35294a53136260dc568876a" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.664922 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.670886 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz74s\" (UniqueName: \"kubernetes.io/projected/ed571ff0-25b3-4f67-8403-c432700a8f49-kube-api-access-lz74s\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.670937 4702 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2cf5990-eb05-4167-9d52-83186278f986-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.670951 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.670962 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2cf5990-eb05-4167-9d52-83186278f986-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.671032 4702 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.671048 4702 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ed571ff0-25b3-4f67-8403-c432700a8f49-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.671061 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cw4tb\" (UniqueName: \"kubernetes.io/projected/a2cf5990-eb05-4167-9d52-83186278f986-kube-api-access-cw4tb\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.741752 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bf50-account-create-update-8zqck" event={"ID":"2616bfea-114e-4447-bed8-719c64679287","Type":"ContainerStarted","Data":"27ba3fa90071b8ed7866541539dfabec595a888342c39ed65b0729593413d739"} Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.744941 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed571ff0-25b3-4f67-8403-c432700a8f49" (UID: "ed571ff0-25b3-4f67-8403-c432700a8f49"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.782011 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.806273 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jt55p" event={"ID":"db35177e-78d7-4761-a3c9-cd6bfafce10b","Type":"ContainerStarted","Data":"2707f5a014f744787676f29c42e6d836b6e70ed12c7407fa7077df56de18d061"} Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.867966 4702 scope.go:117] "RemoveContainer" containerID="8c8d28104afb66581b5417efcdb44cbb8650878e8ff071009afaab0a819f8a0a" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.897860 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-0687-account-create-update-zw8qt" podStartSLOduration=3.897828863 podStartE2EDuration="3.897828863s" podCreationTimestamp="2025-12-03 11:31:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:31:42.67761589 +0000 UTC m=+1686.513544354" watchObservedRunningTime="2025-12-03 11:31:42.897828863 +0000 UTC m=+1686.733757327" Dec 03 11:31:42 crc kubenswrapper[4702]: I1203 11:31:42.977240 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-579b-account-create-update-28tr7" podStartSLOduration=4.9772208639999995 podStartE2EDuration="4.977220864s" podCreationTimestamp="2025-12-03 11:31:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:31:42.749516078 +0000 UTC m=+1686.585444552" watchObservedRunningTime="2025-12-03 11:31:42.977220864 +0000 UTC m=+1686.813149328" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.021156 4702 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.023075 4702 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.084371 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-bf50-account-create-update-8zqck" podStartSLOduration=4.084340385 podStartE2EDuration="4.084340385s" podCreationTimestamp="2025-12-03 11:31:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:31:42.825890294 +0000 UTC m=+1686.661818758" watchObservedRunningTime="2025-12-03 11:31:43.084340385 +0000 UTC m=+1686.920268849" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.113515 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-jt55p" podStartSLOduration=5.113488215 podStartE2EDuration="5.113488215s" podCreationTimestamp="2025-12-03 11:31:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:31:42.868498637 +0000 UTC m=+1686.704427101" 
watchObservedRunningTime="2025-12-03 11:31:43.113488215 +0000 UTC m=+1686.949416679" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.136888 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2cf5990-eb05-4167-9d52-83186278f986-config-data" (OuterVolumeSpecName: "config-data") pod "a2cf5990-eb05-4167-9d52-83186278f986" (UID: "a2cf5990-eb05-4167-9d52-83186278f986"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.166693 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-config-data" (OuterVolumeSpecName: "config-data") pod "ed571ff0-25b3-4f67-8403-c432700a8f49" (UID: "ed571ff0-25b3-4f67-8403-c432700a8f49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.230182 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.230229 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2cf5990-eb05-4167-9d52-83186278f986-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.232980 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ed571ff0-25b3-4f67-8403-c432700a8f49" (UID: "ed571ff0-25b3-4f67-8403-c432700a8f49"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.332639 4702 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed571ff0-25b3-4f67-8403-c432700a8f49-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.542912 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.560336 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.576264 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-7bdf6cdbff-lzm8t"] Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.598512 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-7bdf6cdbff-lzm8t"] Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.618269 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 11:31:43 crc kubenswrapper[4702]: E1203 11:31:43.619023 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2cf5990-eb05-4167-9d52-83186278f986" containerName="heat-engine" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.619042 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2cf5990-eb05-4167-9d52-83186278f986" containerName="heat-engine" Dec 03 11:31:43 crc kubenswrapper[4702]: E1203 11:31:43.619052 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed571ff0-25b3-4f67-8403-c432700a8f49" containerName="glance-log" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.619059 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed571ff0-25b3-4f67-8403-c432700a8f49" containerName="glance-log" Dec 03 11:31:43 crc kubenswrapper[4702]: E1203 11:31:43.619119 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed571ff0-25b3-4f67-8403-c432700a8f49" containerName="glance-httpd" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.619129 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed571ff0-25b3-4f67-8403-c432700a8f49" containerName="glance-httpd" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.619357 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed571ff0-25b3-4f67-8403-c432700a8f49" containerName="glance-httpd" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.619374 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed571ff0-25b3-4f67-8403-c432700a8f49" containerName="glance-log" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.619386 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2cf5990-eb05-4167-9d52-83186278f986" containerName="heat-engine" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.620849 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.624307 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.624342 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.631990 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.747333 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/381fd572-826d-4e69-ad36-f90b539f21ab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.747836 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381fd572-826d-4e69-ad36-f90b539f21ab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.747923 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.747974 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp2qm\" (UniqueName: \"kubernetes.io/projected/381fd572-826d-4e69-ad36-f90b539f21ab-kube-api-access-rp2qm\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.748107 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/381fd572-826d-4e69-ad36-f90b539f21ab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.748209 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/381fd572-826d-4e69-ad36-f90b539f21ab-logs\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.748257 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/381fd572-826d-4e69-ad36-f90b539f21ab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.748411 4702 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/381fd572-826d-4e69-ad36-f90b539f21ab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.822650 4702 generic.go:334] "Generic (PLEG): container finished" podID="46c8d995-9bc9-464f-b388-aebbef2024fd" containerID="11330477ddf2a7c05ecc4e4365e5881a85cf6fc3b86296d279f712222fd902cf" exitCode=0 Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.822775 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0687-account-create-update-zw8qt" event={"ID":"46c8d995-9bc9-464f-b388-aebbef2024fd","Type":"ContainerDied","Data":"11330477ddf2a7c05ecc4e4365e5881a85cf6fc3b86296d279f712222fd902cf"} Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.826792 4702 generic.go:334] "Generic (PLEG): container finished" podID="2616bfea-114e-4447-bed8-719c64679287" containerID="27ba3fa90071b8ed7866541539dfabec595a888342c39ed65b0729593413d739" exitCode=0 Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.826860 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bf50-account-create-update-8zqck" event={"ID":"2616bfea-114e-4447-bed8-719c64679287","Type":"ContainerDied","Data":"27ba3fa90071b8ed7866541539dfabec595a888342c39ed65b0729593413d739"} Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.828989 4702 generic.go:334] "Generic (PLEG): container finished" podID="db35177e-78d7-4761-a3c9-cd6bfafce10b" containerID="2707f5a014f744787676f29c42e6d836b6e70ed12c7407fa7077df56de18d061" exitCode=0 Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.829056 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jt55p" event={"ID":"db35177e-78d7-4761-a3c9-cd6bfafce10b","Type":"ContainerDied","Data":"2707f5a014f744787676f29c42e6d836b6e70ed12c7407fa7077df56de18d061"} Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.832162 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9baeea95-5794-4b3a-b239-580dc8391f6d","Type":"ContainerStarted","Data":"2c24a2fd52bab6bfe68a8534d54940ead8a1fdc2390be608c0d6cf45ab595657"} Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.865581 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381fd572-826d-4e69-ad36-f90b539f21ab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.865741 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.865836 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp2qm\" (UniqueName: \"kubernetes.io/projected/381fd572-826d-4e69-ad36-f90b539f21ab-kube-api-access-rp2qm\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 
11:31:43.865960 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/381fd572-826d-4e69-ad36-f90b539f21ab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.866105 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/381fd572-826d-4e69-ad36-f90b539f21ab-logs\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.866152 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/381fd572-826d-4e69-ad36-f90b539f21ab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.866193 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/381fd572-826d-4e69-ad36-f90b539f21ab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.866382 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/381fd572-826d-4e69-ad36-f90b539f21ab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.868389 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.873135 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/381fd572-826d-4e69-ad36-f90b539f21ab-logs\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.873212 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/381fd572-826d-4e69-ad36-f90b539f21ab-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.891168 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/381fd572-826d-4e69-ad36-f90b539f21ab-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.893160 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/381fd572-826d-4e69-ad36-f90b539f21ab-config-data\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.907587 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/381fd572-826d-4e69-ad36-f90b539f21ab-scripts\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.942785 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp2qm\" (UniqueName: \"kubernetes.io/projected/381fd572-826d-4e69-ad36-f90b539f21ab-kube-api-access-rp2qm\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.944646 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/381fd572-826d-4e69-ad36-f90b539f21ab-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:43 crc kubenswrapper[4702]: I1203 11:31:43.952692 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"381fd572-826d-4e69-ad36-f90b539f21ab\") " pod="openstack/glance-default-internal-api-0" Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.248125 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.320163 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ht9tr" Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.502092 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45a1ae25-65af-406f-8c00-4f7b228b2876-operator-scripts\") pod \"45a1ae25-65af-406f-8c00-4f7b228b2876\" (UID: \"45a1ae25-65af-406f-8c00-4f7b228b2876\") " Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.502596 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snwmm\" (UniqueName: \"kubernetes.io/projected/45a1ae25-65af-406f-8c00-4f7b228b2876-kube-api-access-snwmm\") pod \"45a1ae25-65af-406f-8c00-4f7b228b2876\" (UID: \"45a1ae25-65af-406f-8c00-4f7b228b2876\") " Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.503130 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45a1ae25-65af-406f-8c00-4f7b228b2876-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45a1ae25-65af-406f-8c00-4f7b228b2876" (UID: "45a1ae25-65af-406f-8c00-4f7b228b2876"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.503731 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45a1ae25-65af-406f-8c00-4f7b228b2876-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.514078 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45a1ae25-65af-406f-8c00-4f7b228b2876-kube-api-access-snwmm" (OuterVolumeSpecName: "kube-api-access-snwmm") pod "45a1ae25-65af-406f-8c00-4f7b228b2876" (UID: "45a1ae25-65af-406f-8c00-4f7b228b2876"). InnerVolumeSpecName "kube-api-access-snwmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.611568 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snwmm\" (UniqueName: \"kubernetes.io/projected/45a1ae25-65af-406f-8c00-4f7b228b2876-kube-api-access-snwmm\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.655348 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-gbrfb" Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.815627 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbsvn\" (UniqueName: \"kubernetes.io/projected/610ca85f-6b9d-4275-852e-aa9424ce4066-kube-api-access-xbsvn\") pod \"610ca85f-6b9d-4275-852e-aa9424ce4066\" (UID: \"610ca85f-6b9d-4275-852e-aa9424ce4066\") " Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.816008 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/610ca85f-6b9d-4275-852e-aa9424ce4066-operator-scripts\") pod \"610ca85f-6b9d-4275-852e-aa9424ce4066\" (UID: \"610ca85f-6b9d-4275-852e-aa9424ce4066\") " Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.816844 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/610ca85f-6b9d-4275-852e-aa9424ce4066-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "610ca85f-6b9d-4275-852e-aa9424ce4066" (UID: "610ca85f-6b9d-4275-852e-aa9424ce4066"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.822346 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/610ca85f-6b9d-4275-852e-aa9424ce4066-kube-api-access-xbsvn" (OuterVolumeSpecName: "kube-api-access-xbsvn") pod "610ca85f-6b9d-4275-852e-aa9424ce4066" (UID: "610ca85f-6b9d-4275-852e-aa9424ce4066"). InnerVolumeSpecName "kube-api-access-xbsvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.855577 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-ht9tr" Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.859748 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ht9tr" event={"ID":"45a1ae25-65af-406f-8c00-4f7b228b2876","Type":"ContainerDied","Data":"9a73e741e863cdb79ddafba5d8d38511b8c7e7f39fdf808421481ccd58e14528"} Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.859845 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a73e741e863cdb79ddafba5d8d38511b8c7e7f39fdf808421481ccd58e14528" Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.864439 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9baeea95-5794-4b3a-b239-580dc8391f6d","Type":"ContainerStarted","Data":"ac829dd26e46bb8b73f2632aa8f67ebca3cf41405a73c6c61b4b44ad8d9cc5a0"} Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.868463 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-gbrfb" event={"ID":"610ca85f-6b9d-4275-852e-aa9424ce4066","Type":"ContainerDied","Data":"092990d9f728de92b295f484a6bc087cac43a4013954bcfa96352dbecd96f4bd"} Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.868525 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="092990d9f728de92b295f484a6bc087cac43a4013954bcfa96352dbecd96f4bd" Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.868604 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-gbrfb" Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.873958 4702 generic.go:334] "Generic (PLEG): container finished" podID="bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f" containerID="d9f0317e197fd7b4d06bb06090e5cf52a72b0704df67fc3269816ecef8e01ab7" exitCode=0 Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.874047 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-579b-account-create-update-28tr7" event={"ID":"bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f","Type":"ContainerDied","Data":"d9f0317e197fd7b4d06bb06090e5cf52a72b0704df67fc3269816ecef8e01ab7"} Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.919413 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbsvn\" (UniqueName: \"kubernetes.io/projected/610ca85f-6b9d-4275-852e-aa9424ce4066-kube-api-access-xbsvn\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.919459 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/610ca85f-6b9d-4275-852e-aa9424ce4066-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.949500 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2cf5990-eb05-4167-9d52-83186278f986" path="/var/lib/kubelet/pods/a2cf5990-eb05-4167-9d52-83186278f986/volumes" Dec 03 11:31:44 crc kubenswrapper[4702]: I1203 11:31:44.954880 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed571ff0-25b3-4f67-8403-c432700a8f49" path="/var/lib/kubelet/pods/ed571ff0-25b3-4f67-8403-c432700a8f49/volumes" Dec 03 11:31:45 crc kubenswrapper[4702]: I1203 11:31:45.329641 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 11:31:45 crc kubenswrapper[4702]: I1203 11:31:45.820606 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-jt55p" Dec 03 11:31:45 crc kubenswrapper[4702]: I1203 11:31:45.987661 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f59rh\" (UniqueName: \"kubernetes.io/projected/db35177e-78d7-4761-a3c9-cd6bfafce10b-kube-api-access-f59rh\") pod \"db35177e-78d7-4761-a3c9-cd6bfafce10b\" (UID: \"db35177e-78d7-4761-a3c9-cd6bfafce10b\") " Dec 03 11:31:45 crc kubenswrapper[4702]: I1203 11:31:45.987910 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db35177e-78d7-4761-a3c9-cd6bfafce10b-operator-scripts\") pod \"db35177e-78d7-4761-a3c9-cd6bfafce10b\" (UID: \"db35177e-78d7-4761-a3c9-cd6bfafce10b\") " Dec 03 11:31:45 crc kubenswrapper[4702]: I1203 11:31:45.990574 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db35177e-78d7-4761-a3c9-cd6bfafce10b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "db35177e-78d7-4761-a3c9-cd6bfafce10b" (UID: "db35177e-78d7-4761-a3c9-cd6bfafce10b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:45.998972 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:45.999247 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3ad54658-ece0-4731-a07c-3ba8cfb73693" containerName="glance-log" containerID="cri-o://d163f2220739998c6faa435f3851d3ed49b8280d12a98ffc9374079f638ef985" gracePeriod=30 Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.000182 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"381fd572-826d-4e69-ad36-f90b539f21ab","Type":"ContainerStarted","Data":"84962eaf52e92004f49d61f5e4b77adde5654cfd769e8cb6a536a5d4449d9a31"} Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.000420 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3ad54658-ece0-4731-a07c-3ba8cfb73693" containerName="glance-httpd" containerID="cri-o://6f19a085bbe53ccb297c996ab5745a0fe8db1aa865b3b0612be0c34715db636e" gracePeriod=30 Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.047720 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db35177e-78d7-4761-a3c9-cd6bfafce10b-kube-api-access-f59rh" (OuterVolumeSpecName: "kube-api-access-f59rh") pod "db35177e-78d7-4761-a3c9-cd6bfafce10b" (UID: "db35177e-78d7-4761-a3c9-cd6bfafce10b"). InnerVolumeSpecName "kube-api-access-f59rh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.084593 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-jt55p" Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.086851 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jt55p" event={"ID":"db35177e-78d7-4761-a3c9-cd6bfafce10b","Type":"ContainerDied","Data":"d3c9b888696e6297348f0149041229a874bd2ac83972558dfdd75d26a342a80b"} Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.086926 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3c9b888696e6297348f0149041229a874bd2ac83972558dfdd75d26a342a80b" Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.092983 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db35177e-78d7-4761-a3c9-cd6bfafce10b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.093026 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f59rh\" (UniqueName: \"kubernetes.io/projected/db35177e-78d7-4761-a3c9-cd6bfafce10b-kube-api-access-f59rh\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.406245 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-bf50-account-create-update-8zqck" Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.433097 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0687-account-create-update-zw8qt" Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.512631 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2616bfea-114e-4447-bed8-719c64679287-operator-scripts\") pod \"2616bfea-114e-4447-bed8-719c64679287\" (UID: \"2616bfea-114e-4447-bed8-719c64679287\") " Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.513294 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5zpd\" (UniqueName: \"kubernetes.io/projected/2616bfea-114e-4447-bed8-719c64679287-kube-api-access-z5zpd\") pod \"2616bfea-114e-4447-bed8-719c64679287\" (UID: \"2616bfea-114e-4447-bed8-719c64679287\") " Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.516437 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2616bfea-114e-4447-bed8-719c64679287-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2616bfea-114e-4447-bed8-719c64679287" (UID: "2616bfea-114e-4447-bed8-719c64679287"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.551228 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2616bfea-114e-4447-bed8-719c64679287-kube-api-access-z5zpd" (OuterVolumeSpecName: "kube-api-access-z5zpd") pod "2616bfea-114e-4447-bed8-719c64679287" (UID: "2616bfea-114e-4447-bed8-719c64679287"). InnerVolumeSpecName "kube-api-access-z5zpd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.619206 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmtsd\" (UniqueName: \"kubernetes.io/projected/46c8d995-9bc9-464f-b388-aebbef2024fd-kube-api-access-pmtsd\") pod \"46c8d995-9bc9-464f-b388-aebbef2024fd\" (UID: \"46c8d995-9bc9-464f-b388-aebbef2024fd\") " Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.619485 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c8d995-9bc9-464f-b388-aebbef2024fd-operator-scripts\") pod \"46c8d995-9bc9-464f-b388-aebbef2024fd\" (UID: \"46c8d995-9bc9-464f-b388-aebbef2024fd\") " Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.620326 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5zpd\" (UniqueName: \"kubernetes.io/projected/2616bfea-114e-4447-bed8-719c64679287-kube-api-access-z5zpd\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.620348 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2616bfea-114e-4447-bed8-719c64679287-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.634783 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46c8d995-9bc9-464f-b388-aebbef2024fd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46c8d995-9bc9-464f-b388-aebbef2024fd" (UID: "46c8d995-9bc9-464f-b388-aebbef2024fd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.642896 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46c8d995-9bc9-464f-b388-aebbef2024fd-kube-api-access-pmtsd" (OuterVolumeSpecName: "kube-api-access-pmtsd") pod "46c8d995-9bc9-464f-b388-aebbef2024fd" (UID: "46c8d995-9bc9-464f-b388-aebbef2024fd"). InnerVolumeSpecName "kube-api-access-pmtsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.739147 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmtsd\" (UniqueName: \"kubernetes.io/projected/46c8d995-9bc9-464f-b388-aebbef2024fd-kube-api-access-pmtsd\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.739204 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46c8d995-9bc9-464f-b388-aebbef2024fd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.851376 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-579b-account-create-update-28tr7" Dec 03 11:31:46 crc kubenswrapper[4702]: E1203 11:31:46.910854 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb35177e_78d7_4761_a3c9_cd6bfafce10b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ad54658_ece0_4731_a07c_3ba8cfb73693.slice/crio-conmon-d163f2220739998c6faa435f3851d3ed49b8280d12a98ffc9374079f638ef985.scope\": RecentStats: unable to find data in memory cache]" Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.944813 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8rb4\" (UniqueName: \"kubernetes.io/projected/bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f-kube-api-access-l8rb4\") pod \"bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f\" (UID: \"bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f\") " Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.944994 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f-operator-scripts\") pod \"bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f\" (UID: \"bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f\") " Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.947009 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f" (UID: "bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:31:46 crc kubenswrapper[4702]: I1203 11:31:46.958396 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f-kube-api-access-l8rb4" (OuterVolumeSpecName: "kube-api-access-l8rb4") pod "bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f" (UID: "bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f"). InnerVolumeSpecName "kube-api-access-l8rb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:31:47 crc kubenswrapper[4702]: I1203 11:31:47.087970 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8rb4\" (UniqueName: \"kubernetes.io/projected/bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f-kube-api-access-l8rb4\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:47 crc kubenswrapper[4702]: I1203 11:31:47.088290 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:47 crc kubenswrapper[4702]: I1203 11:31:47.138008 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-bf50-account-create-update-8zqck" event={"ID":"2616bfea-114e-4447-bed8-719c64679287","Type":"ContainerDied","Data":"54733c3cc7c04b18abfbd2adc0624a47539898bfb00030f5828fd21daf434f4e"} Dec 03 11:31:47 crc kubenswrapper[4702]: I1203 11:31:47.138064 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54733c3cc7c04b18abfbd2adc0624a47539898bfb00030f5828fd21daf434f4e" Dec 03 11:31:47 crc kubenswrapper[4702]: I1203 11:31:47.138186 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-bf50-account-create-update-8zqck" Dec 03 11:31:47 crc kubenswrapper[4702]: I1203 11:31:47.151810 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-579b-account-create-update-28tr7" event={"ID":"bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f","Type":"ContainerDied","Data":"88c604167570120d4ba911ebcef24496a72634d6e908a9c564411deb0b36725c"} Dec 03 11:31:47 crc kubenswrapper[4702]: I1203 11:31:47.151887 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88c604167570120d4ba911ebcef24496a72634d6e908a9c564411deb0b36725c" Dec 03 11:31:47 crc kubenswrapper[4702]: I1203 11:31:47.151999 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-579b-account-create-update-28tr7" Dec 03 11:31:47 crc kubenswrapper[4702]: I1203 11:31:47.174353 4702 generic.go:334] "Generic (PLEG): container finished" podID="3ad54658-ece0-4731-a07c-3ba8cfb73693" containerID="d163f2220739998c6faa435f3851d3ed49b8280d12a98ffc9374079f638ef985" exitCode=143 Dec 03 11:31:47 crc kubenswrapper[4702]: I1203 11:31:47.174487 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3ad54658-ece0-4731-a07c-3ba8cfb73693","Type":"ContainerDied","Data":"d163f2220739998c6faa435f3851d3ed49b8280d12a98ffc9374079f638ef985"} Dec 03 11:31:47 crc kubenswrapper[4702]: I1203 11:31:47.182609 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9baeea95-5794-4b3a-b239-580dc8391f6d","Type":"ContainerStarted","Data":"24498e10dbff824b24dc87479b6dcbdc2105e1f37f3015aaa76295eb800edab0"} Dec 03 11:31:47 crc kubenswrapper[4702]: I1203 11:31:47.200382 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0687-account-create-update-zw8qt" event={"ID":"46c8d995-9bc9-464f-b388-aebbef2024fd","Type":"ContainerDied","Data":"5bc9c4f26e0e0ffb180afade22616694c282ed9c45e018aa00a28b8ca793be69"} Dec 03 11:31:47 crc kubenswrapper[4702]: I1203 11:31:47.200472 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bc9c4f26e0e0ffb180afade22616694c282ed9c45e018aa00a28b8ca793be69" Dec 03 11:31:47 crc kubenswrapper[4702]: I1203 11:31:47.200680 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-0687-account-create-update-zw8qt" Dec 03 11:31:48 crc kubenswrapper[4702]: I1203 11:31:48.223003 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"381fd572-826d-4e69-ad36-f90b539f21ab","Type":"ContainerStarted","Data":"da51e42ff4dc615398714b6360f3eef50ac1e66886a341dec232e2be2abcfe74"} Dec 03 11:31:48 crc kubenswrapper[4702]: I1203 11:31:48.223663 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"381fd572-826d-4e69-ad36-f90b539f21ab","Type":"ContainerStarted","Data":"1db99d86454aff9243dfa9a471728f64e4461e3fc2db8259966cd7b36a151814"} Dec 03 11:31:48 crc kubenswrapper[4702]: I1203 11:31:48.231881 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9baeea95-5794-4b3a-b239-580dc8391f6d","Type":"ContainerStarted","Data":"14c14628f232955e873db6a71c3df7d74208bb721bb5fe4b85899b5b0b76e0d3"} Dec 03 11:31:48 crc kubenswrapper[4702]: I1203 11:31:48.233307 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 11:31:48 crc kubenswrapper[4702]: I1203 11:31:48.247618 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.247598109 podStartE2EDuration="5.247598109s" podCreationTimestamp="2025-12-03 11:31:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:31:48.242839973 +0000 UTC m=+1692.078768447" watchObservedRunningTime="2025-12-03 11:31:48.247598109 +0000 UTC m=+1692.083526573" Dec 03 11:31:48 crc kubenswrapper[4702]: I1203 11:31:48.289354 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.177832487 podStartE2EDuration="9.289327078s" podCreationTimestamp="2025-12-03 11:31:39 +0000 UTC" firstStartedPulling="2025-12-03 11:31:41.397355066 +0000 UTC m=+1685.233283530" lastFinishedPulling="2025-12-03 11:31:47.508849647 +0000 UTC m=+1691.344778121" observedRunningTime="2025-12-03 11:31:48.276888983 +0000 UTC m=+1692.112817447" watchObservedRunningTime="2025-12-03 11:31:48.289327078 +0000 UTC m=+1692.125255542" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.487520 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pn6bt"] Dec 03 11:31:49 crc kubenswrapper[4702]: E1203 11:31:49.488711 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2616bfea-114e-4447-bed8-719c64679287" containerName="mariadb-account-create-update" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.488727 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2616bfea-114e-4447-bed8-719c64679287" containerName="mariadb-account-create-update" Dec 03 11:31:49 crc kubenswrapper[4702]: E1203 11:31:49.488771 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f" containerName="mariadb-account-create-update" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.488778 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f" containerName="mariadb-account-create-update" Dec 03 11:31:49 crc kubenswrapper[4702]: E1203 11:31:49.488796 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db35177e-78d7-4761-a3c9-cd6bfafce10b" 
containerName="mariadb-database-create" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.488803 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="db35177e-78d7-4761-a3c9-cd6bfafce10b" containerName="mariadb-database-create" Dec 03 11:31:49 crc kubenswrapper[4702]: E1203 11:31:49.488833 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="610ca85f-6b9d-4275-852e-aa9424ce4066" containerName="mariadb-database-create" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.488839 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="610ca85f-6b9d-4275-852e-aa9424ce4066" containerName="mariadb-database-create" Dec 03 11:31:49 crc kubenswrapper[4702]: E1203 11:31:49.488855 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45a1ae25-65af-406f-8c00-4f7b228b2876" containerName="mariadb-database-create" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.488861 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="45a1ae25-65af-406f-8c00-4f7b228b2876" containerName="mariadb-database-create" Dec 03 11:31:49 crc kubenswrapper[4702]: E1203 11:31:49.488873 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c8d995-9bc9-464f-b388-aebbef2024fd" containerName="mariadb-account-create-update" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.488879 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c8d995-9bc9-464f-b388-aebbef2024fd" containerName="mariadb-account-create-update" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.489102 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="610ca85f-6b9d-4275-852e-aa9424ce4066" containerName="mariadb-database-create" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.489115 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="db35177e-78d7-4761-a3c9-cd6bfafce10b" containerName="mariadb-database-create" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.489136 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="46c8d995-9bc9-464f-b388-aebbef2024fd" containerName="mariadb-account-create-update" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.489149 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f" containerName="mariadb-account-create-update" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.489160 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="2616bfea-114e-4447-bed8-719c64679287" containerName="mariadb-account-create-update" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.489172 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="45a1ae25-65af-406f-8c00-4f7b228b2876" containerName="mariadb-database-create" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.490206 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pn6bt" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.495565 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.495904 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.496102 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-tjkpc" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.508656 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pn6bt"] Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.645364 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b22aedf-6076-4262-9607-2b26e09f77a0-config-data\") pod \"nova-cell0-conductor-db-sync-pn6bt\" (UID: \"0b22aedf-6076-4262-9607-2b26e09f77a0\") " pod="openstack/nova-cell0-conductor-db-sync-pn6bt" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.645514 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww4cb\" (UniqueName: \"kubernetes.io/projected/0b22aedf-6076-4262-9607-2b26e09f77a0-kube-api-access-ww4cb\") pod \"nova-cell0-conductor-db-sync-pn6bt\" (UID: \"0b22aedf-6076-4262-9607-2b26e09f77a0\") " pod="openstack/nova-cell0-conductor-db-sync-pn6bt" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.645556 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b22aedf-6076-4262-9607-2b26e09f77a0-scripts\") pod \"nova-cell0-conductor-db-sync-pn6bt\" (UID: \"0b22aedf-6076-4262-9607-2b26e09f77a0\") " pod="openstack/nova-cell0-conductor-db-sync-pn6bt" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.645597 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b22aedf-6076-4262-9607-2b26e09f77a0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pn6bt\" (UID: \"0b22aedf-6076-4262-9607-2b26e09f77a0\") " pod="openstack/nova-cell0-conductor-db-sync-pn6bt" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.748060 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww4cb\" (UniqueName: \"kubernetes.io/projected/0b22aedf-6076-4262-9607-2b26e09f77a0-kube-api-access-ww4cb\") pod \"nova-cell0-conductor-db-sync-pn6bt\" (UID: \"0b22aedf-6076-4262-9607-2b26e09f77a0\") " pod="openstack/nova-cell0-conductor-db-sync-pn6bt" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.748135 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b22aedf-6076-4262-9607-2b26e09f77a0-scripts\") pod \"nova-cell0-conductor-db-sync-pn6bt\" (UID: \"0b22aedf-6076-4262-9607-2b26e09f77a0\") " pod="openstack/nova-cell0-conductor-db-sync-pn6bt" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.748178 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b22aedf-6076-4262-9607-2b26e09f77a0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pn6bt\" (UID: 
\"0b22aedf-6076-4262-9607-2b26e09f77a0\") " pod="openstack/nova-cell0-conductor-db-sync-pn6bt" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.748292 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b22aedf-6076-4262-9607-2b26e09f77a0-config-data\") pod \"nova-cell0-conductor-db-sync-pn6bt\" (UID: \"0b22aedf-6076-4262-9607-2b26e09f77a0\") " pod="openstack/nova-cell0-conductor-db-sync-pn6bt" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.754175 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b22aedf-6076-4262-9607-2b26e09f77a0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-pn6bt\" (UID: \"0b22aedf-6076-4262-9607-2b26e09f77a0\") " pod="openstack/nova-cell0-conductor-db-sync-pn6bt" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.756565 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b22aedf-6076-4262-9607-2b26e09f77a0-config-data\") pod \"nova-cell0-conductor-db-sync-pn6bt\" (UID: \"0b22aedf-6076-4262-9607-2b26e09f77a0\") " pod="openstack/nova-cell0-conductor-db-sync-pn6bt" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.762291 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b22aedf-6076-4262-9607-2b26e09f77a0-scripts\") pod \"nova-cell0-conductor-db-sync-pn6bt\" (UID: \"0b22aedf-6076-4262-9607-2b26e09f77a0\") " pod="openstack/nova-cell0-conductor-db-sync-pn6bt" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.767119 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww4cb\" (UniqueName: \"kubernetes.io/projected/0b22aedf-6076-4262-9607-2b26e09f77a0-kube-api-access-ww4cb\") pod \"nova-cell0-conductor-db-sync-pn6bt\" (UID: \"0b22aedf-6076-4262-9607-2b26e09f77a0\") " pod="openstack/nova-cell0-conductor-db-sync-pn6bt" Dec 03 11:31:49 crc kubenswrapper[4702]: I1203 11:31:49.819621 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pn6bt" Dec 03 11:31:50 crc kubenswrapper[4702]: I1203 11:31:50.346082 4702 generic.go:334] "Generic (PLEG): container finished" podID="3ad54658-ece0-4731-a07c-3ba8cfb73693" containerID="6f19a085bbe53ccb297c996ab5745a0fe8db1aa865b3b0612be0c34715db636e" exitCode=0 Dec 03 11:31:50 crc kubenswrapper[4702]: I1203 11:31:50.347601 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3ad54658-ece0-4731-a07c-3ba8cfb73693","Type":"ContainerDied","Data":"6f19a085bbe53ccb297c996ab5745a0fe8db1aa865b3b0612be0c34715db636e"} Dec 03 11:31:50 crc kubenswrapper[4702]: I1203 11:31:50.678454 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pn6bt"] Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.191362 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.261770 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5lmx\" (UniqueName: \"kubernetes.io/projected/3ad54658-ece0-4731-a07c-3ba8cfb73693-kube-api-access-n5lmx\") pod \"3ad54658-ece0-4731-a07c-3ba8cfb73693\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.261890 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ad54658-ece0-4731-a07c-3ba8cfb73693-httpd-run\") pod \"3ad54658-ece0-4731-a07c-3ba8cfb73693\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.261992 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-scripts\") pod \"3ad54658-ece0-4731-a07c-3ba8cfb73693\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.262065 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ad54658-ece0-4731-a07c-3ba8cfb73693-logs\") pod \"3ad54658-ece0-4731-a07c-3ba8cfb73693\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.262122 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"3ad54658-ece0-4731-a07c-3ba8cfb73693\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.262156 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-combined-ca-bundle\") pod \"3ad54658-ece0-4731-a07c-3ba8cfb73693\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.262205 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-config-data\") pod \"3ad54658-ece0-4731-a07c-3ba8cfb73693\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.262294 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-public-tls-certs\") pod \"3ad54658-ece0-4731-a07c-3ba8cfb73693\" (UID: \"3ad54658-ece0-4731-a07c-3ba8cfb73693\") " Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.263733 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ad54658-ece0-4731-a07c-3ba8cfb73693-logs" (OuterVolumeSpecName: "logs") pod "3ad54658-ece0-4731-a07c-3ba8cfb73693" (UID: "3ad54658-ece0-4731-a07c-3ba8cfb73693"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.264548 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ad54658-ece0-4731-a07c-3ba8cfb73693-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3ad54658-ece0-4731-a07c-3ba8cfb73693" (UID: "3ad54658-ece0-4731-a07c-3ba8cfb73693"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.277804 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "3ad54658-ece0-4731-a07c-3ba8cfb73693" (UID: "3ad54658-ece0-4731-a07c-3ba8cfb73693"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.278204 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-scripts" (OuterVolumeSpecName: "scripts") pod "3ad54658-ece0-4731-a07c-3ba8cfb73693" (UID: "3ad54658-ece0-4731-a07c-3ba8cfb73693"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.295995 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ad54658-ece0-4731-a07c-3ba8cfb73693-kube-api-access-n5lmx" (OuterVolumeSpecName: "kube-api-access-n5lmx") pod "3ad54658-ece0-4731-a07c-3ba8cfb73693" (UID: "3ad54658-ece0-4731-a07c-3ba8cfb73693"). InnerVolumeSpecName "kube-api-access-n5lmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.341847 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ad54658-ece0-4731-a07c-3ba8cfb73693" (UID: "3ad54658-ece0-4731-a07c-3ba8cfb73693"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.366619 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5lmx\" (UniqueName: \"kubernetes.io/projected/3ad54658-ece0-4731-a07c-3ba8cfb73693-kube-api-access-n5lmx\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.366861 4702 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ad54658-ece0-4731-a07c-3ba8cfb73693-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.366942 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.367003 4702 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ad54658-ece0-4731-a07c-3ba8cfb73693-logs\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.367136 4702 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.367300 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.388889 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3ad54658-ece0-4731-a07c-3ba8cfb73693","Type":"ContainerDied","Data":"929fe1519dd4b9565952142c3eec16200f3f06878e30d4a85dfcf1ecf8008ca1"} Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.388951 4702 scope.go:117] "RemoveContainer" containerID="6f19a085bbe53ccb297c996ab5745a0fe8db1aa865b3b0612be0c34715db636e" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.389106 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.410645 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-config-data" (OuterVolumeSpecName: "config-data") pod "3ad54658-ece0-4731-a07c-3ba8cfb73693" (UID: "3ad54658-ece0-4731-a07c-3ba8cfb73693"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.411479 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3ad54658-ece0-4731-a07c-3ba8cfb73693" (UID: "3ad54658-ece0-4731-a07c-3ba8cfb73693"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.414392 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pn6bt" event={"ID":"0b22aedf-6076-4262-9607-2b26e09f77a0","Type":"ContainerStarted","Data":"b590e125d26729f78258e79dde9ba35cd49d8b78ef36fc4d6fa6fb240c7becd7"} Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.428414 4702 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.585651 4702 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.586749 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.586799 4702 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3ad54658-ece0-4731-a07c-3ba8cfb73693-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.604819 4702 scope.go:117] "RemoveContainer" containerID="d163f2220739998c6faa435f3851d3ed49b8280d12a98ffc9374079f638ef985" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.822599 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.853307 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.878826 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 11:31:51 crc kubenswrapper[4702]: E1203 11:31:51.880303 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad54658-ece0-4731-a07c-3ba8cfb73693" containerName="glance-log" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.880335 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad54658-ece0-4731-a07c-3ba8cfb73693" containerName="glance-log" Dec 03 11:31:51 crc kubenswrapper[4702]: E1203 11:31:51.880410 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad54658-ece0-4731-a07c-3ba8cfb73693" containerName="glance-httpd" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.880422 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad54658-ece0-4731-a07c-3ba8cfb73693" containerName="glance-httpd" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.881352 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ad54658-ece0-4731-a07c-3ba8cfb73693" containerName="glance-log" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.881390 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ad54658-ece0-4731-a07c-3ba8cfb73693" containerName="glance-httpd" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.887047 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.894245 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.895893 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 11:31:51 crc kubenswrapper[4702]: I1203 11:31:51.913103 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.010496 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eafb11cf-a4c9-4744-822d-6ccefe624f89-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.010572 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eafb11cf-a4c9-4744-822d-6ccefe624f89-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.012363 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5xsw\" (UniqueName: \"kubernetes.io/projected/eafb11cf-a4c9-4744-822d-6ccefe624f89-kube-api-access-p5xsw\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.012498 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eafb11cf-a4c9-4744-822d-6ccefe624f89-logs\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.012910 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.012947 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eafb11cf-a4c9-4744-822d-6ccefe624f89-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.012977 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eafb11cf-a4c9-4744-822d-6ccefe624f89-config-data\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.013068 4702 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eafb11cf-a4c9-4744-822d-6ccefe624f89-scripts\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.115821 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.115877 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eafb11cf-a4c9-4744-822d-6ccefe624f89-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.115909 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eafb11cf-a4c9-4744-822d-6ccefe624f89-config-data\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.116020 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eafb11cf-a4c9-4744-822d-6ccefe624f89-scripts\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.116113 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eafb11cf-a4c9-4744-822d-6ccefe624f89-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.116186 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eafb11cf-a4c9-4744-822d-6ccefe624f89-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.116261 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.117195 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/eafb11cf-a4c9-4744-822d-6ccefe624f89-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.117498 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5xsw\" (UniqueName: 
\"kubernetes.io/projected/eafb11cf-a4c9-4744-822d-6ccefe624f89-kube-api-access-p5xsw\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.117609 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eafb11cf-a4c9-4744-822d-6ccefe624f89-logs\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.118099 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eafb11cf-a4c9-4744-822d-6ccefe624f89-logs\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.141227 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eafb11cf-a4c9-4744-822d-6ccefe624f89-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.184838 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eafb11cf-a4c9-4744-822d-6ccefe624f89-config-data\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.192770 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eafb11cf-a4c9-4744-822d-6ccefe624f89-scripts\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.207075 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eafb11cf-a4c9-4744-822d-6ccefe624f89-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.212103 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5xsw\" (UniqueName: \"kubernetes.io/projected/eafb11cf-a4c9-4744-822d-6ccefe624f89-kube-api-access-p5xsw\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.243338 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"eafb11cf-a4c9-4744-822d-6ccefe624f89\") " pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.536288 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 11:31:52 crc kubenswrapper[4702]: I1203 11:31:52.991647 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ad54658-ece0-4731-a07c-3ba8cfb73693" path="/var/lib/kubelet/pods/3ad54658-ece0-4731-a07c-3ba8cfb73693/volumes" Dec 03 11:31:53 crc kubenswrapper[4702]: I1203 11:31:53.207608 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 11:31:53 crc kubenswrapper[4702]: W1203 11:31:53.224581 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeafb11cf_a4c9_4744_822d_6ccefe624f89.slice/crio-41ab03af33bd14b57952dc21ce0845cd28c78dc22c140e0b408b78ea4f7766d0 WatchSource:0}: Error finding container 41ab03af33bd14b57952dc21ce0845cd28c78dc22c140e0b408b78ea4f7766d0: Status 404 returned error can't find the container with id 41ab03af33bd14b57952dc21ce0845cd28c78dc22c140e0b408b78ea4f7766d0 Dec 03 11:31:53 crc kubenswrapper[4702]: I1203 11:31:53.500173 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eafb11cf-a4c9-4744-822d-6ccefe624f89","Type":"ContainerStarted","Data":"41ab03af33bd14b57952dc21ce0845cd28c78dc22c140e0b408b78ea4f7766d0"} Dec 03 11:31:54 crc kubenswrapper[4702]: I1203 11:31:54.250269 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 11:31:54 crc kubenswrapper[4702]: I1203 11:31:54.259559 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 11:31:54 crc kubenswrapper[4702]: I1203 11:31:54.323900 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 11:31:54 crc kubenswrapper[4702]: I1203 11:31:54.323963 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 11:31:54 crc kubenswrapper[4702]: I1203 11:31:54.524860 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eafb11cf-a4c9-4744-822d-6ccefe624f89","Type":"ContainerStarted","Data":"d4436c7cbfa9aa73f074b572a049a29f9c031926be6eb93a78956b0b6f714815"} Dec 03 11:31:54 crc kubenswrapper[4702]: I1203 11:31:54.524925 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 11:31:54 crc kubenswrapper[4702]: I1203 11:31:54.525075 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 11:31:55 crc kubenswrapper[4702]: I1203 11:31:55.562440 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"eafb11cf-a4c9-4744-822d-6ccefe624f89","Type":"ContainerStarted","Data":"596cb61de98cb74e1cea184b720ad9b18a7021d253d41633693a23d7a1c5c9b2"} Dec 03 11:31:55 crc kubenswrapper[4702]: I1203 11:31:55.610086 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.610055542 podStartE2EDuration="4.610055542s" podCreationTimestamp="2025-12-03 11:31:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:31:55.601694294 +0000 UTC m=+1699.437622778" 
watchObservedRunningTime="2025-12-03 11:31:55.610055542 +0000 UTC m=+1699.445984006" Dec 03 11:31:55 crc kubenswrapper[4702]: I1203 11:31:55.910637 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:31:55 crc kubenswrapper[4702]: I1203 11:31:55.911278 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:31:56 crc kubenswrapper[4702]: I1203 11:31:56.573044 4702 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 11:31:56 crc kubenswrapper[4702]: I1203 11:31:56.573081 4702 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 11:31:58 crc kubenswrapper[4702]: I1203 11:31:58.534550 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 11:31:58 crc kubenswrapper[4702]: I1203 11:31:58.535447 4702 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 11:31:58 crc kubenswrapper[4702]: I1203 11:31:58.601723 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 11:32:01 crc kubenswrapper[4702]: I1203 11:32:01.155374 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:32:01 crc kubenswrapper[4702]: I1203 11:32:01.156609 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9baeea95-5794-4b3a-b239-580dc8391f6d" containerName="ceilometer-central-agent" containerID="cri-o://2c24a2fd52bab6bfe68a8534d54940ead8a1fdc2390be608c0d6cf45ab595657" gracePeriod=30 Dec 03 11:32:01 crc kubenswrapper[4702]: I1203 11:32:01.157603 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9baeea95-5794-4b3a-b239-580dc8391f6d" containerName="proxy-httpd" containerID="cri-o://14c14628f232955e873db6a71c3df7d74208bb721bb5fe4b85899b5b0b76e0d3" gracePeriod=30 Dec 03 11:32:01 crc kubenswrapper[4702]: I1203 11:32:01.157659 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9baeea95-5794-4b3a-b239-580dc8391f6d" containerName="ceilometer-notification-agent" containerID="cri-o://ac829dd26e46bb8b73f2632aa8f67ebca3cf41405a73c6c61b4b44ad8d9cc5a0" gracePeriod=30 Dec 03 11:32:01 crc kubenswrapper[4702]: I1203 11:32:01.157638 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9baeea95-5794-4b3a-b239-580dc8391f6d" containerName="sg-core" containerID="cri-o://24498e10dbff824b24dc87479b6dcbdc2105e1f37f3015aaa76295eb800edab0" gracePeriod=30 Dec 03 11:32:01 crc kubenswrapper[4702]: I1203 11:32:01.170258 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="9baeea95-5794-4b3a-b239-580dc8391f6d" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.227:3000/\": EOF" Dec 03 11:32:02 crc kubenswrapper[4702]: I1203 11:32:02.537154 4702 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 11:32:02 crc kubenswrapper[4702]: I1203 11:32:02.537562 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 11:32:02 crc kubenswrapper[4702]: I1203 11:32:02.627312 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 11:32:02 crc kubenswrapper[4702]: I1203 11:32:02.634539 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 11:32:02 crc kubenswrapper[4702]: I1203 11:32:02.715265 4702 generic.go:334] "Generic (PLEG): container finished" podID="9baeea95-5794-4b3a-b239-580dc8391f6d" containerID="14c14628f232955e873db6a71c3df7d74208bb721bb5fe4b85899b5b0b76e0d3" exitCode=0 Dec 03 11:32:02 crc kubenswrapper[4702]: I1203 11:32:02.715315 4702 generic.go:334] "Generic (PLEG): container finished" podID="9baeea95-5794-4b3a-b239-580dc8391f6d" containerID="24498e10dbff824b24dc87479b6dcbdc2105e1f37f3015aaa76295eb800edab0" exitCode=2 Dec 03 11:32:02 crc kubenswrapper[4702]: I1203 11:32:02.715326 4702 generic.go:334] "Generic (PLEG): container finished" podID="9baeea95-5794-4b3a-b239-580dc8391f6d" containerID="ac829dd26e46bb8b73f2632aa8f67ebca3cf41405a73c6c61b4b44ad8d9cc5a0" exitCode=0 Dec 03 11:32:02 crc kubenswrapper[4702]: I1203 11:32:02.715334 4702 generic.go:334] "Generic (PLEG): container finished" podID="9baeea95-5794-4b3a-b239-580dc8391f6d" containerID="2c24a2fd52bab6bfe68a8534d54940ead8a1fdc2390be608c0d6cf45ab595657" exitCode=0 Dec 03 11:32:02 crc kubenswrapper[4702]: I1203 11:32:02.717273 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9baeea95-5794-4b3a-b239-580dc8391f6d","Type":"ContainerDied","Data":"14c14628f232955e873db6a71c3df7d74208bb721bb5fe4b85899b5b0b76e0d3"} Dec 03 11:32:02 crc kubenswrapper[4702]: I1203 11:32:02.717325 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 11:32:02 crc kubenswrapper[4702]: I1203 11:32:02.717340 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9baeea95-5794-4b3a-b239-580dc8391f6d","Type":"ContainerDied","Data":"24498e10dbff824b24dc87479b6dcbdc2105e1f37f3015aaa76295eb800edab0"} Dec 03 11:32:02 crc kubenswrapper[4702]: I1203 11:32:02.717351 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9baeea95-5794-4b3a-b239-580dc8391f6d","Type":"ContainerDied","Data":"ac829dd26e46bb8b73f2632aa8f67ebca3cf41405a73c6c61b4b44ad8d9cc5a0"} Dec 03 11:32:02 crc kubenswrapper[4702]: I1203 11:32:02.717364 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9baeea95-5794-4b3a-b239-580dc8391f6d","Type":"ContainerDied","Data":"2c24a2fd52bab6bfe68a8534d54940ead8a1fdc2390be608c0d6cf45ab595657"} Dec 03 11:32:02 crc kubenswrapper[4702]: I1203 11:32:02.717487 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 11:32:04 crc kubenswrapper[4702]: I1203 11:32:04.749242 4702 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 11:32:04 crc kubenswrapper[4702]: I1203 11:32:04.749563 4702 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 11:32:05 crc kubenswrapper[4702]: I1203 
11:32:05.439382 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 11:32:05 crc kubenswrapper[4702]: I1203 11:32:05.676357 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 11:32:08 crc kubenswrapper[4702]: E1203 11:32:08.221283 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified" Dec 03 11:32:08 crc kubenswrapper[4702]: E1203 11:32:08.222156 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nova-cell0-conductor-db-sync,Image:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CELL_NAME,Value:cell0,ValueFrom:nil,},EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:false,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/kolla/config_files/config.json,SubPath:nova-conductor-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ww4cb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42436,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-cell0-conductor-db-sync-pn6bt_openstack(0b22aedf-6076-4262-9607-2b26e09f77a0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:32:08 crc kubenswrapper[4702]: E1203 11:32:08.223443 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/nova-cell0-conductor-db-sync-pn6bt" podUID="0b22aedf-6076-4262-9607-2b26e09f77a0" Dec 03 11:32:08 crc kubenswrapper[4702]: E1203 11:32:08.861402 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified\\\"\"" pod="openstack/nova-cell0-conductor-db-sync-pn6bt" podUID="0b22aedf-6076-4262-9607-2b26e09f77a0" Dec 03 11:32:08 crc kubenswrapper[4702]: I1203 11:32:08.973009 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.002841 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9baeea95-5794-4b3a-b239-580dc8391f6d-log-httpd\") pod \"9baeea95-5794-4b3a-b239-580dc8391f6d\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.003407 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-scripts\") pod \"9baeea95-5794-4b3a-b239-580dc8391f6d\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.003486 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9baeea95-5794-4b3a-b239-580dc8391f6d-run-httpd\") pod \"9baeea95-5794-4b3a-b239-580dc8391f6d\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.003538 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqxdp\" (UniqueName: \"kubernetes.io/projected/9baeea95-5794-4b3a-b239-580dc8391f6d-kube-api-access-nqxdp\") pod \"9baeea95-5794-4b3a-b239-580dc8391f6d\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.003573 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-combined-ca-bundle\") pod \"9baeea95-5794-4b3a-b239-580dc8391f6d\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.003601 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-config-data\") pod \"9baeea95-5794-4b3a-b239-580dc8391f6d\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.003663 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-sg-core-conf-yaml\") pod \"9baeea95-5794-4b3a-b239-580dc8391f6d\" (UID: \"9baeea95-5794-4b3a-b239-580dc8391f6d\") " Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.004095 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9baeea95-5794-4b3a-b239-580dc8391f6d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9baeea95-5794-4b3a-b239-580dc8391f6d" (UID: "9baeea95-5794-4b3a-b239-580dc8391f6d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.005034 4702 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9baeea95-5794-4b3a-b239-580dc8391f6d-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.011716 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9baeea95-5794-4b3a-b239-580dc8391f6d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9baeea95-5794-4b3a-b239-580dc8391f6d" (UID: "9baeea95-5794-4b3a-b239-580dc8391f6d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.029253 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9baeea95-5794-4b3a-b239-580dc8391f6d-kube-api-access-nqxdp" (OuterVolumeSpecName: "kube-api-access-nqxdp") pod "9baeea95-5794-4b3a-b239-580dc8391f6d" (UID: "9baeea95-5794-4b3a-b239-580dc8391f6d"). InnerVolumeSpecName "kube-api-access-nqxdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.035969 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-scripts" (OuterVolumeSpecName: "scripts") pod "9baeea95-5794-4b3a-b239-580dc8391f6d" (UID: "9baeea95-5794-4b3a-b239-580dc8391f6d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.108201 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqxdp\" (UniqueName: \"kubernetes.io/projected/9baeea95-5794-4b3a-b239-580dc8391f6d-kube-api-access-nqxdp\") on node \"crc\" DevicePath \"\"" Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.108253 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.108265 4702 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9baeea95-5794-4b3a-b239-580dc8391f6d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.117482 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9baeea95-5794-4b3a-b239-580dc8391f6d" (UID: "9baeea95-5794-4b3a-b239-580dc8391f6d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.186970 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9baeea95-5794-4b3a-b239-580dc8391f6d" (UID: "9baeea95-5794-4b3a-b239-580dc8391f6d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.194329 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-config-data" (OuterVolumeSpecName: "config-data") pod "9baeea95-5794-4b3a-b239-580dc8391f6d" (UID: "9baeea95-5794-4b3a-b239-580dc8391f6d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.210545 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.210875 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.210960 4702 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9baeea95-5794-4b3a-b239-580dc8391f6d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.874193 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9baeea95-5794-4b3a-b239-580dc8391f6d","Type":"ContainerDied","Data":"716e12d4ff7fa429dd23fc15e91cbf271570f54c07bf0b39102d5695788ffaf7"} Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.874263 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.874496 4702 scope.go:117] "RemoveContainer" containerID="14c14628f232955e873db6a71c3df7d74208bb721bb5fe4b85899b5b0b76e0d3" Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.938199 4702 scope.go:117] "RemoveContainer" containerID="24498e10dbff824b24dc87479b6dcbdc2105e1f37f3015aaa76295eb800edab0" Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.949211 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:32:09 crc kubenswrapper[4702]: I1203 11:32:09.995786 4702 scope.go:117] "RemoveContainer" containerID="ac829dd26e46bb8b73f2632aa8f67ebca3cf41405a73c6c61b4b44ad8d9cc5a0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.003650 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.017587 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:32:10 crc kubenswrapper[4702]: E1203 11:32:10.018453 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9baeea95-5794-4b3a-b239-580dc8391f6d" containerName="ceilometer-central-agent" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.018490 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="9baeea95-5794-4b3a-b239-580dc8391f6d" containerName="ceilometer-central-agent" Dec 03 11:32:10 crc kubenswrapper[4702]: E1203 11:32:10.018511 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9baeea95-5794-4b3a-b239-580dc8391f6d" containerName="ceilometer-notification-agent" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.018520 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="9baeea95-5794-4b3a-b239-580dc8391f6d" 
containerName="ceilometer-notification-agent" Dec 03 11:32:10 crc kubenswrapper[4702]: E1203 11:32:10.018541 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9baeea95-5794-4b3a-b239-580dc8391f6d" containerName="proxy-httpd" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.018548 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="9baeea95-5794-4b3a-b239-580dc8391f6d" containerName="proxy-httpd" Dec 03 11:32:10 crc kubenswrapper[4702]: E1203 11:32:10.018570 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9baeea95-5794-4b3a-b239-580dc8391f6d" containerName="sg-core" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.018577 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="9baeea95-5794-4b3a-b239-580dc8391f6d" containerName="sg-core" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.018957 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="9baeea95-5794-4b3a-b239-580dc8391f6d" containerName="ceilometer-notification-agent" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.019004 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="9baeea95-5794-4b3a-b239-580dc8391f6d" containerName="sg-core" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.019021 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="9baeea95-5794-4b3a-b239-580dc8391f6d" containerName="ceilometer-central-agent" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.019032 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="9baeea95-5794-4b3a-b239-580dc8391f6d" containerName="proxy-httpd" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.022898 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.026579 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.026882 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.033470 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.047112 4702 scope.go:117] "RemoveContainer" containerID="2c24a2fd52bab6bfe68a8534d54940ead8a1fdc2390be608c0d6cf45ab595657" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.054511 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2xmz\" (UniqueName: \"kubernetes.io/projected/2faeaa5e-c3df-4281-a873-8a095bd1293e-kube-api-access-n2xmz\") pod \"ceilometer-0\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.054691 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-scripts\") pod \"ceilometer-0\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.054793 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-config-data\") pod \"ceilometer-0\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " 
pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.054852 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.054978 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2faeaa5e-c3df-4281-a873-8a095bd1293e-log-httpd\") pod \"ceilometer-0\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.055040 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2faeaa5e-c3df-4281-a873-8a095bd1293e-run-httpd\") pod \"ceilometer-0\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.055165 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.163015 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2xmz\" (UniqueName: \"kubernetes.io/projected/2faeaa5e-c3df-4281-a873-8a095bd1293e-kube-api-access-n2xmz\") pod \"ceilometer-0\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.163174 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-scripts\") pod \"ceilometer-0\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.163223 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-config-data\") pod \"ceilometer-0\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.163275 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.163423 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2faeaa5e-c3df-4281-a873-8a095bd1293e-log-httpd\") pod \"ceilometer-0\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.163460 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2faeaa5e-c3df-4281-a873-8a095bd1293e-run-httpd\") pod 
\"ceilometer-0\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.163591 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.164415 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2faeaa5e-c3df-4281-a873-8a095bd1293e-log-httpd\") pod \"ceilometer-0\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.165486 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2faeaa5e-c3df-4281-a873-8a095bd1293e-run-httpd\") pod \"ceilometer-0\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.177949 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.178043 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.178386 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-scripts\") pod \"ceilometer-0\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.178654 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-config-data\") pod \"ceilometer-0\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.184919 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2xmz\" (UniqueName: \"kubernetes.io/projected/2faeaa5e-c3df-4281-a873-8a095bd1293e-kube-api-access-n2xmz\") pod \"ceilometer-0\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.357970 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:32:10 crc kubenswrapper[4702]: I1203 11:32:10.943341 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9baeea95-5794-4b3a-b239-580dc8391f6d" path="/var/lib/kubelet/pods/9baeea95-5794-4b3a-b239-580dc8391f6d/volumes" Dec 03 11:32:11 crc kubenswrapper[4702]: W1203 11:32:11.020694 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2faeaa5e_c3df_4281_a873_8a095bd1293e.slice/crio-4ad1abe3c59ea0fe06b250c566198c41ca716302eb8b594da47800846c2745d8 WatchSource:0}: Error finding container 4ad1abe3c59ea0fe06b250c566198c41ca716302eb8b594da47800846c2745d8: Status 404 returned error can't find the container with id 4ad1abe3c59ea0fe06b250c566198c41ca716302eb8b594da47800846c2745d8 Dec 03 11:32:11 crc kubenswrapper[4702]: I1203 11:32:11.023275 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:32:12 crc kubenswrapper[4702]: I1203 11:32:12.160863 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2faeaa5e-c3df-4281-a873-8a095bd1293e","Type":"ContainerStarted","Data":"856af2ba2a0f35a75554801b391ad115f72e2d4802fbed67d6b2dfbef7e1bf49"} Dec 03 11:32:12 crc kubenswrapper[4702]: I1203 11:32:12.161200 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2faeaa5e-c3df-4281-a873-8a095bd1293e","Type":"ContainerStarted","Data":"4ad1abe3c59ea0fe06b250c566198c41ca716302eb8b594da47800846c2745d8"} Dec 03 11:32:13 crc kubenswrapper[4702]: I1203 11:32:13.200603 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2faeaa5e-c3df-4281-a873-8a095bd1293e","Type":"ContainerStarted","Data":"6fa39c50232f6a83d985d448358afdcc6a6af19ff7261017e61a5d61edf6dfd5"} Dec 03 11:32:14 crc kubenswrapper[4702]: I1203 11:32:14.222529 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2faeaa5e-c3df-4281-a873-8a095bd1293e","Type":"ContainerStarted","Data":"45e831ea310c527cf6dc84884258110eadeccb2b6f385d72d878fcabf15aaedd"} Dec 03 11:32:16 crc kubenswrapper[4702]: I1203 11:32:16.266855 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2faeaa5e-c3df-4281-a873-8a095bd1293e","Type":"ContainerStarted","Data":"549315d90fdd325154c8fe03c4533c06e22dcf6947f50114681c71deb56d3a83"} Dec 03 11:32:16 crc kubenswrapper[4702]: I1203 11:32:16.267596 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 11:32:16 crc kubenswrapper[4702]: I1203 11:32:16.307175 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.146584676 podStartE2EDuration="7.307143611s" podCreationTimestamp="2025-12-03 11:32:09 +0000 UTC" firstStartedPulling="2025-12-03 11:32:11.023803796 +0000 UTC m=+1714.859732260" lastFinishedPulling="2025-12-03 11:32:15.184362731 +0000 UTC m=+1719.020291195" observedRunningTime="2025-12-03 11:32:16.295076827 +0000 UTC m=+1720.131005311" watchObservedRunningTime="2025-12-03 11:32:16.307143611 +0000 UTC m=+1720.143072075" Dec 03 11:32:22 crc kubenswrapper[4702]: I1203 11:32:22.351475 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pn6bt" 
event={"ID":"0b22aedf-6076-4262-9607-2b26e09f77a0","Type":"ContainerStarted","Data":"79e75be8f30d2d00ac399bc3dde03ba15783529297fb25d86097e04ecd0f91d4"} Dec 03 11:32:22 crc kubenswrapper[4702]: I1203 11:32:22.370260 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-pn6bt" podStartSLOduration=2.607413805 podStartE2EDuration="33.370230794s" podCreationTimestamp="2025-12-03 11:31:49 +0000 UTC" firstStartedPulling="2025-12-03 11:31:50.692898867 +0000 UTC m=+1694.528827331" lastFinishedPulling="2025-12-03 11:32:21.455715866 +0000 UTC m=+1725.291644320" observedRunningTime="2025-12-03 11:32:22.365711695 +0000 UTC m=+1726.201640159" watchObservedRunningTime="2025-12-03 11:32:22.370230794 +0000 UTC m=+1726.206159278" Dec 03 11:32:25 crc kubenswrapper[4702]: I1203 11:32:25.907789 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:32:25 crc kubenswrapper[4702]: I1203 11:32:25.908384 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:32:25 crc kubenswrapper[4702]: I1203 11:32:25.908445 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:32:25 crc kubenswrapper[4702]: I1203 11:32:25.909477 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03"} pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 11:32:25 crc kubenswrapper[4702]: I1203 11:32:25.909590 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" containerID="cri-o://ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" gracePeriod=600 Dec 03 11:32:26 crc kubenswrapper[4702]: I1203 11:32:26.404056 4702 generic.go:334] "Generic (PLEG): container finished" podID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" exitCode=0 Dec 03 11:32:26 crc kubenswrapper[4702]: I1203 11:32:26.404123 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerDied","Data":"ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03"} Dec 03 11:32:26 crc kubenswrapper[4702]: I1203 11:32:26.404175 4702 scope.go:117] "RemoveContainer" containerID="51258198d84bd78a94c0b0549a633061f0278b4c24ceaa27b8c81a77f3277a36" Dec 03 11:32:27 crc kubenswrapper[4702]: E1203 11:32:27.137886 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:32:27 crc kubenswrapper[4702]: I1203 11:32:27.419885 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:32:27 crc kubenswrapper[4702]: E1203 11:32:27.420301 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:32:29 crc kubenswrapper[4702]: I1203 11:32:29.051570 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:32:29 crc kubenswrapper[4702]: I1203 11:32:29.052503 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2faeaa5e-c3df-4281-a873-8a095bd1293e" containerName="ceilometer-central-agent" containerID="cri-o://856af2ba2a0f35a75554801b391ad115f72e2d4802fbed67d6b2dfbef7e1bf49" gracePeriod=30 Dec 03 11:32:29 crc kubenswrapper[4702]: I1203 11:32:29.052659 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2faeaa5e-c3df-4281-a873-8a095bd1293e" containerName="proxy-httpd" containerID="cri-o://549315d90fdd325154c8fe03c4533c06e22dcf6947f50114681c71deb56d3a83" gracePeriod=30 Dec 03 11:32:29 crc kubenswrapper[4702]: I1203 11:32:29.052707 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2faeaa5e-c3df-4281-a873-8a095bd1293e" containerName="sg-core" containerID="cri-o://45e831ea310c527cf6dc84884258110eadeccb2b6f385d72d878fcabf15aaedd" gracePeriod=30 Dec 03 11:32:29 crc kubenswrapper[4702]: I1203 11:32:29.052736 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2faeaa5e-c3df-4281-a873-8a095bd1293e" containerName="ceilometer-notification-agent" containerID="cri-o://6fa39c50232f6a83d985d448358afdcc6a6af19ff7261017e61a5d61edf6dfd5" gracePeriod=30 Dec 03 11:32:29 crc kubenswrapper[4702]: I1203 11:32:29.069153 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="2faeaa5e-c3df-4281-a873-8a095bd1293e" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.231:3000/\": EOF" Dec 03 11:32:29 crc kubenswrapper[4702]: I1203 11:32:29.454956 4702 generic.go:334] "Generic (PLEG): container finished" podID="2faeaa5e-c3df-4281-a873-8a095bd1293e" containerID="45e831ea310c527cf6dc84884258110eadeccb2b6f385d72d878fcabf15aaedd" exitCode=2 Dec 03 11:32:29 crc kubenswrapper[4702]: I1203 11:32:29.455009 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2faeaa5e-c3df-4281-a873-8a095bd1293e","Type":"ContainerDied","Data":"45e831ea310c527cf6dc84884258110eadeccb2b6f385d72d878fcabf15aaedd"} Dec 03 11:32:30 crc kubenswrapper[4702]: I1203 11:32:30.639502 4702 generic.go:334] "Generic (PLEG): container finished" podID="2faeaa5e-c3df-4281-a873-8a095bd1293e" 
containerID="549315d90fdd325154c8fe03c4533c06e22dcf6947f50114681c71deb56d3a83" exitCode=0 Dec 03 11:32:30 crc kubenswrapper[4702]: I1203 11:32:30.640990 4702 generic.go:334] "Generic (PLEG): container finished" podID="2faeaa5e-c3df-4281-a873-8a095bd1293e" containerID="856af2ba2a0f35a75554801b391ad115f72e2d4802fbed67d6b2dfbef7e1bf49" exitCode=0 Dec 03 11:32:30 crc kubenswrapper[4702]: I1203 11:32:30.641653 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2faeaa5e-c3df-4281-a873-8a095bd1293e","Type":"ContainerDied","Data":"549315d90fdd325154c8fe03c4533c06e22dcf6947f50114681c71deb56d3a83"} Dec 03 11:32:30 crc kubenswrapper[4702]: I1203 11:32:30.641871 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2faeaa5e-c3df-4281-a873-8a095bd1293e","Type":"ContainerDied","Data":"856af2ba2a0f35a75554801b391ad115f72e2d4802fbed67d6b2dfbef7e1bf49"} Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.343315 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-rl959"] Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.346442 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-rl959" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.366258 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-rl959"] Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.461202 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-5b4f-account-create-update-x275r"] Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.463188 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-5b4f-account-create-update-x275r" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.465482 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.479202 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-5b4f-account-create-update-x275r"] Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.529863 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7tlk\" (UniqueName: \"kubernetes.io/projected/20ad57fb-2b09-47ee-9352-989843fd2b29-kube-api-access-d7tlk\") pod \"aodh-db-create-rl959\" (UID: \"20ad57fb-2b09-47ee-9352-989843fd2b29\") " pod="openstack/aodh-db-create-rl959" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.529951 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20ad57fb-2b09-47ee-9352-989843fd2b29-operator-scripts\") pod \"aodh-db-create-rl959\" (UID: \"20ad57fb-2b09-47ee-9352-989843fd2b29\") " pod="openstack/aodh-db-create-rl959" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.542024 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.631841 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-sg-core-conf-yaml\") pod \"2faeaa5e-c3df-4281-a873-8a095bd1293e\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.631955 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-config-data\") pod \"2faeaa5e-c3df-4281-a873-8a095bd1293e\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.632017 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2xmz\" (UniqueName: \"kubernetes.io/projected/2faeaa5e-c3df-4281-a873-8a095bd1293e-kube-api-access-n2xmz\") pod \"2faeaa5e-c3df-4281-a873-8a095bd1293e\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.632061 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-combined-ca-bundle\") pod \"2faeaa5e-c3df-4281-a873-8a095bd1293e\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.632224 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2faeaa5e-c3df-4281-a873-8a095bd1293e-log-httpd\") pod \"2faeaa5e-c3df-4281-a873-8a095bd1293e\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.632275 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2faeaa5e-c3df-4281-a873-8a095bd1293e-run-httpd\") pod \"2faeaa5e-c3df-4281-a873-8a095bd1293e\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.632293 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-scripts\") pod \"2faeaa5e-c3df-4281-a873-8a095bd1293e\" (UID: \"2faeaa5e-c3df-4281-a873-8a095bd1293e\") " Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.632824 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18020584-2205-4e9e-a713-79af70a8a84b-operator-scripts\") pod \"aodh-5b4f-account-create-update-x275r\" (UID: \"18020584-2205-4e9e-a713-79af70a8a84b\") " pod="openstack/aodh-5b4f-account-create-update-x275r" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.632872 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw848\" (UniqueName: \"kubernetes.io/projected/18020584-2205-4e9e-a713-79af70a8a84b-kube-api-access-xw848\") pod \"aodh-5b4f-account-create-update-x275r\" (UID: \"18020584-2205-4e9e-a713-79af70a8a84b\") " pod="openstack/aodh-5b4f-account-create-update-x275r" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.632910 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7tlk\" 
(UniqueName: \"kubernetes.io/projected/20ad57fb-2b09-47ee-9352-989843fd2b29-kube-api-access-d7tlk\") pod \"aodh-db-create-rl959\" (UID: \"20ad57fb-2b09-47ee-9352-989843fd2b29\") " pod="openstack/aodh-db-create-rl959" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.632955 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20ad57fb-2b09-47ee-9352-989843fd2b29-operator-scripts\") pod \"aodh-db-create-rl959\" (UID: \"20ad57fb-2b09-47ee-9352-989843fd2b29\") " pod="openstack/aodh-db-create-rl959" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.633827 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20ad57fb-2b09-47ee-9352-989843fd2b29-operator-scripts\") pod \"aodh-db-create-rl959\" (UID: \"20ad57fb-2b09-47ee-9352-989843fd2b29\") " pod="openstack/aodh-db-create-rl959" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.639636 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2faeaa5e-c3df-4281-a873-8a095bd1293e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2faeaa5e-c3df-4281-a873-8a095bd1293e" (UID: "2faeaa5e-c3df-4281-a873-8a095bd1293e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.639844 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2faeaa5e-c3df-4281-a873-8a095bd1293e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2faeaa5e-c3df-4281-a873-8a095bd1293e" (UID: "2faeaa5e-c3df-4281-a873-8a095bd1293e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.654679 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2faeaa5e-c3df-4281-a873-8a095bd1293e-kube-api-access-n2xmz" (OuterVolumeSpecName: "kube-api-access-n2xmz") pod "2faeaa5e-c3df-4281-a873-8a095bd1293e" (UID: "2faeaa5e-c3df-4281-a873-8a095bd1293e"). InnerVolumeSpecName "kube-api-access-n2xmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.657543 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-scripts" (OuterVolumeSpecName: "scripts") pod "2faeaa5e-c3df-4281-a873-8a095bd1293e" (UID: "2faeaa5e-c3df-4281-a873-8a095bd1293e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.659304 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7tlk\" (UniqueName: \"kubernetes.io/projected/20ad57fb-2b09-47ee-9352-989843fd2b29-kube-api-access-d7tlk\") pod \"aodh-db-create-rl959\" (UID: \"20ad57fb-2b09-47ee-9352-989843fd2b29\") " pod="openstack/aodh-db-create-rl959" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.695103 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2faeaa5e-c3df-4281-a873-8a095bd1293e" (UID: "2faeaa5e-c3df-4281-a873-8a095bd1293e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.701557 4702 generic.go:334] "Generic (PLEG): container finished" podID="2faeaa5e-c3df-4281-a873-8a095bd1293e" containerID="6fa39c50232f6a83d985d448358afdcc6a6af19ff7261017e61a5d61edf6dfd5" exitCode=0 Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.701617 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2faeaa5e-c3df-4281-a873-8a095bd1293e","Type":"ContainerDied","Data":"6fa39c50232f6a83d985d448358afdcc6a6af19ff7261017e61a5d61edf6dfd5"} Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.701791 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2faeaa5e-c3df-4281-a873-8a095bd1293e","Type":"ContainerDied","Data":"4ad1abe3c59ea0fe06b250c566198c41ca716302eb8b594da47800846c2745d8"} Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.701821 4702 scope.go:117] "RemoveContainer" containerID="549315d90fdd325154c8fe03c4533c06e22dcf6947f50114681c71deb56d3a83" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.701682 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.735826 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18020584-2205-4e9e-a713-79af70a8a84b-operator-scripts\") pod \"aodh-5b4f-account-create-update-x275r\" (UID: \"18020584-2205-4e9e-a713-79af70a8a84b\") " pod="openstack/aodh-5b4f-account-create-update-x275r" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.736730 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw848\" (UniqueName: \"kubernetes.io/projected/18020584-2205-4e9e-a713-79af70a8a84b-kube-api-access-xw848\") pod \"aodh-5b4f-account-create-update-x275r\" (UID: \"18020584-2205-4e9e-a713-79af70a8a84b\") " pod="openstack/aodh-5b4f-account-create-update-x275r" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.736961 4702 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2faeaa5e-c3df-4281-a873-8a095bd1293e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.737044 4702 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2faeaa5e-c3df-4281-a873-8a095bd1293e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.737112 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.737177 4702 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.737272 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2xmz\" (UniqueName: \"kubernetes.io/projected/2faeaa5e-c3df-4281-a873-8a095bd1293e-kube-api-access-n2xmz\") on node \"crc\" DevicePath \"\"" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.737228 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18020584-2205-4e9e-a713-79af70a8a84b-operator-scripts\") pod \"aodh-5b4f-account-create-update-x275r\" (UID: \"18020584-2205-4e9e-a713-79af70a8a84b\") " pod="openstack/aodh-5b4f-account-create-update-x275r" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.786336 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw848\" (UniqueName: \"kubernetes.io/projected/18020584-2205-4e9e-a713-79af70a8a84b-kube-api-access-xw848\") pod \"aodh-5b4f-account-create-update-x275r\" (UID: \"18020584-2205-4e9e-a713-79af70a8a84b\") " pod="openstack/aodh-5b4f-account-create-update-x275r" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.808482 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-config-data" (OuterVolumeSpecName: "config-data") pod "2faeaa5e-c3df-4281-a873-8a095bd1293e" (UID: "2faeaa5e-c3df-4281-a873-8a095bd1293e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.816864 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2faeaa5e-c3df-4281-a873-8a095bd1293e" (UID: "2faeaa5e-c3df-4281-a873-8a095bd1293e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.834166 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-rl959" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.838450 4702 scope.go:117] "RemoveContainer" containerID="45e831ea310c527cf6dc84884258110eadeccb2b6f385d72d878fcabf15aaedd" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.840206 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.840255 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2faeaa5e-c3df-4281-a873-8a095bd1293e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.855108 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-5b4f-account-create-update-x275r" Dec 03 11:32:34 crc kubenswrapper[4702]: I1203 11:32:34.915499 4702 scope.go:117] "RemoveContainer" containerID="6fa39c50232f6a83d985d448358afdcc6a6af19ff7261017e61a5d61edf6dfd5" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.313368 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.315639 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.328451 4702 scope.go:117] "RemoveContainer" containerID="856af2ba2a0f35a75554801b391ad115f72e2d4802fbed67d6b2dfbef7e1bf49" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.328645 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:32:35 crc kubenswrapper[4702]: E1203 11:32:35.329366 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2faeaa5e-c3df-4281-a873-8a095bd1293e" containerName="proxy-httpd" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.329390 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2faeaa5e-c3df-4281-a873-8a095bd1293e" containerName="proxy-httpd" Dec 03 11:32:35 crc kubenswrapper[4702]: E1203 11:32:35.329418 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2faeaa5e-c3df-4281-a873-8a095bd1293e" containerName="ceilometer-notification-agent" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.329427 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2faeaa5e-c3df-4281-a873-8a095bd1293e" containerName="ceilometer-notification-agent" Dec 03 11:32:35 crc kubenswrapper[4702]: E1203 11:32:35.329478 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2faeaa5e-c3df-4281-a873-8a095bd1293e" containerName="sg-core" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.329489 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2faeaa5e-c3df-4281-a873-8a095bd1293e" containerName="sg-core" Dec 03 11:32:35 crc kubenswrapper[4702]: E1203 11:32:35.329508 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2faeaa5e-c3df-4281-a873-8a095bd1293e" containerName="ceilometer-central-agent" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.329516 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2faeaa5e-c3df-4281-a873-8a095bd1293e" containerName="ceilometer-central-agent" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.329865 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="2faeaa5e-c3df-4281-a873-8a095bd1293e" containerName="sg-core" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.329894 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="2faeaa5e-c3df-4281-a873-8a095bd1293e" containerName="ceilometer-notification-agent" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.329906 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="2faeaa5e-c3df-4281-a873-8a095bd1293e" containerName="ceilometer-central-agent" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.329933 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="2faeaa5e-c3df-4281-a873-8a095bd1293e" containerName="proxy-httpd" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.332413 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.335960 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.336165 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.358803 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " pod="openstack/ceilometer-0" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.358852 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-scripts\") pod \"ceilometer-0\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " pod="openstack/ceilometer-0" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.358896 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fefa4548-587a-4fd5-b49a-727241aa7c24-run-httpd\") pod \"ceilometer-0\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " pod="openstack/ceilometer-0" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.358934 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-config-data\") pod \"ceilometer-0\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " pod="openstack/ceilometer-0" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.359093 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvh8c\" (UniqueName: \"kubernetes.io/projected/fefa4548-587a-4fd5-b49a-727241aa7c24-kube-api-access-lvh8c\") pod \"ceilometer-0\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " pod="openstack/ceilometer-0" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.359146 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fefa4548-587a-4fd5-b49a-727241aa7c24-log-httpd\") pod \"ceilometer-0\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " pod="openstack/ceilometer-0" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.359188 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " pod="openstack/ceilometer-0" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.435546 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.461229 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvh8c\" (UniqueName: \"kubernetes.io/projected/fefa4548-587a-4fd5-b49a-727241aa7c24-kube-api-access-lvh8c\") pod \"ceilometer-0\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " pod="openstack/ceilometer-0" Dec 03 11:32:35 crc kubenswrapper[4702]: 
I1203 11:32:35.461486 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fefa4548-587a-4fd5-b49a-727241aa7c24-log-httpd\") pod \"ceilometer-0\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " pod="openstack/ceilometer-0" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.461664 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " pod="openstack/ceilometer-0" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.461908 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " pod="openstack/ceilometer-0" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.462017 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-scripts\") pod \"ceilometer-0\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " pod="openstack/ceilometer-0" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.462124 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fefa4548-587a-4fd5-b49a-727241aa7c24-run-httpd\") pod \"ceilometer-0\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " pod="openstack/ceilometer-0" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.462237 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-config-data\") pod \"ceilometer-0\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " pod="openstack/ceilometer-0" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.473338 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fefa4548-587a-4fd5-b49a-727241aa7c24-log-httpd\") pod \"ceilometer-0\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " pod="openstack/ceilometer-0" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.478444 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " pod="openstack/ceilometer-0" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.478590 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " pod="openstack/ceilometer-0" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.479027 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-scripts\") pod \"ceilometer-0\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " pod="openstack/ceilometer-0" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.479403 4702 scope.go:117] "RemoveContainer" 
containerID="549315d90fdd325154c8fe03c4533c06e22dcf6947f50114681c71deb56d3a83" Dec 03 11:32:35 crc kubenswrapper[4702]: E1203 11:32:35.480370 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"549315d90fdd325154c8fe03c4533c06e22dcf6947f50114681c71deb56d3a83\": container with ID starting with 549315d90fdd325154c8fe03c4533c06e22dcf6947f50114681c71deb56d3a83 not found: ID does not exist" containerID="549315d90fdd325154c8fe03c4533c06e22dcf6947f50114681c71deb56d3a83" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.480419 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"549315d90fdd325154c8fe03c4533c06e22dcf6947f50114681c71deb56d3a83"} err="failed to get container status \"549315d90fdd325154c8fe03c4533c06e22dcf6947f50114681c71deb56d3a83\": rpc error: code = NotFound desc = could not find container \"549315d90fdd325154c8fe03c4533c06e22dcf6947f50114681c71deb56d3a83\": container with ID starting with 549315d90fdd325154c8fe03c4533c06e22dcf6947f50114681c71deb56d3a83 not found: ID does not exist" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.480457 4702 scope.go:117] "RemoveContainer" containerID="45e831ea310c527cf6dc84884258110eadeccb2b6f385d72d878fcabf15aaedd" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.481048 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fefa4548-587a-4fd5-b49a-727241aa7c24-run-httpd\") pod \"ceilometer-0\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " pod="openstack/ceilometer-0" Dec 03 11:32:35 crc kubenswrapper[4702]: E1203 11:32:35.482898 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45e831ea310c527cf6dc84884258110eadeccb2b6f385d72d878fcabf15aaedd\": container with ID starting with 45e831ea310c527cf6dc84884258110eadeccb2b6f385d72d878fcabf15aaedd not found: ID does not exist" containerID="45e831ea310c527cf6dc84884258110eadeccb2b6f385d72d878fcabf15aaedd" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.482935 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45e831ea310c527cf6dc84884258110eadeccb2b6f385d72d878fcabf15aaedd"} err="failed to get container status \"45e831ea310c527cf6dc84884258110eadeccb2b6f385d72d878fcabf15aaedd\": rpc error: code = NotFound desc = could not find container \"45e831ea310c527cf6dc84884258110eadeccb2b6f385d72d878fcabf15aaedd\": container with ID starting with 45e831ea310c527cf6dc84884258110eadeccb2b6f385d72d878fcabf15aaedd not found: ID does not exist" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.482958 4702 scope.go:117] "RemoveContainer" containerID="6fa39c50232f6a83d985d448358afdcc6a6af19ff7261017e61a5d61edf6dfd5" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.483719 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-config-data\") pod \"ceilometer-0\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " pod="openstack/ceilometer-0" Dec 03 11:32:35 crc kubenswrapper[4702]: E1203 11:32:35.489850 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fa39c50232f6a83d985d448358afdcc6a6af19ff7261017e61a5d61edf6dfd5\": container with ID starting with 
6fa39c50232f6a83d985d448358afdcc6a6af19ff7261017e61a5d61edf6dfd5 not found: ID does not exist" containerID="6fa39c50232f6a83d985d448358afdcc6a6af19ff7261017e61a5d61edf6dfd5" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.489909 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fa39c50232f6a83d985d448358afdcc6a6af19ff7261017e61a5d61edf6dfd5"} err="failed to get container status \"6fa39c50232f6a83d985d448358afdcc6a6af19ff7261017e61a5d61edf6dfd5\": rpc error: code = NotFound desc = could not find container \"6fa39c50232f6a83d985d448358afdcc6a6af19ff7261017e61a5d61edf6dfd5\": container with ID starting with 6fa39c50232f6a83d985d448358afdcc6a6af19ff7261017e61a5d61edf6dfd5 not found: ID does not exist" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.489941 4702 scope.go:117] "RemoveContainer" containerID="856af2ba2a0f35a75554801b391ad115f72e2d4802fbed67d6b2dfbef7e1bf49" Dec 03 11:32:35 crc kubenswrapper[4702]: E1203 11:32:35.492161 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"856af2ba2a0f35a75554801b391ad115f72e2d4802fbed67d6b2dfbef7e1bf49\": container with ID starting with 856af2ba2a0f35a75554801b391ad115f72e2d4802fbed67d6b2dfbef7e1bf49 not found: ID does not exist" containerID="856af2ba2a0f35a75554801b391ad115f72e2d4802fbed67d6b2dfbef7e1bf49" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.492224 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"856af2ba2a0f35a75554801b391ad115f72e2d4802fbed67d6b2dfbef7e1bf49"} err="failed to get container status \"856af2ba2a0f35a75554801b391ad115f72e2d4802fbed67d6b2dfbef7e1bf49\": rpc error: code = NotFound desc = could not find container \"856af2ba2a0f35a75554801b391ad115f72e2d4802fbed67d6b2dfbef7e1bf49\": container with ID starting with 856af2ba2a0f35a75554801b391ad115f72e2d4802fbed67d6b2dfbef7e1bf49 not found: ID does not exist" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.497926 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvh8c\" (UniqueName: \"kubernetes.io/projected/fefa4548-587a-4fd5-b49a-727241aa7c24-kube-api-access-lvh8c\") pod \"ceilometer-0\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " pod="openstack/ceilometer-0" Dec 03 11:32:35 crc kubenswrapper[4702]: I1203 11:32:35.772229 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:32:36 crc kubenswrapper[4702]: I1203 11:32:36.048926 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-5b4f-account-create-update-x275r"] Dec 03 11:32:36 crc kubenswrapper[4702]: I1203 11:32:36.061038 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-rl959"] Dec 03 11:32:36 crc kubenswrapper[4702]: W1203 11:32:36.068566 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20ad57fb_2b09_47ee_9352_989843fd2b29.slice/crio-da61f594acc9ecdc4abd99b2d2d4a6e8b5c23334b30c965f565ed12b0dbea3b9 WatchSource:0}: Error finding container da61f594acc9ecdc4abd99b2d2d4a6e8b5c23334b30c965f565ed12b0dbea3b9: Status 404 returned error can't find the container with id da61f594acc9ecdc4abd99b2d2d4a6e8b5c23334b30c965f565ed12b0dbea3b9 Dec 03 11:32:36 crc kubenswrapper[4702]: I1203 11:32:36.860044 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-5b4f-account-create-update-x275r" event={"ID":"18020584-2205-4e9e-a713-79af70a8a84b","Type":"ContainerStarted","Data":"3b2d6d6c143376a96beaf16742b9ed198193bb858bb3383c6e0613a78072e8e9"} Dec 03 11:32:36 crc kubenswrapper[4702]: I1203 11:32:36.860676 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-5b4f-account-create-update-x275r" event={"ID":"18020584-2205-4e9e-a713-79af70a8a84b","Type":"ContainerStarted","Data":"e8a9d230dd3c15f039c4b34e6acc3fdb67256c4224a2c94c4fcbe6c4dda21a57"} Dec 03 11:32:36 crc kubenswrapper[4702]: I1203 11:32:36.889973 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:32:36 crc kubenswrapper[4702]: I1203 11:32:36.892646 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-rl959" event={"ID":"20ad57fb-2b09-47ee-9352-989843fd2b29","Type":"ContainerStarted","Data":"1d06b30e4b03a4b4198c18c1b3af601dd6906046baed9495a5e36caeb4640ef4"} Dec 03 11:32:36 crc kubenswrapper[4702]: I1203 11:32:36.892698 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-rl959" event={"ID":"20ad57fb-2b09-47ee-9352-989843fd2b29","Type":"ContainerStarted","Data":"da61f594acc9ecdc4abd99b2d2d4a6e8b5c23334b30c965f565ed12b0dbea3b9"} Dec 03 11:32:36 crc kubenswrapper[4702]: I1203 11:32:36.905848 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-5b4f-account-create-update-x275r" podStartSLOduration=2.905823197 podStartE2EDuration="2.905823197s" podCreationTimestamp="2025-12-03 11:32:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:32:36.900395402 +0000 UTC m=+1740.736323866" watchObservedRunningTime="2025-12-03 11:32:36.905823197 +0000 UTC m=+1740.741751661" Dec 03 11:32:36 crc kubenswrapper[4702]: I1203 11:32:36.935094 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-rl959" podStartSLOduration=2.93507029 podStartE2EDuration="2.93507029s" podCreationTimestamp="2025-12-03 11:32:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:32:36.915898404 +0000 UTC m=+1740.751826868" watchObservedRunningTime="2025-12-03 11:32:36.93507029 +0000 UTC m=+1740.770998754" Dec 03 11:32:36 crc kubenswrapper[4702]: I1203 11:32:36.965430 4702 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2faeaa5e-c3df-4281-a873-8a095bd1293e" path="/var/lib/kubelet/pods/2faeaa5e-c3df-4281-a873-8a095bd1293e/volumes" Dec 03 11:32:37 crc kubenswrapper[4702]: I1203 11:32:37.907912 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-5b4f-account-create-update-x275r" event={"ID":"18020584-2205-4e9e-a713-79af70a8a84b","Type":"ContainerDied","Data":"3b2d6d6c143376a96beaf16742b9ed198193bb858bb3383c6e0613a78072e8e9"} Dec 03 11:32:37 crc kubenswrapper[4702]: I1203 11:32:37.908124 4702 generic.go:334] "Generic (PLEG): container finished" podID="18020584-2205-4e9e-a713-79af70a8a84b" containerID="3b2d6d6c143376a96beaf16742b9ed198193bb858bb3383c6e0613a78072e8e9" exitCode=0 Dec 03 11:32:37 crc kubenswrapper[4702]: I1203 11:32:37.912018 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fefa4548-587a-4fd5-b49a-727241aa7c24","Type":"ContainerStarted","Data":"da2cf936030d0d275bb26af563c87689810ee45bb27e315e27cb64f9485a5345"} Dec 03 11:32:37 crc kubenswrapper[4702]: I1203 11:32:37.912070 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fefa4548-587a-4fd5-b49a-727241aa7c24","Type":"ContainerStarted","Data":"7966f66f7bed970227eb6827adefcb630768830d991cc02d7abbec344ac201e0"} Dec 03 11:32:37 crc kubenswrapper[4702]: I1203 11:32:37.914341 4702 generic.go:334] "Generic (PLEG): container finished" podID="20ad57fb-2b09-47ee-9352-989843fd2b29" containerID="1d06b30e4b03a4b4198c18c1b3af601dd6906046baed9495a5e36caeb4640ef4" exitCode=0 Dec 03 11:32:37 crc kubenswrapper[4702]: I1203 11:32:37.914395 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-rl959" event={"ID":"20ad57fb-2b09-47ee-9352-989843fd2b29","Type":"ContainerDied","Data":"1d06b30e4b03a4b4198c18c1b3af601dd6906046baed9495a5e36caeb4640ef4"} Dec 03 11:32:38 crc kubenswrapper[4702]: I1203 11:32:38.948968 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fefa4548-587a-4fd5-b49a-727241aa7c24","Type":"ContainerStarted","Data":"8ce8524b216fff9e24b3ce10dab7a1c01ee118311442090011ae2771ac664f4b"} Dec 03 11:32:40 crc kubenswrapper[4702]: I1203 11:32:39.596080 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-5b4f-account-create-update-x275r" Dec 03 11:32:40 crc kubenswrapper[4702]: I1203 11:32:39.736003 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18020584-2205-4e9e-a713-79af70a8a84b-operator-scripts\") pod \"18020584-2205-4e9e-a713-79af70a8a84b\" (UID: \"18020584-2205-4e9e-a713-79af70a8a84b\") " Dec 03 11:32:40 crc kubenswrapper[4702]: I1203 11:32:39.736161 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xw848\" (UniqueName: \"kubernetes.io/projected/18020584-2205-4e9e-a713-79af70a8a84b-kube-api-access-xw848\") pod \"18020584-2205-4e9e-a713-79af70a8a84b\" (UID: \"18020584-2205-4e9e-a713-79af70a8a84b\") " Dec 03 11:32:40 crc kubenswrapper[4702]: I1203 11:32:39.740487 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18020584-2205-4e9e-a713-79af70a8a84b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "18020584-2205-4e9e-a713-79af70a8a84b" (UID: "18020584-2205-4e9e-a713-79af70a8a84b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:32:40 crc kubenswrapper[4702]: I1203 11:32:39.764103 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18020584-2205-4e9e-a713-79af70a8a84b-kube-api-access-xw848" (OuterVolumeSpecName: "kube-api-access-xw848") pod "18020584-2205-4e9e-a713-79af70a8a84b" (UID: "18020584-2205-4e9e-a713-79af70a8a84b"). InnerVolumeSpecName "kube-api-access-xw848". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:32:40 crc kubenswrapper[4702]: I1203 11:32:39.840210 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18020584-2205-4e9e-a713-79af70a8a84b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:32:40 crc kubenswrapper[4702]: I1203 11:32:39.840284 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xw848\" (UniqueName: \"kubernetes.io/projected/18020584-2205-4e9e-a713-79af70a8a84b-kube-api-access-xw848\") on node \"crc\" DevicePath \"\"" Dec 03 11:32:40 crc kubenswrapper[4702]: I1203 11:32:39.928625 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:32:40 crc kubenswrapper[4702]: E1203 11:32:39.928963 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:32:40 crc kubenswrapper[4702]: I1203 11:32:39.951438 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-5b4f-account-create-update-x275r" event={"ID":"18020584-2205-4e9e-a713-79af70a8a84b","Type":"ContainerDied","Data":"e8a9d230dd3c15f039c4b34e6acc3fdb67256c4224a2c94c4fcbe6c4dda21a57"} Dec 03 11:32:40 crc kubenswrapper[4702]: I1203 11:32:39.951483 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8a9d230dd3c15f039c4b34e6acc3fdb67256c4224a2c94c4fcbe6c4dda21a57" Dec 03 11:32:40 crc kubenswrapper[4702]: I1203 11:32:39.951536 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-5b4f-account-create-update-x275r" Dec 03 11:32:40 crc kubenswrapper[4702]: I1203 11:32:39.954073 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fefa4548-587a-4fd5-b49a-727241aa7c24","Type":"ContainerStarted","Data":"0665a7640d7f6ac94d78200f0d1596a3c99e0d2eee6f6711b57d8d343371c74b"} Dec 03 11:32:40 crc kubenswrapper[4702]: I1203 11:32:40.973023 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fefa4548-587a-4fd5-b49a-727241aa7c24","Type":"ContainerStarted","Data":"dcf6785b83faf408fd535b940b429e547c1305962b06ff48f7d1da6665fca818"} Dec 03 11:32:40 crc kubenswrapper[4702]: I1203 11:32:40.975102 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 11:32:40 crc kubenswrapper[4702]: I1203 11:32:40.977663 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-rl959" event={"ID":"20ad57fb-2b09-47ee-9352-989843fd2b29","Type":"ContainerDied","Data":"da61f594acc9ecdc4abd99b2d2d4a6e8b5c23334b30c965f565ed12b0dbea3b9"} Dec 03 11:32:40 crc kubenswrapper[4702]: I1203 11:32:40.977719 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da61f594acc9ecdc4abd99b2d2d4a6e8b5c23334b30c965f565ed12b0dbea3b9" Dec 03 11:32:41 crc kubenswrapper[4702]: I1203 11:32:41.002137 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.363938205 podStartE2EDuration="6.00211352s" podCreationTimestamp="2025-12-03 11:32:35 +0000 UTC" firstStartedPulling="2025-12-03 11:32:36.906791204 +0000 UTC m=+1740.742719668" lastFinishedPulling="2025-12-03 11:32:40.544966519 +0000 UTC m=+1744.380894983" observedRunningTime="2025-12-03 11:32:40.994630087 +0000 UTC m=+1744.830558561" watchObservedRunningTime="2025-12-03 11:32:41.00211352 +0000 UTC m=+1744.838041984" Dec 03 11:32:41 crc kubenswrapper[4702]: I1203 11:32:41.016557 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-rl959" Dec 03 11:32:41 crc kubenswrapper[4702]: I1203 11:32:41.204469 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20ad57fb-2b09-47ee-9352-989843fd2b29-operator-scripts\") pod \"20ad57fb-2b09-47ee-9352-989843fd2b29\" (UID: \"20ad57fb-2b09-47ee-9352-989843fd2b29\") " Dec 03 11:32:41 crc kubenswrapper[4702]: I1203 11:32:41.204933 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7tlk\" (UniqueName: \"kubernetes.io/projected/20ad57fb-2b09-47ee-9352-989843fd2b29-kube-api-access-d7tlk\") pod \"20ad57fb-2b09-47ee-9352-989843fd2b29\" (UID: \"20ad57fb-2b09-47ee-9352-989843fd2b29\") " Dec 03 11:32:41 crc kubenswrapper[4702]: I1203 11:32:41.205173 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20ad57fb-2b09-47ee-9352-989843fd2b29-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "20ad57fb-2b09-47ee-9352-989843fd2b29" (UID: "20ad57fb-2b09-47ee-9352-989843fd2b29"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:32:41 crc kubenswrapper[4702]: I1203 11:32:41.205630 4702 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20ad57fb-2b09-47ee-9352-989843fd2b29-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:32:41 crc kubenswrapper[4702]: I1203 11:32:41.212023 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ad57fb-2b09-47ee-9352-989843fd2b29-kube-api-access-d7tlk" (OuterVolumeSpecName: "kube-api-access-d7tlk") pod "20ad57fb-2b09-47ee-9352-989843fd2b29" (UID: "20ad57fb-2b09-47ee-9352-989843fd2b29"). InnerVolumeSpecName "kube-api-access-d7tlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:32:41 crc kubenswrapper[4702]: I1203 11:32:41.308333 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7tlk\" (UniqueName: \"kubernetes.io/projected/20ad57fb-2b09-47ee-9352-989843fd2b29-kube-api-access-d7tlk\") on node \"crc\" DevicePath \"\"" Dec 03 11:32:41 crc kubenswrapper[4702]: I1203 11:32:41.990616 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-rl959" Dec 03 11:32:44 crc kubenswrapper[4702]: I1203 11:32:44.036528 4702 generic.go:334] "Generic (PLEG): container finished" podID="0b22aedf-6076-4262-9607-2b26e09f77a0" containerID="79e75be8f30d2d00ac399bc3dde03ba15783529297fb25d86097e04ecd0f91d4" exitCode=0 Dec 03 11:32:44 crc kubenswrapper[4702]: I1203 11:32:44.036907 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pn6bt" event={"ID":"0b22aedf-6076-4262-9607-2b26e09f77a0","Type":"ContainerDied","Data":"79e75be8f30d2d00ac399bc3dde03ba15783529297fb25d86097e04ecd0f91d4"} Dec 03 11:32:44 crc kubenswrapper[4702]: I1203 11:32:44.910123 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-7k9st"] Dec 03 11:32:44 crc kubenswrapper[4702]: E1203 11:32:44.910784 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ad57fb-2b09-47ee-9352-989843fd2b29" containerName="mariadb-database-create" Dec 03 11:32:44 crc kubenswrapper[4702]: I1203 11:32:44.910805 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ad57fb-2b09-47ee-9352-989843fd2b29" containerName="mariadb-database-create" Dec 03 11:32:44 crc kubenswrapper[4702]: E1203 11:32:44.910828 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18020584-2205-4e9e-a713-79af70a8a84b" containerName="mariadb-account-create-update" Dec 03 11:32:44 crc kubenswrapper[4702]: I1203 11:32:44.910837 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="18020584-2205-4e9e-a713-79af70a8a84b" containerName="mariadb-account-create-update" Dec 03 11:32:44 crc kubenswrapper[4702]: I1203 11:32:44.911169 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="18020584-2205-4e9e-a713-79af70a8a84b" containerName="mariadb-account-create-update" Dec 03 11:32:44 crc kubenswrapper[4702]: I1203 11:32:44.911198 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ad57fb-2b09-47ee-9352-989843fd2b29" containerName="mariadb-database-create" Dec 03 11:32:44 crc kubenswrapper[4702]: I1203 11:32:44.912203 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-7k9st" Dec 03 11:32:44 crc kubenswrapper[4702]: I1203 11:32:44.917436 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 03 11:32:44 crc kubenswrapper[4702]: I1203 11:32:44.917473 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-swrfb" Dec 03 11:32:44 crc kubenswrapper[4702]: I1203 11:32:44.917683 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 03 11:32:44 crc kubenswrapper[4702]: I1203 11:32:44.922307 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 11:32:44 crc kubenswrapper[4702]: I1203 11:32:44.958484 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-7k9st"] Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.072380 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-config-data\") pod \"aodh-db-sync-7k9st\" (UID: \"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd\") " pod="openstack/aodh-db-sync-7k9st" Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.072960 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-scripts\") pod \"aodh-db-sync-7k9st\" (UID: \"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd\") " pod="openstack/aodh-db-sync-7k9st" Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.073118 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjgtl\" (UniqueName: \"kubernetes.io/projected/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-kube-api-access-jjgtl\") pod \"aodh-db-sync-7k9st\" (UID: \"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd\") " pod="openstack/aodh-db-sync-7k9st" Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.073484 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-combined-ca-bundle\") pod \"aodh-db-sync-7k9st\" (UID: \"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd\") " pod="openstack/aodh-db-sync-7k9st" Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.175619 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-config-data\") pod \"aodh-db-sync-7k9st\" (UID: \"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd\") " pod="openstack/aodh-db-sync-7k9st" Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.176079 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-scripts\") pod \"aodh-db-sync-7k9st\" (UID: \"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd\") " pod="openstack/aodh-db-sync-7k9st" Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.176128 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjgtl\" (UniqueName: \"kubernetes.io/projected/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-kube-api-access-jjgtl\") pod \"aodh-db-sync-7k9st\" (UID: \"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd\") " pod="openstack/aodh-db-sync-7k9st" Dec 03 11:32:45 crc kubenswrapper[4702]: 
I1203 11:32:45.176183 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-combined-ca-bundle\") pod \"aodh-db-sync-7k9st\" (UID: \"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd\") " pod="openstack/aodh-db-sync-7k9st" Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.193966 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-scripts\") pod \"aodh-db-sync-7k9st\" (UID: \"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd\") " pod="openstack/aodh-db-sync-7k9st" Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.203712 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-combined-ca-bundle\") pod \"aodh-db-sync-7k9st\" (UID: \"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd\") " pod="openstack/aodh-db-sync-7k9st" Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.204954 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-config-data\") pod \"aodh-db-sync-7k9st\" (UID: \"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd\") " pod="openstack/aodh-db-sync-7k9st" Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.207600 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjgtl\" (UniqueName: \"kubernetes.io/projected/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-kube-api-access-jjgtl\") pod \"aodh-db-sync-7k9st\" (UID: \"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd\") " pod="openstack/aodh-db-sync-7k9st" Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.248307 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-7k9st" Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.753646 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-7k9st"] Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.779036 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pn6bt" Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.798480 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b22aedf-6076-4262-9607-2b26e09f77a0-config-data\") pod \"0b22aedf-6076-4262-9607-2b26e09f77a0\" (UID: \"0b22aedf-6076-4262-9607-2b26e09f77a0\") " Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.798584 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b22aedf-6076-4262-9607-2b26e09f77a0-scripts\") pod \"0b22aedf-6076-4262-9607-2b26e09f77a0\" (UID: \"0b22aedf-6076-4262-9607-2b26e09f77a0\") " Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.798628 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b22aedf-6076-4262-9607-2b26e09f77a0-combined-ca-bundle\") pod \"0b22aedf-6076-4262-9607-2b26e09f77a0\" (UID: \"0b22aedf-6076-4262-9607-2b26e09f77a0\") " Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.798788 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww4cb\" (UniqueName: \"kubernetes.io/projected/0b22aedf-6076-4262-9607-2b26e09f77a0-kube-api-access-ww4cb\") pod \"0b22aedf-6076-4262-9607-2b26e09f77a0\" (UID: \"0b22aedf-6076-4262-9607-2b26e09f77a0\") " Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.812436 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b22aedf-6076-4262-9607-2b26e09f77a0-kube-api-access-ww4cb" (OuterVolumeSpecName: "kube-api-access-ww4cb") pod "0b22aedf-6076-4262-9607-2b26e09f77a0" (UID: "0b22aedf-6076-4262-9607-2b26e09f77a0"). InnerVolumeSpecName "kube-api-access-ww4cb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.816047 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b22aedf-6076-4262-9607-2b26e09f77a0-scripts" (OuterVolumeSpecName: "scripts") pod "0b22aedf-6076-4262-9607-2b26e09f77a0" (UID: "0b22aedf-6076-4262-9607-2b26e09f77a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.865291 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b22aedf-6076-4262-9607-2b26e09f77a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b22aedf-6076-4262-9607-2b26e09f77a0" (UID: "0b22aedf-6076-4262-9607-2b26e09f77a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.893991 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b22aedf-6076-4262-9607-2b26e09f77a0-config-data" (OuterVolumeSpecName: "config-data") pod "0b22aedf-6076-4262-9607-2b26e09f77a0" (UID: "0b22aedf-6076-4262-9607-2b26e09f77a0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.904490 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b22aedf-6076-4262-9607-2b26e09f77a0-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.904544 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b22aedf-6076-4262-9607-2b26e09f77a0-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.904556 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b22aedf-6076-4262-9607-2b26e09f77a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:32:45 crc kubenswrapper[4702]: I1203 11:32:45.904576 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww4cb\" (UniqueName: \"kubernetes.io/projected/0b22aedf-6076-4262-9607-2b26e09f77a0-kube-api-access-ww4cb\") on node \"crc\" DevicePath \"\"" Dec 03 11:32:46 crc kubenswrapper[4702]: I1203 11:32:46.073204 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-7k9st" event={"ID":"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd","Type":"ContainerStarted","Data":"a662e3b4014ee2a5a969ebdc377f948bdb99901b1115fabe164cb6563b92ba88"} Dec 03 11:32:46 crc kubenswrapper[4702]: I1203 11:32:46.075083 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-pn6bt" event={"ID":"0b22aedf-6076-4262-9607-2b26e09f77a0","Type":"ContainerDied","Data":"b590e125d26729f78258e79dde9ba35cd49d8b78ef36fc4d6fa6fb240c7becd7"} Dec 03 11:32:46 crc kubenswrapper[4702]: I1203 11:32:46.075140 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b590e125d26729f78258e79dde9ba35cd49d8b78ef36fc4d6fa6fb240c7becd7" Dec 03 11:32:46 crc kubenswrapper[4702]: I1203 11:32:46.075222 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-pn6bt" Dec 03 11:32:46 crc kubenswrapper[4702]: I1203 11:32:46.180419 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 11:32:46 crc kubenswrapper[4702]: E1203 11:32:46.180995 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b22aedf-6076-4262-9607-2b26e09f77a0" containerName="nova-cell0-conductor-db-sync" Dec 03 11:32:46 crc kubenswrapper[4702]: I1203 11:32:46.181013 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b22aedf-6076-4262-9607-2b26e09f77a0" containerName="nova-cell0-conductor-db-sync" Dec 03 11:32:46 crc kubenswrapper[4702]: I1203 11:32:46.181248 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b22aedf-6076-4262-9607-2b26e09f77a0" containerName="nova-cell0-conductor-db-sync" Dec 03 11:32:46 crc kubenswrapper[4702]: I1203 11:32:46.182133 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 11:32:46 crc kubenswrapper[4702]: I1203 11:32:46.196445 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 11:32:46 crc kubenswrapper[4702]: I1203 11:32:46.291385 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-tjkpc" Dec 03 11:32:46 crc kubenswrapper[4702]: I1203 11:32:46.291659 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 03 11:32:46 crc kubenswrapper[4702]: I1203 11:32:46.293658 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhxsg\" (UniqueName: \"kubernetes.io/projected/9f7dd418-9620-49e3-8eaf-6aa1d4a1434b-kube-api-access-fhxsg\") pod \"nova-cell0-conductor-0\" (UID: \"9f7dd418-9620-49e3-8eaf-6aa1d4a1434b\") " pod="openstack/nova-cell0-conductor-0" Dec 03 11:32:46 crc kubenswrapper[4702]: I1203 11:32:46.294085 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f7dd418-9620-49e3-8eaf-6aa1d4a1434b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9f7dd418-9620-49e3-8eaf-6aa1d4a1434b\") " pod="openstack/nova-cell0-conductor-0" Dec 03 11:32:46 crc kubenswrapper[4702]: I1203 11:32:46.294311 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f7dd418-9620-49e3-8eaf-6aa1d4a1434b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9f7dd418-9620-49e3-8eaf-6aa1d4a1434b\") " pod="openstack/nova-cell0-conductor-0" Dec 03 11:32:46 crc kubenswrapper[4702]: I1203 11:32:46.397309 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f7dd418-9620-49e3-8eaf-6aa1d4a1434b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9f7dd418-9620-49e3-8eaf-6aa1d4a1434b\") " pod="openstack/nova-cell0-conductor-0" Dec 03 11:32:46 crc kubenswrapper[4702]: I1203 11:32:46.397448 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhxsg\" (UniqueName: \"kubernetes.io/projected/9f7dd418-9620-49e3-8eaf-6aa1d4a1434b-kube-api-access-fhxsg\") pod \"nova-cell0-conductor-0\" (UID: \"9f7dd418-9620-49e3-8eaf-6aa1d4a1434b\") " pod="openstack/nova-cell0-conductor-0" Dec 03 11:32:46 crc kubenswrapper[4702]: I1203 11:32:46.397624 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f7dd418-9620-49e3-8eaf-6aa1d4a1434b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9f7dd418-9620-49e3-8eaf-6aa1d4a1434b\") " pod="openstack/nova-cell0-conductor-0" Dec 03 11:32:46 crc kubenswrapper[4702]: I1203 11:32:46.402977 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f7dd418-9620-49e3-8eaf-6aa1d4a1434b-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9f7dd418-9620-49e3-8eaf-6aa1d4a1434b\") " pod="openstack/nova-cell0-conductor-0" Dec 03 11:32:46 crc kubenswrapper[4702]: I1203 11:32:46.403848 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f7dd418-9620-49e3-8eaf-6aa1d4a1434b-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"9f7dd418-9620-49e3-8eaf-6aa1d4a1434b\") " pod="openstack/nova-cell0-conductor-0" Dec 03 11:32:46 crc kubenswrapper[4702]: I1203 11:32:46.419875 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhxsg\" (UniqueName: \"kubernetes.io/projected/9f7dd418-9620-49e3-8eaf-6aa1d4a1434b-kube-api-access-fhxsg\") pod \"nova-cell0-conductor-0\" (UID: \"9f7dd418-9620-49e3-8eaf-6aa1d4a1434b\") " pod="openstack/nova-cell0-conductor-0" Dec 03 11:32:46 crc kubenswrapper[4702]: I1203 11:32:46.683520 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 11:32:47 crc kubenswrapper[4702]: I1203 11:32:47.261240 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 11:32:48 crc kubenswrapper[4702]: I1203 11:32:48.125220 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9f7dd418-9620-49e3-8eaf-6aa1d4a1434b","Type":"ContainerStarted","Data":"67b021a213f6bf0b8e625c40ce960f884cba82a3005dd6660b0cac3ae3a84759"} Dec 03 11:32:48 crc kubenswrapper[4702]: I1203 11:32:48.125731 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 03 11:32:48 crc kubenswrapper[4702]: I1203 11:32:48.125744 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9f7dd418-9620-49e3-8eaf-6aa1d4a1434b","Type":"ContainerStarted","Data":"c3c0cf449b9c3a93122931202d871228aadcc69c66c58dd55b11c2bead663ace"} Dec 03 11:32:48 crc kubenswrapper[4702]: I1203 11:32:48.148086 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.148064696 podStartE2EDuration="2.148064696s" podCreationTimestamp="2025-12-03 11:32:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:32:48.145587535 +0000 UTC m=+1751.981515999" watchObservedRunningTime="2025-12-03 11:32:48.148064696 +0000 UTC m=+1751.983993160" Dec 03 11:32:53 crc kubenswrapper[4702]: I1203 11:32:53.928890 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:32:53 crc kubenswrapper[4702]: E1203 11:32:53.929666 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:32:55 crc kubenswrapper[4702]: I1203 11:32:55.631883 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-7k9st" event={"ID":"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd","Type":"ContainerStarted","Data":"51de0a68602ee9718d493d4f0bcaae5a4171a08de5ab53b9bd4ce9779ff4c4e3"} Dec 03 11:32:55 crc kubenswrapper[4702]: I1203 11:32:55.662214 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-7k9st" podStartSLOduration=2.393342738 podStartE2EDuration="11.662183908s" podCreationTimestamp="2025-12-03 11:32:44 +0000 UTC" firstStartedPulling="2025-12-03 11:32:45.722210552 +0000 UTC m=+1749.558139016" lastFinishedPulling="2025-12-03 
11:32:54.991051722 +0000 UTC m=+1758.826980186" observedRunningTime="2025-12-03 11:32:55.650226967 +0000 UTC m=+1759.486155421" watchObservedRunningTime="2025-12-03 11:32:55.662183908 +0000 UTC m=+1759.498112382" Dec 03 11:32:56 crc kubenswrapper[4702]: I1203 11:32:56.719048 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.319219 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-dq5nh"] Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.321825 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dq5nh" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.332159 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.333169 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.336016 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-dq5nh"] Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.495285 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-config-data\") pod \"nova-cell0-cell-mapping-dq5nh\" (UID: \"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f\") " pod="openstack/nova-cell0-cell-mapping-dq5nh" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.495423 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-scripts\") pod \"nova-cell0-cell-mapping-dq5nh\" (UID: \"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f\") " pod="openstack/nova-cell0-cell-mapping-dq5nh" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.495496 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh8rr\" (UniqueName: \"kubernetes.io/projected/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-kube-api-access-lh8rr\") pod \"nova-cell0-cell-mapping-dq5nh\" (UID: \"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f\") " pod="openstack/nova-cell0-cell-mapping-dq5nh" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.495571 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dq5nh\" (UID: \"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f\") " pod="openstack/nova-cell0-cell-mapping-dq5nh" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.598953 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-config-data\") pod \"nova-cell0-cell-mapping-dq5nh\" (UID: \"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f\") " pod="openstack/nova-cell0-cell-mapping-dq5nh" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.599119 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-scripts\") pod \"nova-cell0-cell-mapping-dq5nh\" (UID: 
\"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f\") " pod="openstack/nova-cell0-cell-mapping-dq5nh" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.599206 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh8rr\" (UniqueName: \"kubernetes.io/projected/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-kube-api-access-lh8rr\") pod \"nova-cell0-cell-mapping-dq5nh\" (UID: \"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f\") " pod="openstack/nova-cell0-cell-mapping-dq5nh" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.599315 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dq5nh\" (UID: \"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f\") " pod="openstack/nova-cell0-cell-mapping-dq5nh" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.616016 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-dq5nh\" (UID: \"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f\") " pod="openstack/nova-cell0-cell-mapping-dq5nh" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.616310 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-scripts\") pod \"nova-cell0-cell-mapping-dq5nh\" (UID: \"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f\") " pod="openstack/nova-cell0-cell-mapping-dq5nh" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.617221 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-config-data\") pod \"nova-cell0-cell-mapping-dq5nh\" (UID: \"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f\") " pod="openstack/nova-cell0-cell-mapping-dq5nh" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.686555 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh8rr\" (UniqueName: \"kubernetes.io/projected/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-kube-api-access-lh8rr\") pod \"nova-cell0-cell-mapping-dq5nh\" (UID: \"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f\") " pod="openstack/nova-cell0-cell-mapping-dq5nh" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.710351 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dq5nh" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.766346 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.800260 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.821565 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.825563 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.829648 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.844791 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.880745 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.932859 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07f4cc75-4ffd-490a-914c-fb2b072db23b-logs\") pod \"nova-api-0\" (UID: \"07f4cc75-4ffd-490a-914c-fb2b072db23b\") " pod="openstack/nova-api-0" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.933040 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn5h6\" (UniqueName: \"kubernetes.io/projected/07f4cc75-4ffd-490a-914c-fb2b072db23b-kube-api-access-nn5h6\") pod \"nova-api-0\" (UID: \"07f4cc75-4ffd-490a-914c-fb2b072db23b\") " pod="openstack/nova-api-0" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.933169 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f4cc75-4ffd-490a-914c-fb2b072db23b-config-data\") pod \"nova-api-0\" (UID: \"07f4cc75-4ffd-490a-914c-fb2b072db23b\") " pod="openstack/nova-api-0" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.933308 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f4cc75-4ffd-490a-914c-fb2b072db23b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"07f4cc75-4ffd-490a-914c-fb2b072db23b\") " pod="openstack/nova-api-0" Dec 03 11:32:57 crc kubenswrapper[4702]: I1203 11:32:57.982291 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.035676 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07f4cc75-4ffd-490a-914c-fb2b072db23b-logs\") pod \"nova-api-0\" (UID: \"07f4cc75-4ffd-490a-914c-fb2b072db23b\") " pod="openstack/nova-api-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.035805 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea\") " pod="openstack/nova-scheduler-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.035944 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn5h6\" (UniqueName: \"kubernetes.io/projected/07f4cc75-4ffd-490a-914c-fb2b072db23b-kube-api-access-nn5h6\") pod \"nova-api-0\" (UID: \"07f4cc75-4ffd-490a-914c-fb2b072db23b\") " pod="openstack/nova-api-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.036210 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f4cc75-4ffd-490a-914c-fb2b072db23b-config-data\") pod \"nova-api-0\" (UID: \"07f4cc75-4ffd-490a-914c-fb2b072db23b\") " pod="openstack/nova-api-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.036387 4702 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea-config-data\") pod \"nova-scheduler-0\" (UID: \"5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea\") " pod="openstack/nova-scheduler-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.036417 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khhpn\" (UniqueName: \"kubernetes.io/projected/5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea-kube-api-access-khhpn\") pod \"nova-scheduler-0\" (UID: \"5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea\") " pod="openstack/nova-scheduler-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.036612 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f4cc75-4ffd-490a-914c-fb2b072db23b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"07f4cc75-4ffd-490a-914c-fb2b072db23b\") " pod="openstack/nova-api-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.037931 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07f4cc75-4ffd-490a-914c-fb2b072db23b-logs\") pod \"nova-api-0\" (UID: \"07f4cc75-4ffd-490a-914c-fb2b072db23b\") " pod="openstack/nova-api-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.053285 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f4cc75-4ffd-490a-914c-fb2b072db23b-config-data\") pod \"nova-api-0\" (UID: \"07f4cc75-4ffd-490a-914c-fb2b072db23b\") " pod="openstack/nova-api-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.066457 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f4cc75-4ffd-490a-914c-fb2b072db23b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"07f4cc75-4ffd-490a-914c-fb2b072db23b\") " pod="openstack/nova-api-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.106842 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn5h6\" (UniqueName: \"kubernetes.io/projected/07f4cc75-4ffd-490a-914c-fb2b072db23b-kube-api-access-nn5h6\") pod \"nova-api-0\" (UID: \"07f4cc75-4ffd-490a-914c-fb2b072db23b\") " pod="openstack/nova-api-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.140852 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea-config-data\") pod \"nova-scheduler-0\" (UID: \"5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea\") " pod="openstack/nova-scheduler-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.141505 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khhpn\" (UniqueName: \"kubernetes.io/projected/5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea-kube-api-access-khhpn\") pod \"nova-scheduler-0\" (UID: \"5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea\") " pod="openstack/nova-scheduler-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.141785 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea\") " pod="openstack/nova-scheduler-0" Dec 03 11:32:58 crc kubenswrapper[4702]: 
I1203 11:32:58.152270 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea-config-data\") pod \"nova-scheduler-0\" (UID: \"5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea\") " pod="openstack/nova-scheduler-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.153127 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea\") " pod="openstack/nova-scheduler-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.187265 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khhpn\" (UniqueName: \"kubernetes.io/projected/5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea-kube-api-access-khhpn\") pod \"nova-scheduler-0\" (UID: \"5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea\") " pod="openstack/nova-scheduler-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.212668 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.218846 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.233237 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.239599 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.349950 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.461042 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/069495e4-97b2-4505-879a-fdd8abd30cbc-config-data\") pod \"nova-metadata-0\" (UID: \"069495e4-97b2-4505-879a-fdd8abd30cbc\") " pod="openstack/nova-metadata-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.462488 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/069495e4-97b2-4505-879a-fdd8abd30cbc-logs\") pod \"nova-metadata-0\" (UID: \"069495e4-97b2-4505-879a-fdd8abd30cbc\") " pod="openstack/nova-metadata-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.464440 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069495e4-97b2-4505-879a-fdd8abd30cbc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"069495e4-97b2-4505-879a-fdd8abd30cbc\") " pod="openstack/nova-metadata-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.464911 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tdmh\" (UniqueName: \"kubernetes.io/projected/069495e4-97b2-4505-879a-fdd8abd30cbc-kube-api-access-7tdmh\") pod \"nova-metadata-0\" (UID: \"069495e4-97b2-4505-879a-fdd8abd30cbc\") " pod="openstack/nova-metadata-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.546331 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.621110 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.623224 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.636977 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.654724 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/069495e4-97b2-4505-879a-fdd8abd30cbc-config-data\") pod \"nova-metadata-0\" (UID: \"069495e4-97b2-4505-879a-fdd8abd30cbc\") " pod="openstack/nova-metadata-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.655087 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/069495e4-97b2-4505-879a-fdd8abd30cbc-logs\") pod \"nova-metadata-0\" (UID: \"069495e4-97b2-4505-879a-fdd8abd30cbc\") " pod="openstack/nova-metadata-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.655352 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069495e4-97b2-4505-879a-fdd8abd30cbc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"069495e4-97b2-4505-879a-fdd8abd30cbc\") " pod="openstack/nova-metadata-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.655410 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tdmh\" (UniqueName: \"kubernetes.io/projected/069495e4-97b2-4505-879a-fdd8abd30cbc-kube-api-access-7tdmh\") pod \"nova-metadata-0\" (UID: \"069495e4-97b2-4505-879a-fdd8abd30cbc\") " pod="openstack/nova-metadata-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.656450 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/069495e4-97b2-4505-879a-fdd8abd30cbc-logs\") pod \"nova-metadata-0\" (UID: \"069495e4-97b2-4505-879a-fdd8abd30cbc\") " pod="openstack/nova-metadata-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.672976 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/069495e4-97b2-4505-879a-fdd8abd30cbc-config-data\") pod \"nova-metadata-0\" (UID: \"069495e4-97b2-4505-879a-fdd8abd30cbc\") " pod="openstack/nova-metadata-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.674473 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069495e4-97b2-4505-879a-fdd8abd30cbc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"069495e4-97b2-4505-879a-fdd8abd30cbc\") " pod="openstack/nova-metadata-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.681477 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tdmh\" (UniqueName: \"kubernetes.io/projected/069495e4-97b2-4505-879a-fdd8abd30cbc-kube-api-access-7tdmh\") pod \"nova-metadata-0\" (UID: \"069495e4-97b2-4505-879a-fdd8abd30cbc\") " pod="openstack/nova-metadata-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.692530 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.783273 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c35933-cc2c-459b-a47f-90397aa62811-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9c35933-cc2c-459b-a47f-90397aa62811\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.783630 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np2fl\" (UniqueName: \"kubernetes.io/projected/f9c35933-cc2c-459b-a47f-90397aa62811-kube-api-access-np2fl\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9c35933-cc2c-459b-a47f-90397aa62811\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.783846 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c35933-cc2c-459b-a47f-90397aa62811-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9c35933-cc2c-459b-a47f-90397aa62811\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.784447 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-4gpg6"] Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.839287 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-4gpg6"] Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.839418 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.852272 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.889976 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np2fl\" (UniqueName: \"kubernetes.io/projected/f9c35933-cc2c-459b-a47f-90397aa62811-kube-api-access-np2fl\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9c35933-cc2c-459b-a47f-90397aa62811\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.890225 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c35933-cc2c-459b-a47f-90397aa62811-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9c35933-cc2c-459b-a47f-90397aa62811\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.890375 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c35933-cc2c-459b-a47f-90397aa62811-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9c35933-cc2c-459b-a47f-90397aa62811\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.903151 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c35933-cc2c-459b-a47f-90397aa62811-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9c35933-cc2c-459b-a47f-90397aa62811\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.909843 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c35933-cc2c-459b-a47f-90397aa62811-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9c35933-cc2c-459b-a47f-90397aa62811\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:32:58 crc kubenswrapper[4702]: I1203 11:32:58.940629 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np2fl\" (UniqueName: \"kubernetes.io/projected/f9c35933-cc2c-459b-a47f-90397aa62811-kube-api-access-np2fl\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9c35933-cc2c-459b-a47f-90397aa62811\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.005790 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-dns-svc\") pod \"dnsmasq-dns-9b86998b5-4gpg6\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.006100 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-4gpg6\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.006151 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd4zs\" (UniqueName: \"kubernetes.io/projected/45dcdd57-8d89-4094-a217-c6c58eeaba18-kube-api-access-vd4zs\") pod \"dnsmasq-dns-9b86998b5-4gpg6\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 
11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.006377 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-config\") pod \"dnsmasq-dns-9b86998b5-4gpg6\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.006406 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-4gpg6\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.006552 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-4gpg6\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.087447 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-dq5nh"] Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.163288 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-4gpg6\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.163519 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-dns-svc\") pod \"dnsmasq-dns-9b86998b5-4gpg6\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.163602 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-4gpg6\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.163624 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd4zs\" (UniqueName: \"kubernetes.io/projected/45dcdd57-8d89-4094-a217-c6c58eeaba18-kube-api-access-vd4zs\") pod \"dnsmasq-dns-9b86998b5-4gpg6\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.163730 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-config\") pod \"dnsmasq-dns-9b86998b5-4gpg6\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.163776 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-4gpg6\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.166346 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-config\") pod \"dnsmasq-dns-9b86998b5-4gpg6\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.166373 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-4gpg6\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.166910 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-4gpg6\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.167092 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-4gpg6\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.167205 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-dns-svc\") pod \"dnsmasq-dns-9b86998b5-4gpg6\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.189939 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.191365 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd4zs\" (UniqueName: \"kubernetes.io/projected/45dcdd57-8d89-4094-a217-c6c58eeaba18-kube-api-access-vd4zs\") pod \"dnsmasq-dns-9b86998b5-4gpg6\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.454178 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.455083 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.725148 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.791770 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"07f4cc75-4ffd-490a-914c-fb2b072db23b","Type":"ContainerStarted","Data":"7fcdad95685a95f1b1815d055ffdaf24f08d04184d6fbe4e54fb08f1e4aee9db"} Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.794294 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dq5nh" event={"ID":"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f","Type":"ContainerStarted","Data":"382162c87abd8e28697a3509b316313c53439cda66e41bbf8b631b0f466b2893"} Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.794464 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dq5nh" event={"ID":"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f","Type":"ContainerStarted","Data":"aa71b16534d50113416aea18dd7c68e7637af04fc2fa8d338ede4ffc41e2cfbd"} Dec 03 11:32:59 crc kubenswrapper[4702]: I1203 11:32:59.845978 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-dq5nh" podStartSLOduration=2.845944323 podStartE2EDuration="2.845944323s" podCreationTimestamp="2025-12-03 11:32:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:32:59.835375102 +0000 UTC m=+1763.671303566" watchObservedRunningTime="2025-12-03 11:32:59.845944323 +0000 UTC m=+1763.681872787" Dec 03 11:33:00 crc kubenswrapper[4702]: I1203 11:33:00.067707 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 11:33:00 crc kubenswrapper[4702]: I1203 11:33:00.453516 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 11:33:00 crc kubenswrapper[4702]: I1203 11:33:00.466729 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jzcdc"] Dec 03 11:33:00 crc kubenswrapper[4702]: I1203 11:33:00.469654 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jzcdc" Dec 03 11:33:00 crc kubenswrapper[4702]: I1203 11:33:00.483409 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 11:33:00 crc kubenswrapper[4702]: I1203 11:33:00.483612 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 03 11:33:00 crc kubenswrapper[4702]: I1203 11:33:00.557839 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jzcdc"] Dec 03 11:33:00 crc kubenswrapper[4702]: I1203 11:33:00.670512 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0099aaeb-f11f-492e-8962-e06a72b6878d-config-data\") pod \"nova-cell1-conductor-db-sync-jzcdc\" (UID: \"0099aaeb-f11f-492e-8962-e06a72b6878d\") " pod="openstack/nova-cell1-conductor-db-sync-jzcdc" Dec 03 11:33:00 crc kubenswrapper[4702]: I1203 11:33:00.670735 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrrjh\" (UniqueName: \"kubernetes.io/projected/0099aaeb-f11f-492e-8962-e06a72b6878d-kube-api-access-jrrjh\") pod \"nova-cell1-conductor-db-sync-jzcdc\" (UID: \"0099aaeb-f11f-492e-8962-e06a72b6878d\") " pod="openstack/nova-cell1-conductor-db-sync-jzcdc" Dec 03 11:33:00 crc kubenswrapper[4702]: I1203 11:33:00.670993 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0099aaeb-f11f-492e-8962-e06a72b6878d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jzcdc\" (UID: \"0099aaeb-f11f-492e-8962-e06a72b6878d\") " pod="openstack/nova-cell1-conductor-db-sync-jzcdc" Dec 03 11:33:00 crc kubenswrapper[4702]: I1203 11:33:00.671062 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0099aaeb-f11f-492e-8962-e06a72b6878d-scripts\") pod \"nova-cell1-conductor-db-sync-jzcdc\" (UID: \"0099aaeb-f11f-492e-8962-e06a72b6878d\") " pod="openstack/nova-cell1-conductor-db-sync-jzcdc" Dec 03 11:33:00 crc kubenswrapper[4702]: I1203 11:33:00.749860 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-4gpg6"] Dec 03 11:33:00 crc kubenswrapper[4702]: I1203 11:33:00.776175 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrrjh\" (UniqueName: \"kubernetes.io/projected/0099aaeb-f11f-492e-8962-e06a72b6878d-kube-api-access-jrrjh\") pod \"nova-cell1-conductor-db-sync-jzcdc\" (UID: \"0099aaeb-f11f-492e-8962-e06a72b6878d\") " pod="openstack/nova-cell1-conductor-db-sync-jzcdc" Dec 03 11:33:00 crc kubenswrapper[4702]: I1203 11:33:00.776329 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0099aaeb-f11f-492e-8962-e06a72b6878d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jzcdc\" (UID: \"0099aaeb-f11f-492e-8962-e06a72b6878d\") " pod="openstack/nova-cell1-conductor-db-sync-jzcdc" Dec 03 11:33:00 crc kubenswrapper[4702]: I1203 11:33:00.776373 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0099aaeb-f11f-492e-8962-e06a72b6878d-scripts\") pod \"nova-cell1-conductor-db-sync-jzcdc\" (UID: 
\"0099aaeb-f11f-492e-8962-e06a72b6878d\") " pod="openstack/nova-cell1-conductor-db-sync-jzcdc" Dec 03 11:33:00 crc kubenswrapper[4702]: I1203 11:33:00.776488 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0099aaeb-f11f-492e-8962-e06a72b6878d-config-data\") pod \"nova-cell1-conductor-db-sync-jzcdc\" (UID: \"0099aaeb-f11f-492e-8962-e06a72b6878d\") " pod="openstack/nova-cell1-conductor-db-sync-jzcdc" Dec 03 11:33:00 crc kubenswrapper[4702]: I1203 11:33:00.921078 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0099aaeb-f11f-492e-8962-e06a72b6878d-scripts\") pod \"nova-cell1-conductor-db-sync-jzcdc\" (UID: \"0099aaeb-f11f-492e-8962-e06a72b6878d\") " pod="openstack/nova-cell1-conductor-db-sync-jzcdc" Dec 03 11:33:01 crc kubenswrapper[4702]: I1203 11:33:00.942254 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0099aaeb-f11f-492e-8962-e06a72b6878d-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-jzcdc\" (UID: \"0099aaeb-f11f-492e-8962-e06a72b6878d\") " pod="openstack/nova-cell1-conductor-db-sync-jzcdc" Dec 03 11:33:01 crc kubenswrapper[4702]: I1203 11:33:01.053094 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrrjh\" (UniqueName: \"kubernetes.io/projected/0099aaeb-f11f-492e-8962-e06a72b6878d-kube-api-access-jrrjh\") pod \"nova-cell1-conductor-db-sync-jzcdc\" (UID: \"0099aaeb-f11f-492e-8962-e06a72b6878d\") " pod="openstack/nova-cell1-conductor-db-sync-jzcdc" Dec 03 11:33:01 crc kubenswrapper[4702]: I1203 11:33:01.054903 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0099aaeb-f11f-492e-8962-e06a72b6878d-config-data\") pod \"nova-cell1-conductor-db-sync-jzcdc\" (UID: \"0099aaeb-f11f-492e-8962-e06a72b6878d\") " pod="openstack/nova-cell1-conductor-db-sync-jzcdc" Dec 03 11:33:01 crc kubenswrapper[4702]: I1203 11:33:01.132655 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jzcdc" Dec 03 11:33:01 crc kubenswrapper[4702]: I1203 11:33:01.444655 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f9c35933-cc2c-459b-a47f-90397aa62811","Type":"ContainerStarted","Data":"0346a8993c1a8581c0cff99c353599482bc262307bf9c5588294ea1f8c7a97d2"} Dec 03 11:33:01 crc kubenswrapper[4702]: I1203 11:33:01.463887 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"069495e4-97b2-4505-879a-fdd8abd30cbc","Type":"ContainerStarted","Data":"d930b26fb085de9de6b9863c4b04fdcfbfe3f37a4c0c9b5c6a64d044b830bc5a"} Dec 03 11:33:01 crc kubenswrapper[4702]: I1203 11:33:01.463920 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea","Type":"ContainerStarted","Data":"7601771d58d1344c83e88df97dbf4a639e3cda7e97d2e9ec95d8213faa8a7033"} Dec 03 11:33:01 crc kubenswrapper[4702]: I1203 11:33:01.916705 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jzcdc"] Dec 03 11:33:01 crc kubenswrapper[4702]: W1203 11:33:01.946972 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0099aaeb_f11f_492e_8962_e06a72b6878d.slice/crio-473b65aeabcdea48fcf0f42b0bb20c3bb3dda68ca3828559d66f0d82a71c2bf4 WatchSource:0}: Error finding container 473b65aeabcdea48fcf0f42b0bb20c3bb3dda68ca3828559d66f0d82a71c2bf4: Status 404 returned error can't find the container with id 473b65aeabcdea48fcf0f42b0bb20c3bb3dda68ca3828559d66f0d82a71c2bf4 Dec 03 11:33:02 crc kubenswrapper[4702]: I1203 11:33:02.115605 4702 generic.go:334] "Generic (PLEG): container finished" podID="4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd" containerID="51de0a68602ee9718d493d4f0bcaae5a4171a08de5ab53b9bd4ce9779ff4c4e3" exitCode=0 Dec 03 11:33:02 crc kubenswrapper[4702]: I1203 11:33:02.115981 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-7k9st" event={"ID":"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd","Type":"ContainerDied","Data":"51de0a68602ee9718d493d4f0bcaae5a4171a08de5ab53b9bd4ce9779ff4c4e3"} Dec 03 11:33:02 crc kubenswrapper[4702]: I1203 11:33:02.119777 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jzcdc" event={"ID":"0099aaeb-f11f-492e-8962-e06a72b6878d","Type":"ContainerStarted","Data":"473b65aeabcdea48fcf0f42b0bb20c3bb3dda68ca3828559d66f0d82a71c2bf4"} Dec 03 11:33:02 crc kubenswrapper[4702]: I1203 11:33:02.128605 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" event={"ID":"45dcdd57-8d89-4094-a217-c6c58eeaba18","Type":"ContainerStarted","Data":"3502f437825d4954b2c823b4cc99bac5917ce9e678ef3e0aaff65a4726fff3d4"} Dec 03 11:33:02 crc kubenswrapper[4702]: I1203 11:33:02.128656 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" event={"ID":"45dcdd57-8d89-4094-a217-c6c58eeaba18","Type":"ContainerStarted","Data":"e92fe5918b532cb212bce0662f2e764c14b09c4b4180906870ec1370235ec064"} Dec 03 11:33:03 crc kubenswrapper[4702]: I1203 11:33:03.201693 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jzcdc" event={"ID":"0099aaeb-f11f-492e-8962-e06a72b6878d","Type":"ContainerStarted","Data":"ab56ea1883615d9d6d926896a3e7135ece8877e2eb076527bbf64795372caad9"} Dec 03 11:33:03 crc 
kubenswrapper[4702]: I1203 11:33:03.246262 4702 generic.go:334] "Generic (PLEG): container finished" podID="45dcdd57-8d89-4094-a217-c6c58eeaba18" containerID="3502f437825d4954b2c823b4cc99bac5917ce9e678ef3e0aaff65a4726fff3d4" exitCode=0 Dec 03 11:33:03 crc kubenswrapper[4702]: I1203 11:33:03.248986 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" event={"ID":"45dcdd57-8d89-4094-a217-c6c58eeaba18","Type":"ContainerDied","Data":"3502f437825d4954b2c823b4cc99bac5917ce9e678ef3e0aaff65a4726fff3d4"} Dec 03 11:33:03 crc kubenswrapper[4702]: I1203 11:33:03.249058 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" event={"ID":"45dcdd57-8d89-4094-a217-c6c58eeaba18","Type":"ContainerStarted","Data":"ce2b5a886cfa45212e483b44b721c4d9a21de5eda8e5335141fbd27e4fdb44ab"} Dec 03 11:33:03 crc kubenswrapper[4702]: I1203 11:33:03.249105 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:33:03 crc kubenswrapper[4702]: I1203 11:33:03.266583 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-jzcdc" podStartSLOduration=3.266556691 podStartE2EDuration="3.266556691s" podCreationTimestamp="2025-12-03 11:33:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:33:03.231673107 +0000 UTC m=+1767.067601561" watchObservedRunningTime="2025-12-03 11:33:03.266556691 +0000 UTC m=+1767.102485155" Dec 03 11:33:03 crc kubenswrapper[4702]: I1203 11:33:03.688349 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" podStartSLOduration=5.688323584 podStartE2EDuration="5.688323584s" podCreationTimestamp="2025-12-03 11:32:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:33:03.581253594 +0000 UTC m=+1767.417182058" watchObservedRunningTime="2025-12-03 11:33:03.688323584 +0000 UTC m=+1767.524252048" Dec 03 11:33:04 crc kubenswrapper[4702]: I1203 11:33:04.291272 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 11:33:04 crc kubenswrapper[4702]: I1203 11:33:04.313525 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 11:33:05 crc kubenswrapper[4702]: I1203 11:33:05.783398 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 11:33:07 crc kubenswrapper[4702]: I1203 11:33:07.113932 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-7k9st" Dec 03 11:33:07 crc kubenswrapper[4702]: I1203 11:33:07.307935 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-combined-ca-bundle\") pod \"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd\" (UID: \"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd\") " Dec 03 11:33:07 crc kubenswrapper[4702]: I1203 11:33:07.308131 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjgtl\" (UniqueName: \"kubernetes.io/projected/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-kube-api-access-jjgtl\") pod \"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd\" (UID: \"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd\") " Dec 03 11:33:07 crc kubenswrapper[4702]: I1203 11:33:07.308176 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-config-data\") pod \"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd\" (UID: \"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd\") " Dec 03 11:33:07 crc kubenswrapper[4702]: I1203 11:33:07.308210 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-scripts\") pod \"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd\" (UID: \"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd\") " Dec 03 11:33:07 crc kubenswrapper[4702]: I1203 11:33:07.324966 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-scripts" (OuterVolumeSpecName: "scripts") pod "4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd" (UID: "4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:07 crc kubenswrapper[4702]: I1203 11:33:07.325060 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-kube-api-access-jjgtl" (OuterVolumeSpecName: "kube-api-access-jjgtl") pod "4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd" (UID: "4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd"). InnerVolumeSpecName "kube-api-access-jjgtl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:33:07 crc kubenswrapper[4702]: I1203 11:33:07.341423 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-7k9st" event={"ID":"4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd","Type":"ContainerDied","Data":"a662e3b4014ee2a5a969ebdc377f948bdb99901b1115fabe164cb6563b92ba88"} Dec 03 11:33:07 crc kubenswrapper[4702]: I1203 11:33:07.341740 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a662e3b4014ee2a5a969ebdc377f948bdb99901b1115fabe164cb6563b92ba88" Dec 03 11:33:07 crc kubenswrapper[4702]: I1203 11:33:07.341485 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-7k9st" Dec 03 11:33:07 crc kubenswrapper[4702]: I1203 11:33:07.361010 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd" (UID: "4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:07 crc kubenswrapper[4702]: I1203 11:33:07.388683 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-config-data" (OuterVolumeSpecName: "config-data") pod "4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd" (UID: "4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:07 crc kubenswrapper[4702]: I1203 11:33:07.413976 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:07 crc kubenswrapper[4702]: I1203 11:33:07.414029 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjgtl\" (UniqueName: \"kubernetes.io/projected/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-kube-api-access-jjgtl\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:07 crc kubenswrapper[4702]: I1203 11:33:07.414045 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:07 crc kubenswrapper[4702]: I1203 11:33:07.414056 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:08 crc kubenswrapper[4702]: I1203 11:33:08.940061 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:33:08 crc kubenswrapper[4702]: E1203 11:33:08.949239 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:33:09 crc kubenswrapper[4702]: I1203 11:33:09.420645 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f9c35933-cc2c-459b-a47f-90397aa62811","Type":"ContainerStarted","Data":"701501a6c82d46f881831fd896571879ad22f149e70585221ef2ebb51b3853b6"} Dec 03 11:33:09 crc kubenswrapper[4702]: I1203 11:33:09.421185 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="f9c35933-cc2c-459b-a47f-90397aa62811" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://701501a6c82d46f881831fd896571879ad22f149e70585221ef2ebb51b3853b6" gracePeriod=30 Dec 03 11:33:09 crc kubenswrapper[4702]: I1203 11:33:09.427977 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"069495e4-97b2-4505-879a-fdd8abd30cbc","Type":"ContainerStarted","Data":"0da365b9442e87cd3d7cd686e57795227b57d4aec66b2095b87ac1020907bb99"} Dec 03 11:33:09 crc kubenswrapper[4702]: I1203 11:33:09.428267 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"069495e4-97b2-4505-879a-fdd8abd30cbc","Type":"ContainerStarted","Data":"e8fcfcc254b5fed223ef12ec2aacb3528ef0a03a51a92dfcd2d102bdaf0fa99e"} Dec 03 11:33:09 crc kubenswrapper[4702]: I1203 
11:33:09.428128 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="069495e4-97b2-4505-879a-fdd8abd30cbc" containerName="nova-metadata-log" containerID="cri-o://e8fcfcc254b5fed223ef12ec2aacb3528ef0a03a51a92dfcd2d102bdaf0fa99e" gracePeriod=30 Dec 03 11:33:09 crc kubenswrapper[4702]: I1203 11:33:09.428355 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="069495e4-97b2-4505-879a-fdd8abd30cbc" containerName="nova-metadata-metadata" containerID="cri-o://0da365b9442e87cd3d7cd686e57795227b57d4aec66b2095b87ac1020907bb99" gracePeriod=30 Dec 03 11:33:09 crc kubenswrapper[4702]: I1203 11:33:09.435094 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea","Type":"ContainerStarted","Data":"54d7c739209d80e47ca8ae00add84270be30e240ba6876e04826ed36b2f301ca"} Dec 03 11:33:09 crc kubenswrapper[4702]: I1203 11:33:09.453381 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"07f4cc75-4ffd-490a-914c-fb2b072db23b","Type":"ContainerStarted","Data":"5f29db9c716dada935c837bea16e5aaaa30d3d71750749541740e5782ea6a6c5"} Dec 03 11:33:09 crc kubenswrapper[4702]: I1203 11:33:09.453436 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"07f4cc75-4ffd-490a-914c-fb2b072db23b","Type":"ContainerStarted","Data":"bc1c5202b2f637355bdecdfa983df237a4c6468fe5cfd1f017227c51299a91c3"} Dec 03 11:33:09 crc kubenswrapper[4702]: I1203 11:33:09.457914 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:33:09 crc kubenswrapper[4702]: I1203 11:33:09.483742 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=4.065931814 podStartE2EDuration="11.483711352s" podCreationTimestamp="2025-12-03 11:32:58 +0000 UTC" firstStartedPulling="2025-12-03 11:33:00.380052786 +0000 UTC m=+1764.215981250" lastFinishedPulling="2025-12-03 11:33:07.797832324 +0000 UTC m=+1771.633760788" observedRunningTime="2025-12-03 11:33:09.442243621 +0000 UTC m=+1773.278172095" watchObservedRunningTime="2025-12-03 11:33:09.483711352 +0000 UTC m=+1773.319639826" Dec 03 11:33:09 crc kubenswrapper[4702]: I1203 11:33:09.503798 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=4.621692119 podStartE2EDuration="12.503769793s" podCreationTimestamp="2025-12-03 11:32:57 +0000 UTC" firstStartedPulling="2025-12-03 11:32:59.779016406 +0000 UTC m=+1763.614944870" lastFinishedPulling="2025-12-03 11:33:07.66109408 +0000 UTC m=+1771.497022544" observedRunningTime="2025-12-03 11:33:09.471274358 +0000 UTC m=+1773.307202822" watchObservedRunningTime="2025-12-03 11:33:09.503769793 +0000 UTC m=+1773.339698257" Dec 03 11:33:09 crc kubenswrapper[4702]: I1203 11:33:09.530995 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.924377052 podStartE2EDuration="11.530963848s" podCreationTimestamp="2025-12-03 11:32:58 +0000 UTC" firstStartedPulling="2025-12-03 11:33:00.104421895 +0000 UTC m=+1763.940350349" lastFinishedPulling="2025-12-03 11:33:07.711008681 +0000 UTC m=+1771.546937145" observedRunningTime="2025-12-03 11:33:09.499266995 +0000 UTC m=+1773.335195459" watchObservedRunningTime="2025-12-03 
11:33:09.530963848 +0000 UTC m=+1773.366892312" Dec 03 11:33:09 crc kubenswrapper[4702]: I1203 11:33:09.593356 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.358092531 podStartE2EDuration="12.593332454s" podCreationTimestamp="2025-12-03 11:32:57 +0000 UTC" firstStartedPulling="2025-12-03 11:32:59.575067717 +0000 UTC m=+1763.410996181" lastFinishedPulling="2025-12-03 11:33:07.81030764 +0000 UTC m=+1771.646236104" observedRunningTime="2025-12-03 11:33:09.558905294 +0000 UTC m=+1773.394833778" watchObservedRunningTime="2025-12-03 11:33:09.593332454 +0000 UTC m=+1773.429260918" Dec 03 11:33:09 crc kubenswrapper[4702]: I1203 11:33:09.635279 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-d6rbj"] Dec 03 11:33:09 crc kubenswrapper[4702]: I1203 11:33:09.635651 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" podUID="ab0f582e-799a-44ad-8529-6d0fe71490c2" containerName="dnsmasq-dns" containerID="cri-o://a086ed1a32b499a91115208ed57efcf42a850ef49dc6d9bfced49c9ed309109a" gracePeriod=10 Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.278937 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 03 11:33:10 crc kubenswrapper[4702]: E1203 11:33:10.279986 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd" containerName="aodh-db-sync" Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.280007 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd" containerName="aodh-db-sync" Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.280369 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd" containerName="aodh-db-sync" Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.282976 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.287659 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.288067 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.288138 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-swrfb" Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.480815 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6148de9b-9833-4042-ad7d-5e40b93369ea-config-data\") pod \"aodh-0\" (UID: \"6148de9b-9833-4042-ad7d-5e40b93369ea\") " pod="openstack/aodh-0" Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.481113 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x4xv\" (UniqueName: \"kubernetes.io/projected/6148de9b-9833-4042-ad7d-5e40b93369ea-kube-api-access-6x4xv\") pod \"aodh-0\" (UID: \"6148de9b-9833-4042-ad7d-5e40b93369ea\") " pod="openstack/aodh-0" Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.481161 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6148de9b-9833-4042-ad7d-5e40b93369ea-scripts\") pod \"aodh-0\" (UID: \"6148de9b-9833-4042-ad7d-5e40b93369ea\") " pod="openstack/aodh-0" Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.481299 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6148de9b-9833-4042-ad7d-5e40b93369ea-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6148de9b-9833-4042-ad7d-5e40b93369ea\") " pod="openstack/aodh-0" Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.489684 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.515015 4702 generic.go:334] "Generic (PLEG): container finished" podID="ab0f582e-799a-44ad-8529-6d0fe71490c2" containerID="a086ed1a32b499a91115208ed57efcf42a850ef49dc6d9bfced49c9ed309109a" exitCode=0 Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.515100 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" event={"ID":"ab0f582e-799a-44ad-8529-6d0fe71490c2","Type":"ContainerDied","Data":"a086ed1a32b499a91115208ed57efcf42a850ef49dc6d9bfced49c9ed309109a"} Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.522029 4702 generic.go:334] "Generic (PLEG): container finished" podID="069495e4-97b2-4505-879a-fdd8abd30cbc" containerID="e8fcfcc254b5fed223ef12ec2aacb3528ef0a03a51a92dfcd2d102bdaf0fa99e" exitCode=143 Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.523457 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"069495e4-97b2-4505-879a-fdd8abd30cbc","Type":"ContainerDied","Data":"e8fcfcc254b5fed223ef12ec2aacb3528ef0a03a51a92dfcd2d102bdaf0fa99e"} Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.584178 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x4xv\" (UniqueName: \"kubernetes.io/projected/6148de9b-9833-4042-ad7d-5e40b93369ea-kube-api-access-6x4xv\") pod \"aodh-0\" 
(UID: \"6148de9b-9833-4042-ad7d-5e40b93369ea\") " pod="openstack/aodh-0" Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.584246 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6148de9b-9833-4042-ad7d-5e40b93369ea-scripts\") pod \"aodh-0\" (UID: \"6148de9b-9833-4042-ad7d-5e40b93369ea\") " pod="openstack/aodh-0" Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.584316 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6148de9b-9833-4042-ad7d-5e40b93369ea-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6148de9b-9833-4042-ad7d-5e40b93369ea\") " pod="openstack/aodh-0" Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.584442 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6148de9b-9833-4042-ad7d-5e40b93369ea-config-data\") pod \"aodh-0\" (UID: \"6148de9b-9833-4042-ad7d-5e40b93369ea\") " pod="openstack/aodh-0" Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.598568 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6148de9b-9833-4042-ad7d-5e40b93369ea-scripts\") pod \"aodh-0\" (UID: \"6148de9b-9833-4042-ad7d-5e40b93369ea\") " pod="openstack/aodh-0" Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.612731 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6148de9b-9833-4042-ad7d-5e40b93369ea-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6148de9b-9833-4042-ad7d-5e40b93369ea\") " pod="openstack/aodh-0" Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.613841 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6148de9b-9833-4042-ad7d-5e40b93369ea-config-data\") pod \"aodh-0\" (UID: \"6148de9b-9833-4042-ad7d-5e40b93369ea\") " pod="openstack/aodh-0" Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.614045 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x4xv\" (UniqueName: \"kubernetes.io/projected/6148de9b-9833-4042-ad7d-5e40b93369ea-kube-api-access-6x4xv\") pod \"aodh-0\" (UID: \"6148de9b-9833-4042-ad7d-5e40b93369ea\") " pod="openstack/aodh-0" Dec 03 11:33:10 crc kubenswrapper[4702]: I1203 11:33:10.781531 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.132602 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.311234 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-dns-svc\") pod \"ab0f582e-799a-44ad-8529-6d0fe71490c2\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.311491 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72z4m\" (UniqueName: \"kubernetes.io/projected/ab0f582e-799a-44ad-8529-6d0fe71490c2-kube-api-access-72z4m\") pod \"ab0f582e-799a-44ad-8529-6d0fe71490c2\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.311653 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-ovsdbserver-nb\") pod \"ab0f582e-799a-44ad-8529-6d0fe71490c2\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.311699 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-config\") pod \"ab0f582e-799a-44ad-8529-6d0fe71490c2\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.311732 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-ovsdbserver-sb\") pod \"ab0f582e-799a-44ad-8529-6d0fe71490c2\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.312043 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-dns-swift-storage-0\") pod \"ab0f582e-799a-44ad-8529-6d0fe71490c2\" (UID: \"ab0f582e-799a-44ad-8529-6d0fe71490c2\") " Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.323431 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab0f582e-799a-44ad-8529-6d0fe71490c2-kube-api-access-72z4m" (OuterVolumeSpecName: "kube-api-access-72z4m") pod "ab0f582e-799a-44ad-8529-6d0fe71490c2" (UID: "ab0f582e-799a-44ad-8529-6d0fe71490c2"). InnerVolumeSpecName "kube-api-access-72z4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.439536 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72z4m\" (UniqueName: \"kubernetes.io/projected/ab0f582e-799a-44ad-8529-6d0fe71490c2-kube-api-access-72z4m\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.467984 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ab0f582e-799a-44ad-8529-6d0fe71490c2" (UID: "ab0f582e-799a-44ad-8529-6d0fe71490c2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.511543 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ab0f582e-799a-44ad-8529-6d0fe71490c2" (UID: "ab0f582e-799a-44ad-8529-6d0fe71490c2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.522451 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-config" (OuterVolumeSpecName: "config") pod "ab0f582e-799a-44ad-8529-6d0fe71490c2" (UID: "ab0f582e-799a-44ad-8529-6d0fe71490c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.543910 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ab0f582e-799a-44ad-8529-6d0fe71490c2" (UID: "ab0f582e-799a-44ad-8529-6d0fe71490c2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.544132 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.544168 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.544180 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.571695 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ab0f582e-799a-44ad-8529-6d0fe71490c2" (UID: "ab0f582e-799a-44ad-8529-6d0fe71490c2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.580001 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" event={"ID":"ab0f582e-799a-44ad-8529-6d0fe71490c2","Type":"ContainerDied","Data":"6fb273780c501b5fa03deab78b9e2f3ddb4323078efb54dde935848679f4b17f"} Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.580080 4702 scope.go:117] "RemoveContainer" containerID="a086ed1a32b499a91115208ed57efcf42a850ef49dc6d9bfced49c9ed309109a" Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.580318 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-d6rbj" Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.600183 4702 generic.go:334] "Generic (PLEG): container finished" podID="069495e4-97b2-4505-879a-fdd8abd30cbc" containerID="0da365b9442e87cd3d7cd686e57795227b57d4aec66b2095b87ac1020907bb99" exitCode=0 Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.600242 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"069495e4-97b2-4505-879a-fdd8abd30cbc","Type":"ContainerDied","Data":"0da365b9442e87cd3d7cd686e57795227b57d4aec66b2095b87ac1020907bb99"} Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.651742 4702 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.652109 4702 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab0f582e-799a-44ad-8529-6d0fe71490c2-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.709825 4702 scope.go:117] "RemoveContainer" containerID="a8d69032f95d07a117914af342e611aebf5695f17b6460e6d5e7b2a9372378f4" Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.748090 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-d6rbj"] Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.766244 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-d6rbj"] Dec 03 11:33:11 crc kubenswrapper[4702]: I1203 11:33:11.997090 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.071617 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.179143 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/069495e4-97b2-4505-879a-fdd8abd30cbc-config-data\") pod \"069495e4-97b2-4505-879a-fdd8abd30cbc\" (UID: \"069495e4-97b2-4505-879a-fdd8abd30cbc\") " Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.179241 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tdmh\" (UniqueName: \"kubernetes.io/projected/069495e4-97b2-4505-879a-fdd8abd30cbc-kube-api-access-7tdmh\") pod \"069495e4-97b2-4505-879a-fdd8abd30cbc\" (UID: \"069495e4-97b2-4505-879a-fdd8abd30cbc\") " Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.179365 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/069495e4-97b2-4505-879a-fdd8abd30cbc-logs\") pod \"069495e4-97b2-4505-879a-fdd8abd30cbc\" (UID: \"069495e4-97b2-4505-879a-fdd8abd30cbc\") " Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.179870 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/069495e4-97b2-4505-879a-fdd8abd30cbc-logs" (OuterVolumeSpecName: "logs") pod "069495e4-97b2-4505-879a-fdd8abd30cbc" (UID: "069495e4-97b2-4505-879a-fdd8abd30cbc"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.179932 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069495e4-97b2-4505-879a-fdd8abd30cbc-combined-ca-bundle\") pod \"069495e4-97b2-4505-879a-fdd8abd30cbc\" (UID: \"069495e4-97b2-4505-879a-fdd8abd30cbc\") " Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.182942 4702 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/069495e4-97b2-4505-879a-fdd8abd30cbc-logs\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.187952 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/069495e4-97b2-4505-879a-fdd8abd30cbc-kube-api-access-7tdmh" (OuterVolumeSpecName: "kube-api-access-7tdmh") pod "069495e4-97b2-4505-879a-fdd8abd30cbc" (UID: "069495e4-97b2-4505-879a-fdd8abd30cbc"). InnerVolumeSpecName "kube-api-access-7tdmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.237012 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/069495e4-97b2-4505-879a-fdd8abd30cbc-config-data" (OuterVolumeSpecName: "config-data") pod "069495e4-97b2-4505-879a-fdd8abd30cbc" (UID: "069495e4-97b2-4505-879a-fdd8abd30cbc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.243903 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/069495e4-97b2-4505-879a-fdd8abd30cbc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "069495e4-97b2-4505-879a-fdd8abd30cbc" (UID: "069495e4-97b2-4505-879a-fdd8abd30cbc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.285628 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/069495e4-97b2-4505-879a-fdd8abd30cbc-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.285680 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tdmh\" (UniqueName: \"kubernetes.io/projected/069495e4-97b2-4505-879a-fdd8abd30cbc-kube-api-access-7tdmh\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.285694 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/069495e4-97b2-4505-879a-fdd8abd30cbc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.625909 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6148de9b-9833-4042-ad7d-5e40b93369ea","Type":"ContainerStarted","Data":"431c3ad555728bd0adfac4ebc137449a7d0f0b85f1406b302fa4f816b8bb5cf5"} Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.629027 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"069495e4-97b2-4505-879a-fdd8abd30cbc","Type":"ContainerDied","Data":"d930b26fb085de9de6b9863c4b04fdcfbfe3f37a4c0c9b5c6a64d044b830bc5a"} Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.629084 4702 scope.go:117] "RemoveContainer" containerID="0da365b9442e87cd3d7cd686e57795227b57d4aec66b2095b87ac1020907bb99" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.629220 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.840294 4702 scope.go:117] "RemoveContainer" containerID="e8fcfcc254b5fed223ef12ec2aacb3528ef0a03a51a92dfcd2d102bdaf0fa99e" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.875501 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.900035 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.914110 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 11:33:12 crc kubenswrapper[4702]: E1203 11:33:12.914867 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab0f582e-799a-44ad-8529-6d0fe71490c2" containerName="init" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.914892 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab0f582e-799a-44ad-8529-6d0fe71490c2" containerName="init" Dec 03 11:33:12 crc kubenswrapper[4702]: E1203 11:33:12.914934 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="069495e4-97b2-4505-879a-fdd8abd30cbc" containerName="nova-metadata-metadata" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.914942 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="069495e4-97b2-4505-879a-fdd8abd30cbc" containerName="nova-metadata-metadata" Dec 03 11:33:12 crc kubenswrapper[4702]: E1203 11:33:12.914966 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="069495e4-97b2-4505-879a-fdd8abd30cbc" containerName="nova-metadata-log" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.914976 4702 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="069495e4-97b2-4505-879a-fdd8abd30cbc" containerName="nova-metadata-log" Dec 03 11:33:12 crc kubenswrapper[4702]: E1203 11:33:12.914998 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab0f582e-799a-44ad-8529-6d0fe71490c2" containerName="dnsmasq-dns" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.915005 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab0f582e-799a-44ad-8529-6d0fe71490c2" containerName="dnsmasq-dns" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.915317 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="069495e4-97b2-4505-879a-fdd8abd30cbc" containerName="nova-metadata-metadata" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.915354 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="069495e4-97b2-4505-879a-fdd8abd30cbc" containerName="nova-metadata-log" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.915391 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab0f582e-799a-44ad-8529-6d0fe71490c2" containerName="dnsmasq-dns" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.917401 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.929398 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.930307 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.964490 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="069495e4-97b2-4505-879a-fdd8abd30cbc" path="/var/lib/kubelet/pods/069495e4-97b2-4505-879a-fdd8abd30cbc/volumes" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.967845 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab0f582e-799a-44ad-8529-6d0fe71490c2" path="/var/lib/kubelet/pods/ab0f582e-799a-44ad-8529-6d0fe71490c2/volumes" Dec 03 11:33:12 crc kubenswrapper[4702]: I1203 11:33:12.968697 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 11:33:13 crc kubenswrapper[4702]: I1203 11:33:13.007065 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-logs\") pod \"nova-metadata-0\" (UID: \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\") " pod="openstack/nova-metadata-0" Dec 03 11:33:13 crc kubenswrapper[4702]: I1203 11:33:13.007131 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\") " pod="openstack/nova-metadata-0" Dec 03 11:33:13 crc kubenswrapper[4702]: I1203 11:33:13.007596 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dqtk\" (UniqueName: \"kubernetes.io/projected/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-kube-api-access-5dqtk\") pod \"nova-metadata-0\" (UID: \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\") " pod="openstack/nova-metadata-0" Dec 03 11:33:13 crc kubenswrapper[4702]: I1203 11:33:13.007963 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\") " pod="openstack/nova-metadata-0" Dec 03 11:33:13 crc kubenswrapper[4702]: I1203 11:33:13.008288 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-config-data\") pod \"nova-metadata-0\" (UID: \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\") " pod="openstack/nova-metadata-0" Dec 03 11:33:13 crc kubenswrapper[4702]: I1203 11:33:13.110736 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dqtk\" (UniqueName: \"kubernetes.io/projected/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-kube-api-access-5dqtk\") pod \"nova-metadata-0\" (UID: \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\") " pod="openstack/nova-metadata-0" Dec 03 11:33:13 crc kubenswrapper[4702]: I1203 11:33:13.110885 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\") " pod="openstack/nova-metadata-0" Dec 03 11:33:13 crc kubenswrapper[4702]: I1203 11:33:13.110958 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-config-data\") pod \"nova-metadata-0\" (UID: \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\") " pod="openstack/nova-metadata-0" Dec 03 11:33:13 crc kubenswrapper[4702]: I1203 11:33:13.110989 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-logs\") pod \"nova-metadata-0\" (UID: \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\") " pod="openstack/nova-metadata-0" Dec 03 11:33:13 crc kubenswrapper[4702]: I1203 11:33:13.111013 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\") " pod="openstack/nova-metadata-0" Dec 03 11:33:13 crc kubenswrapper[4702]: I1203 11:33:13.111531 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-logs\") pod \"nova-metadata-0\" (UID: \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\") " pod="openstack/nova-metadata-0" Dec 03 11:33:13 crc kubenswrapper[4702]: I1203 11:33:13.117558 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\") " pod="openstack/nova-metadata-0" Dec 03 11:33:13 crc kubenswrapper[4702]: I1203 11:33:13.118854 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-config-data\") pod \"nova-metadata-0\" (UID: \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\") " pod="openstack/nova-metadata-0" Dec 03 11:33:13 crc kubenswrapper[4702]: I1203 11:33:13.131060 4702 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\") " pod="openstack/nova-metadata-0" Dec 03 11:33:13 crc kubenswrapper[4702]: I1203 11:33:13.132326 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dqtk\" (UniqueName: \"kubernetes.io/projected/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-kube-api-access-5dqtk\") pod \"nova-metadata-0\" (UID: \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\") " pod="openstack/nova-metadata-0" Dec 03 11:33:13 crc kubenswrapper[4702]: I1203 11:33:13.247516 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 11:33:13 crc kubenswrapper[4702]: I1203 11:33:13.595231 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 11:33:13 crc kubenswrapper[4702]: I1203 11:33:13.667921 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6148de9b-9833-4042-ad7d-5e40b93369ea","Type":"ContainerStarted","Data":"2d1833a0368e77342fa5f6686847efaa7708f9c2d38570bb1a7f9036e9c6a75d"} Dec 03 11:33:14 crc kubenswrapper[4702]: I1203 11:33:14.035212 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 11:33:14 crc kubenswrapper[4702]: W1203 11:33:14.058978 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5984ffb3_b39a_47cc_bcc9_3f35fae00dcb.slice/crio-96081fd372090ca731a4503c8c2115fd303ab63fb656d369887cb15977630718 WatchSource:0}: Error finding container 96081fd372090ca731a4503c8c2115fd303ab63fb656d369887cb15977630718: Status 404 returned error can't find the container with id 96081fd372090ca731a4503c8c2115fd303ab63fb656d369887cb15977630718 Dec 03 11:33:14 crc kubenswrapper[4702]: I1203 11:33:14.191246 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:33:14 crc kubenswrapper[4702]: I1203 11:33:14.650075 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:33:14 crc kubenswrapper[4702]: I1203 11:33:14.650450 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fefa4548-587a-4fd5-b49a-727241aa7c24" containerName="ceilometer-central-agent" containerID="cri-o://da2cf936030d0d275bb26af563c87689810ee45bb27e315e27cb64f9485a5345" gracePeriod=30 Dec 03 11:33:14 crc kubenswrapper[4702]: I1203 11:33:14.650549 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fefa4548-587a-4fd5-b49a-727241aa7c24" containerName="proxy-httpd" containerID="cri-o://dcf6785b83faf408fd535b940b429e547c1305962b06ff48f7d1da6665fca818" gracePeriod=30 Dec 03 11:33:14 crc kubenswrapper[4702]: I1203 11:33:14.650606 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fefa4548-587a-4fd5-b49a-727241aa7c24" containerName="sg-core" containerID="cri-o://0665a7640d7f6ac94d78200f0d1596a3c99e0d2eee6f6711b57d8d343371c74b" gracePeriod=30 Dec 03 11:33:14 crc kubenswrapper[4702]: I1203 11:33:14.650654 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="fefa4548-587a-4fd5-b49a-727241aa7c24" containerName="ceilometer-notification-agent" containerID="cri-o://8ce8524b216fff9e24b3ce10dab7a1c01ee118311442090011ae2771ac664f4b" gracePeriod=30 Dec 03 11:33:14 crc kubenswrapper[4702]: I1203 11:33:14.702807 4702 generic.go:334] "Generic (PLEG): container finished" podID="f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f" containerID="382162c87abd8e28697a3509b316313c53439cda66e41bbf8b631b0f466b2893" exitCode=0 Dec 03 11:33:14 crc kubenswrapper[4702]: I1203 11:33:14.702990 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dq5nh" event={"ID":"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f","Type":"ContainerDied","Data":"382162c87abd8e28697a3509b316313c53439cda66e41bbf8b631b0f466b2893"} Dec 03 11:33:14 crc kubenswrapper[4702]: I1203 11:33:14.715199 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb","Type":"ContainerStarted","Data":"5478c07571b53c73c5fe6bdf7d00f2cfd78d61d082db361ab7c6e614e8e00db9"} Dec 03 11:33:14 crc kubenswrapper[4702]: I1203 11:33:14.715255 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb","Type":"ContainerStarted","Data":"96081fd372090ca731a4503c8c2115fd303ab63fb656d369887cb15977630718"} Dec 03 11:33:15 crc kubenswrapper[4702]: I1203 11:33:15.750459 4702 generic.go:334] "Generic (PLEG): container finished" podID="fefa4548-587a-4fd5-b49a-727241aa7c24" containerID="dcf6785b83faf408fd535b940b429e547c1305962b06ff48f7d1da6665fca818" exitCode=0 Dec 03 11:33:15 crc kubenswrapper[4702]: I1203 11:33:15.751843 4702 generic.go:334] "Generic (PLEG): container finished" podID="fefa4548-587a-4fd5-b49a-727241aa7c24" containerID="0665a7640d7f6ac94d78200f0d1596a3c99e0d2eee6f6711b57d8d343371c74b" exitCode=2 Dec 03 11:33:15 crc kubenswrapper[4702]: I1203 11:33:15.751865 4702 generic.go:334] "Generic (PLEG): container finished" podID="fefa4548-587a-4fd5-b49a-727241aa7c24" containerID="da2cf936030d0d275bb26af563c87689810ee45bb27e315e27cb64f9485a5345" exitCode=0 Dec 03 11:33:15 crc kubenswrapper[4702]: I1203 11:33:15.750537 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fefa4548-587a-4fd5-b49a-727241aa7c24","Type":"ContainerDied","Data":"dcf6785b83faf408fd535b940b429e547c1305962b06ff48f7d1da6665fca818"} Dec 03 11:33:15 crc kubenswrapper[4702]: I1203 11:33:15.751980 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fefa4548-587a-4fd5-b49a-727241aa7c24","Type":"ContainerDied","Data":"0665a7640d7f6ac94d78200f0d1596a3c99e0d2eee6f6711b57d8d343371c74b"} Dec 03 11:33:15 crc kubenswrapper[4702]: I1203 11:33:15.752010 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fefa4548-587a-4fd5-b49a-727241aa7c24","Type":"ContainerDied","Data":"da2cf936030d0d275bb26af563c87689810ee45bb27e315e27cb64f9485a5345"} Dec 03 11:33:15 crc kubenswrapper[4702]: I1203 11:33:15.757844 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6148de9b-9833-4042-ad7d-5e40b93369ea","Type":"ContainerStarted","Data":"209908441ff64d788b841146150f0e1ed97b1c2aecebd4447c3e03f092dcbc46"} Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.069649 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.211178 4702 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dq5nh" Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.332489 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh8rr\" (UniqueName: \"kubernetes.io/projected/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-kube-api-access-lh8rr\") pod \"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f\" (UID: \"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f\") " Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.332641 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-combined-ca-bundle\") pod \"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f\" (UID: \"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f\") " Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.332710 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-config-data\") pod \"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f\" (UID: \"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f\") " Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.333158 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-scripts\") pod \"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f\" (UID: \"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f\") " Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.342244 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-scripts" (OuterVolumeSpecName: "scripts") pod "f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f" (UID: "f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.342278 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-kube-api-access-lh8rr" (OuterVolumeSpecName: "kube-api-access-lh8rr") pod "f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f" (UID: "f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f"). InnerVolumeSpecName "kube-api-access-lh8rr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.379623 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-config-data" (OuterVolumeSpecName: "config-data") pod "f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f" (UID: "f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.395021 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f" (UID: "f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f"). InnerVolumeSpecName "combined-ca-bundle". 
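PluginName "kubernetes.io/secret", VolumeGidValue ""

Each volume of the departing nova-cell0-cell-mapping-dq5nh pod passes through the same three log stages here: reconciler_common "operationExecutor.UnmountVolume started", then operation_generator "UnmountVolume.TearDown succeeded", then reconciler_common "Volume detached". A minimal cross-check, assuming Python 3 and journal text on stdin, with both patterns inferred from this capture (note the detach lines carry backslash-escaped quotes while the TearDown lines do not):

```python
import re
import sys

# operation_generator.go:803, plain format:
#   UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/<uid>-scripts" ...
TEARDOWN = re.compile(r'UnmountVolume\.TearDown succeeded for volume "(?P<vol>[^"]+)"')

# reconciler_common.go:293, structured format with escaped quotes:
#   "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/<uid>-scripts\") ..."
DETACHED = re.compile(r'Volume detached for volume .*?UniqueName: \\"(?P<vol>[^\\"]+)\\"')

torn_down, detached = set(), set()
for line in sys.stdin:
    for m in TEARDOWN.finditer(line):
        torn_down.add(m["vol"])
    for m in DETACHED.finditer(line):
        detached.add(m["vol"])

# A teardown with no matching detach would mean the reconciler never
# confirmed the volume's removal from its actual state of world.
for vol in sorted(torn_down - detached):
    print("torn down but never marked detached:", vol)
```

In a healthy teardown like the one below, every TearDown is answered by a detach confirmation within a fraction of a second.
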
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.437784 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.437821 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lh8rr\" (UniqueName: \"kubernetes.io/projected/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-kube-api-access-lh8rr\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.437837 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.437848 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.799842 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb","Type":"ContainerStarted","Data":"4c330d989ea5b955240b297a185a991799d1a446b722d0749102bfd61ff6af67"} Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.807855 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-dq5nh" event={"ID":"f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f","Type":"ContainerDied","Data":"aa71b16534d50113416aea18dd7c68e7637af04fc2fa8d338ede4ffc41e2cfbd"} Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.807915 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa71b16534d50113416aea18dd7c68e7637af04fc2fa8d338ede4ffc41e2cfbd" Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.807997 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-dq5nh" Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.816286 4702 generic.go:334] "Generic (PLEG): container finished" podID="0099aaeb-f11f-492e-8962-e06a72b6878d" containerID="ab56ea1883615d9d6d926896a3e7135ece8877e2eb076527bbf64795372caad9" exitCode=0 Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.816363 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jzcdc" event={"ID":"0099aaeb-f11f-492e-8962-e06a72b6878d","Type":"ContainerDied","Data":"ab56ea1883615d9d6d926896a3e7135ece8877e2eb076527bbf64795372caad9"} Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.841005 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.840968867 podStartE2EDuration="4.840968867s" podCreationTimestamp="2025-12-03 11:33:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:33:16.819750412 +0000 UTC m=+1780.655678886" watchObservedRunningTime="2025-12-03 11:33:16.840968867 +0000 UTC m=+1780.676897331" Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.924919 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 11:33:16 crc kubenswrapper[4702]: I1203 11:33:16.925160 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea" containerName="nova-scheduler-scheduler" containerID="cri-o://54d7c739209d80e47ca8ae00add84270be30e240ba6876e04826ed36b2f301ca" gracePeriod=30 Dec 03 11:33:17 crc kubenswrapper[4702]: I1203 11:33:17.004271 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 11:33:17 crc kubenswrapper[4702]: I1203 11:33:17.004336 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 11:33:17 crc kubenswrapper[4702]: I1203 11:33:17.004583 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="07f4cc75-4ffd-490a-914c-fb2b072db23b" containerName="nova-api-log" containerID="cri-o://bc1c5202b2f637355bdecdfa983df237a4c6468fe5cfd1f017227c51299a91c3" gracePeriod=30 Dec 03 11:33:17 crc kubenswrapper[4702]: I1203 11:33:17.005962 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="07f4cc75-4ffd-490a-914c-fb2b072db23b" containerName="nova-api-api" containerID="cri-o://5f29db9c716dada935c837bea16e5aaaa30d3d71750749541740e5782ea6a6c5" gracePeriod=30 Dec 03 11:33:17 crc kubenswrapper[4702]: I1203 11:33:17.874378 4702 generic.go:334] "Generic (PLEG): container finished" podID="07f4cc75-4ffd-490a-914c-fb2b072db23b" containerID="5f29db9c716dada935c837bea16e5aaaa30d3d71750749541740e5782ea6a6c5" exitCode=0 Dec 03 11:33:17 crc kubenswrapper[4702]: I1203 11:33:17.874421 4702 generic.go:334] "Generic (PLEG): container finished" podID="07f4cc75-4ffd-490a-914c-fb2b072db23b" containerID="bc1c5202b2f637355bdecdfa983df237a4c6468fe5cfd1f017227c51299a91c3" exitCode=143 Dec 03 11:33:17 crc kubenswrapper[4702]: I1203 11:33:17.874414 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"07f4cc75-4ffd-490a-914c-fb2b072db23b","Type":"ContainerDied","Data":"5f29db9c716dada935c837bea16e5aaaa30d3d71750749541740e5782ea6a6c5"} Dec 03 11:33:17 crc kubenswrapper[4702]: I1203 
11:33:17.874455 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"07f4cc75-4ffd-490a-914c-fb2b072db23b","Type":"ContainerDied","Data":"bc1c5202b2f637355bdecdfa983df237a4c6468fe5cfd1f017227c51299a91c3"} Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.248133 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.248557 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 11:33:18 crc kubenswrapper[4702]: E1203 11:33:18.255386 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f1a2cc7_b3b5_4147_8cdd_e5c8820d5fea.slice/crio-conmon-54d7c739209d80e47ca8ae00add84270be30e240ba6876e04826ed36b2f301ca.scope\": RecentStats: unable to find data in memory cache]" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.572335 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jzcdc" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.767577 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0099aaeb-f11f-492e-8962-e06a72b6878d-scripts\") pod \"0099aaeb-f11f-492e-8962-e06a72b6878d\" (UID: \"0099aaeb-f11f-492e-8962-e06a72b6878d\") " Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.767932 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0099aaeb-f11f-492e-8962-e06a72b6878d-combined-ca-bundle\") pod \"0099aaeb-f11f-492e-8962-e06a72b6878d\" (UID: \"0099aaeb-f11f-492e-8962-e06a72b6878d\") " Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.768096 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrrjh\" (UniqueName: \"kubernetes.io/projected/0099aaeb-f11f-492e-8962-e06a72b6878d-kube-api-access-jrrjh\") pod \"0099aaeb-f11f-492e-8962-e06a72b6878d\" (UID: \"0099aaeb-f11f-492e-8962-e06a72b6878d\") " Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.768351 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0099aaeb-f11f-492e-8962-e06a72b6878d-config-data\") pod \"0099aaeb-f11f-492e-8962-e06a72b6878d\" (UID: \"0099aaeb-f11f-492e-8962-e06a72b6878d\") " Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.777531 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0099aaeb-f11f-492e-8962-e06a72b6878d-kube-api-access-jrrjh" (OuterVolumeSpecName: "kube-api-access-jrrjh") pod "0099aaeb-f11f-492e-8962-e06a72b6878d" (UID: "0099aaeb-f11f-492e-8962-e06a72b6878d"). InnerVolumeSpecName "kube-api-access-jrrjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.779632 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0099aaeb-f11f-492e-8962-e06a72b6878d-scripts" (OuterVolumeSpecName: "scripts") pod "0099aaeb-f11f-492e-8962-e06a72b6878d" (UID: "0099aaeb-f11f-492e-8962-e06a72b6878d"). InnerVolumeSpecName "scripts". 
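PluginName "kubernetes.io/secret", VolumeGidValue ""

The "Generic (PLEG): container finished" lines in this stretch carry the containers' exit codes: nova-api-api exited 0 (a clean shutdown in response to SIGTERM), while nova-api-log exited 143, that is 128 + 15, the conventional encoding for death by SIGTERM during the grace-period kill; sg-core's earlier exitCode=2 is an ordinary application error status. A small decoder, assuming Python 3; the 128-plus-signal convention is the standard shell/runtime encoding, not something specific to the kubelet:

```python
import signal

def describe_exit(code: int) -> str:
    """Interpret an exitCode as reported by the "container finished" lines:
    values above 128 mean the process died from signal (code - 128)."""
    if code == 0:
        return "exited cleanly"
    if code > 128:
        try:
            name = signal.Signals(code - 128).name
        except ValueError:
            name = f"signal {code - 128}"
        return f"killed by {name} ({code} = 128 + {code - 128})"
    return f"application error status {code}"

# The exit codes seen in this capture: nova-api-api 0, sg-core 2, nova-api-log 143
for code in (0, 2, 143):
    print(code, "->", describe_exit(code))
```
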
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.820505 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0099aaeb-f11f-492e-8962-e06a72b6878d-config-data" (OuterVolumeSpecName: "config-data") pod "0099aaeb-f11f-492e-8962-e06a72b6878d" (UID: "0099aaeb-f11f-492e-8962-e06a72b6878d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.825971 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0099aaeb-f11f-492e-8962-e06a72b6878d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0099aaeb-f11f-492e-8962-e06a72b6878d" (UID: "0099aaeb-f11f-492e-8962-e06a72b6878d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.874482 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0099aaeb-f11f-492e-8962-e06a72b6878d-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.874515 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0099aaeb-f11f-492e-8962-e06a72b6878d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.874544 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrrjh\" (UniqueName: \"kubernetes.io/projected/0099aaeb-f11f-492e-8962-e06a72b6878d-kube-api-access-jrrjh\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.874553 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0099aaeb-f11f-492e-8962-e06a72b6878d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.959450 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.980094 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"07f4cc75-4ffd-490a-914c-fb2b072db23b","Type":"ContainerDied","Data":"7fcdad95685a95f1b1815d055ffdaf24f08d04184d6fbe4e54fb08f1e4aee9db"} Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.980150 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6148de9b-9833-4042-ad7d-5e40b93369ea","Type":"ContainerStarted","Data":"d91e72e8f66c9bd3e87823654f464d7cf4409cb8f4d86bcf9adb2e3321b7c30b"} Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.980177 4702 scope.go:117] "RemoveContainer" containerID="5f29db9c716dada935c837bea16e5aaaa30d3d71750749541740e5782ea6a6c5" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.983485 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-jzcdc" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.983587 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.983476 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-jzcdc" event={"ID":"0099aaeb-f11f-492e-8962-e06a72b6878d","Type":"ContainerDied","Data":"473b65aeabcdea48fcf0f42b0bb20c3bb3dda68ca3828559d66f0d82a71c2bf4"} Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.984519 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="473b65aeabcdea48fcf0f42b0bb20c3bb3dda68ca3828559d66f0d82a71c2bf4" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.987810 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 11:33:18 crc kubenswrapper[4702]: E1203 11:33:18.988337 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f" containerName="nova-manage" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.988367 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f" containerName="nova-manage" Dec 03 11:33:18 crc kubenswrapper[4702]: E1203 11:33:18.988390 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea" containerName="nova-scheduler-scheduler" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.988397 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea" containerName="nova-scheduler-scheduler" Dec 03 11:33:18 crc kubenswrapper[4702]: E1203 11:33:18.988433 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0099aaeb-f11f-492e-8962-e06a72b6878d" containerName="nova-cell1-conductor-db-sync" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.988440 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="0099aaeb-f11f-492e-8962-e06a72b6878d" containerName="nova-cell1-conductor-db-sync" Dec 03 11:33:18 crc kubenswrapper[4702]: E1203 11:33:18.988463 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f4cc75-4ffd-490a-914c-fb2b072db23b" containerName="nova-api-api" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.988472 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f4cc75-4ffd-490a-914c-fb2b072db23b" containerName="nova-api-api" Dec 03 11:33:18 crc kubenswrapper[4702]: E1203 11:33:18.988484 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f4cc75-4ffd-490a-914c-fb2b072db23b" containerName="nova-api-log" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.988493 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f4cc75-4ffd-490a-914c-fb2b072db23b" containerName="nova-api-log" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.988692 4702 generic.go:334] "Generic (PLEG): container finished" podID="5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea" containerID="54d7c739209d80e47ca8ae00add84270be30e240ba6876e04826ed36b2f301ca" exitCode=0 Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.988740 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="07f4cc75-4ffd-490a-914c-fb2b072db23b" containerName="nova-api-api" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.988775 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea" containerName="nova-scheduler-scheduler" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.988788 4702 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="07f4cc75-4ffd-490a-914c-fb2b072db23b" containerName="nova-api-log" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.988799 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="0099aaeb-f11f-492e-8962-e06a72b6878d" containerName="nova-cell1-conductor-db-sync" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.988809 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f" containerName="nova-manage" Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.989011 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5984ffb3-b39a-47cc-bcc9-3f35fae00dcb" containerName="nova-metadata-log" containerID="cri-o://5478c07571b53c73c5fe6bdf7d00f2cfd78d61d082db361ab7c6e614e8e00db9" gracePeriod=30 Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.989250 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5984ffb3-b39a-47cc-bcc9-3f35fae00dcb" containerName="nova-metadata-metadata" containerID="cri-o://4c330d989ea5b955240b297a185a991799d1a446b722d0749102bfd61ff6af67" gracePeriod=30 Dec 03 11:33:18 crc kubenswrapper[4702]: I1203 11:33:18.989665 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea","Type":"ContainerDied","Data":"54d7c739209d80e47ca8ae00add84270be30e240ba6876e04826ed36b2f301ca"} Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.012574 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.020639 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.031822 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.075328 4702 scope.go:117] "RemoveContainer" containerID="bc1c5202b2f637355bdecdfa983df237a4c6468fe5cfd1f017227c51299a91c3" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.080414 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f4cc75-4ffd-490a-914c-fb2b072db23b-combined-ca-bundle\") pod \"07f4cc75-4ffd-490a-914c-fb2b072db23b\" (UID: \"07f4cc75-4ffd-490a-914c-fb2b072db23b\") " Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.080537 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f4cc75-4ffd-490a-914c-fb2b072db23b-config-data\") pod \"07f4cc75-4ffd-490a-914c-fb2b072db23b\" (UID: \"07f4cc75-4ffd-490a-914c-fb2b072db23b\") " Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.080616 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn5h6\" (UniqueName: \"kubernetes.io/projected/07f4cc75-4ffd-490a-914c-fb2b072db23b-kube-api-access-nn5h6\") pod \"07f4cc75-4ffd-490a-914c-fb2b072db23b\" (UID: \"07f4cc75-4ffd-490a-914c-fb2b072db23b\") " Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.080780 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07f4cc75-4ffd-490a-914c-fb2b072db23b-logs\") pod \"07f4cc75-4ffd-490a-914c-fb2b072db23b\" (UID: 
\"07f4cc75-4ffd-490a-914c-fb2b072db23b\") " Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.090936 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07f4cc75-4ffd-490a-914c-fb2b072db23b-logs" (OuterVolumeSpecName: "logs") pod "07f4cc75-4ffd-490a-914c-fb2b072db23b" (UID: "07f4cc75-4ffd-490a-914c-fb2b072db23b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.100258 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07f4cc75-4ffd-490a-914c-fb2b072db23b-kube-api-access-nn5h6" (OuterVolumeSpecName: "kube-api-access-nn5h6") pod "07f4cc75-4ffd-490a-914c-fb2b072db23b" (UID: "07f4cc75-4ffd-490a-914c-fb2b072db23b"). InnerVolumeSpecName "kube-api-access-nn5h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.154568 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f4cc75-4ffd-490a-914c-fb2b072db23b-config-data" (OuterVolumeSpecName: "config-data") pod "07f4cc75-4ffd-490a-914c-fb2b072db23b" (UID: "07f4cc75-4ffd-490a-914c-fb2b072db23b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.159994 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07f4cc75-4ffd-490a-914c-fb2b072db23b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07f4cc75-4ffd-490a-914c-fb2b072db23b" (UID: "07f4cc75-4ffd-490a-914c-fb2b072db23b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.183098 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea-combined-ca-bundle\") pod \"5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea\" (UID: \"5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea\") " Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.183230 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khhpn\" (UniqueName: \"kubernetes.io/projected/5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea-kube-api-access-khhpn\") pod \"5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea\" (UID: \"5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea\") " Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.183307 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea-config-data\") pod \"5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea\" (UID: \"5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea\") " Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.184251 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9d234e0-f641-485a-90aa-8440c5d00296-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b9d234e0-f641-485a-90aa-8440c5d00296\") " pod="openstack/nova-cell1-conductor-0" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.184448 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4z6t\" (UniqueName: 
\"kubernetes.io/projected/b9d234e0-f641-485a-90aa-8440c5d00296-kube-api-access-k4z6t\") pod \"nova-cell1-conductor-0\" (UID: \"b9d234e0-f641-485a-90aa-8440c5d00296\") " pod="openstack/nova-cell1-conductor-0" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.184527 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9d234e0-f641-485a-90aa-8440c5d00296-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b9d234e0-f641-485a-90aa-8440c5d00296\") " pod="openstack/nova-cell1-conductor-0" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.184702 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07f4cc75-4ffd-490a-914c-fb2b072db23b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.184726 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07f4cc75-4ffd-490a-914c-fb2b072db23b-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.184739 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn5h6\" (UniqueName: \"kubernetes.io/projected/07f4cc75-4ffd-490a-914c-fb2b072db23b-kube-api-access-nn5h6\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.184772 4702 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07f4cc75-4ffd-490a-914c-fb2b072db23b-logs\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.187947 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea-kube-api-access-khhpn" (OuterVolumeSpecName: "kube-api-access-khhpn") pod "5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea" (UID: "5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea"). InnerVolumeSpecName "kube-api-access-khhpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.217092 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea-config-data" (OuterVolumeSpecName: "config-data") pod "5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea" (UID: "5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.226542 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea" (UID: "5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.286957 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9d234e0-f641-485a-90aa-8440c5d00296-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b9d234e0-f641-485a-90aa-8440c5d00296\") " pod="openstack/nova-cell1-conductor-0" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.287287 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4z6t\" (UniqueName: \"kubernetes.io/projected/b9d234e0-f641-485a-90aa-8440c5d00296-kube-api-access-k4z6t\") pod \"nova-cell1-conductor-0\" (UID: \"b9d234e0-f641-485a-90aa-8440c5d00296\") " pod="openstack/nova-cell1-conductor-0" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.288364 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9d234e0-f641-485a-90aa-8440c5d00296-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b9d234e0-f641-485a-90aa-8440c5d00296\") " pod="openstack/nova-cell1-conductor-0" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.288884 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.289037 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khhpn\" (UniqueName: \"kubernetes.io/projected/5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea-kube-api-access-khhpn\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.289117 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.291211 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9d234e0-f641-485a-90aa-8440c5d00296-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b9d234e0-f641-485a-90aa-8440c5d00296\") " pod="openstack/nova-cell1-conductor-0" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.291874 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9d234e0-f641-485a-90aa-8440c5d00296-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b9d234e0-f641-485a-90aa-8440c5d00296\") " pod="openstack/nova-cell1-conductor-0" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.306426 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4z6t\" (UniqueName: \"kubernetes.io/projected/b9d234e0-f641-485a-90aa-8440c5d00296-kube-api-access-k4z6t\") pod \"nova-cell1-conductor-0\" (UID: \"b9d234e0-f641-485a-90aa-8440c5d00296\") " pod="openstack/nova-cell1-conductor-0" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.367130 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.420710 4702 scope.go:117] "RemoveContainer" containerID="54d7c739209d80e47ca8ae00add84270be30e240ba6876e04826ed36b2f301ca" Dec 03 11:33:19 crc kubenswrapper[4702]: I1203 11:33:19.907612 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.201433 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.245417 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.262479 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b9d234e0-f641-485a-90aa-8440c5d00296","Type":"ContainerStarted","Data":"d455b08a012e3ecc40f469e1754736cd725441f3f7c1090beb35506428e872e5"} Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.270812 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea","Type":"ContainerDied","Data":"7601771d58d1344c83e88df97dbf4a639e3cda7e97d2e9ec95d8213faa8a7033"} Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.271966 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.284656 4702 generic.go:334] "Generic (PLEG): container finished" podID="5984ffb3-b39a-47cc-bcc9-3f35fae00dcb" containerID="4c330d989ea5b955240b297a185a991799d1a446b722d0749102bfd61ff6af67" exitCode=0 Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.284717 4702 generic.go:334] "Generic (PLEG): container finished" podID="5984ffb3-b39a-47cc-bcc9-3f35fae00dcb" containerID="5478c07571b53c73c5fe6bdf7d00f2cfd78d61d082db361ab7c6e614e8e00db9" exitCode=143 Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.285242 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.287025 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb","Type":"ContainerDied","Data":"4c330d989ea5b955240b297a185a991799d1a446b722d0749102bfd61ff6af67"} Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.287139 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb","Type":"ContainerDied","Data":"5478c07571b53c73c5fe6bdf7d00f2cfd78d61d082db361ab7c6e614e8e00db9"} Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.287218 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb","Type":"ContainerDied","Data":"96081fd372090ca731a4503c8c2115fd303ab63fb656d369887cb15977630718"} Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.287301 4702 scope.go:117] "RemoveContainer" containerID="4c330d989ea5b955240b297a185a991799d1a446b722d0749102bfd61ff6af67" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.349124 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-config-data\") pod \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\" (UID: \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\") " Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.349300 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dqtk\" (UniqueName: \"kubernetes.io/projected/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-kube-api-access-5dqtk\") pod \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\" (UID: \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\") " Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.349527 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-combined-ca-bundle\") pod \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\" (UID: \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\") " Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.349567 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-logs\") pod \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\" (UID: \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\") " Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.349618 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-nova-metadata-tls-certs\") pod \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\" (UID: \"5984ffb3-b39a-47cc-bcc9-3f35fae00dcb\") " Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.356649 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-logs" (OuterVolumeSpecName: "logs") pod "5984ffb3-b39a-47cc-bcc9-3f35fae00dcb" (UID: "5984ffb3-b39a-47cc-bcc9-3f35fae00dcb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.360433 4702 generic.go:334] "Generic (PLEG): container finished" podID="fefa4548-587a-4fd5-b49a-727241aa7c24" containerID="8ce8524b216fff9e24b3ce10dab7a1c01ee118311442090011ae2771ac664f4b" exitCode=0 Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.360507 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fefa4548-587a-4fd5-b49a-727241aa7c24","Type":"ContainerDied","Data":"8ce8524b216fff9e24b3ce10dab7a1c01ee118311442090011ae2771ac664f4b"} Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.360977 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-kube-api-access-5dqtk" (OuterVolumeSpecName: "kube-api-access-5dqtk") pod "5984ffb3-b39a-47cc-bcc9-3f35fae00dcb" (UID: "5984ffb3-b39a-47cc-bcc9-3f35fae00dcb"). InnerVolumeSpecName "kube-api-access-5dqtk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.396537 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5984ffb3-b39a-47cc-bcc9-3f35fae00dcb" (UID: "5984ffb3-b39a-47cc-bcc9-3f35fae00dcb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.420211 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-config-data" (OuterVolumeSpecName: "config-data") pod "5984ffb3-b39a-47cc-bcc9-3f35fae00dcb" (UID: "5984ffb3-b39a-47cc-bcc9-3f35fae00dcb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.420484 4702 scope.go:117] "RemoveContainer" containerID="5478c07571b53c73c5fe6bdf7d00f2cfd78d61d082db361ab7c6e614e8e00db9" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.456218 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.456248 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dqtk\" (UniqueName: \"kubernetes.io/projected/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-kube-api-access-5dqtk\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.456260 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.456275 4702 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-logs\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.494089 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5984ffb3-b39a-47cc-bcc9-3f35fae00dcb" (UID: "5984ffb3-b39a-47cc-bcc9-3f35fae00dcb"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.559972 4702 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.607883 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.649421 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.666149 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fefa4548-587a-4fd5-b49a-727241aa7c24-run-httpd\") pod \"fefa4548-587a-4fd5-b49a-727241aa7c24\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.666254 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-config-data\") pod \"fefa4548-587a-4fd5-b49a-727241aa7c24\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.666281 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-scripts\") pod \"fefa4548-587a-4fd5-b49a-727241aa7c24\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.666389 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-combined-ca-bundle\") pod \"fefa4548-587a-4fd5-b49a-727241aa7c24\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.666520 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-sg-core-conf-yaml\") pod \"fefa4548-587a-4fd5-b49a-727241aa7c24\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.666567 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fefa4548-587a-4fd5-b49a-727241aa7c24-log-httpd\") pod \"fefa4548-587a-4fd5-b49a-727241aa7c24\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.666633 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvh8c\" (UniqueName: \"kubernetes.io/projected/fefa4548-587a-4fd5-b49a-727241aa7c24-kube-api-access-lvh8c\") pod \"fefa4548-587a-4fd5-b49a-727241aa7c24\" (UID: \"fefa4548-587a-4fd5-b49a-727241aa7c24\") " Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.669336 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fefa4548-587a-4fd5-b49a-727241aa7c24-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fefa4548-587a-4fd5-b49a-727241aa7c24" (UID: "fefa4548-587a-4fd5-b49a-727241aa7c24"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.669693 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fefa4548-587a-4fd5-b49a-727241aa7c24-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fefa4548-587a-4fd5-b49a-727241aa7c24" (UID: "fefa4548-587a-4fd5-b49a-727241aa7c24"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.681063 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-scripts" (OuterVolumeSpecName: "scripts") pod "fefa4548-587a-4fd5-b49a-727241aa7c24" (UID: "fefa4548-587a-4fd5-b49a-727241aa7c24"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.694040 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fefa4548-587a-4fd5-b49a-727241aa7c24-kube-api-access-lvh8c" (OuterVolumeSpecName: "kube-api-access-lvh8c") pod "fefa4548-587a-4fd5-b49a-727241aa7c24" (UID: "fefa4548-587a-4fd5-b49a-727241aa7c24"). InnerVolumeSpecName "kube-api-access-lvh8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.702788 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.724954 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fefa4548-587a-4fd5-b49a-727241aa7c24" (UID: "fefa4548-587a-4fd5-b49a-727241aa7c24"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.731844 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.770608 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvh8c\" (UniqueName: \"kubernetes.io/projected/fefa4548-587a-4fd5-b49a-727241aa7c24-kube-api-access-lvh8c\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.770648 4702 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fefa4548-587a-4fd5-b49a-727241aa7c24-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.770657 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.770668 4702 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.770676 4702 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fefa4548-587a-4fd5-b49a-727241aa7c24-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.784223 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.805704 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 11:33:20 crc kubenswrapper[4702]: E1203 11:33:20.806452 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fefa4548-587a-4fd5-b49a-727241aa7c24" containerName="ceilometer-central-agent" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.806474 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="fefa4548-587a-4fd5-b49a-727241aa7c24" containerName="ceilometer-central-agent" Dec 03 11:33:20 crc kubenswrapper[4702]: E1203 11:33:20.806528 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fefa4548-587a-4fd5-b49a-727241aa7c24" containerName="sg-core" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.806537 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="fefa4548-587a-4fd5-b49a-727241aa7c24" containerName="sg-core" Dec 03 11:33:20 crc kubenswrapper[4702]: E1203 11:33:20.806555 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fefa4548-587a-4fd5-b49a-727241aa7c24" containerName="ceilometer-notification-agent" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.806564 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="fefa4548-587a-4fd5-b49a-727241aa7c24" containerName="ceilometer-notification-agent" Dec 03 11:33:20 crc kubenswrapper[4702]: E1203 11:33:20.806595 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5984ffb3-b39a-47cc-bcc9-3f35fae00dcb" containerName="nova-metadata-log" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.806603 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="5984ffb3-b39a-47cc-bcc9-3f35fae00dcb" containerName="nova-metadata-log" Dec 03 11:33:20 crc kubenswrapper[4702]: E1203 11:33:20.806624 4702 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5984ffb3-b39a-47cc-bcc9-3f35fae00dcb" containerName="nova-metadata-metadata" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.806632 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="5984ffb3-b39a-47cc-bcc9-3f35fae00dcb" containerName="nova-metadata-metadata" Dec 03 11:33:20 crc kubenswrapper[4702]: E1203 11:33:20.806645 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fefa4548-587a-4fd5-b49a-727241aa7c24" containerName="proxy-httpd" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.806653 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="fefa4548-587a-4fd5-b49a-727241aa7c24" containerName="proxy-httpd" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.807056 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="fefa4548-587a-4fd5-b49a-727241aa7c24" containerName="ceilometer-central-agent" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.807077 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="fefa4548-587a-4fd5-b49a-727241aa7c24" containerName="proxy-httpd" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.807099 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="fefa4548-587a-4fd5-b49a-727241aa7c24" containerName="ceilometer-notification-agent" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.807108 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="5984ffb3-b39a-47cc-bcc9-3f35fae00dcb" containerName="nova-metadata-log" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.807153 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="5984ffb3-b39a-47cc-bcc9-3f35fae00dcb" containerName="nova-metadata-metadata" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.807169 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="fefa4548-587a-4fd5-b49a-727241aa7c24" containerName="sg-core" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.809377 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.811279 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.827680 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.829952 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.833803 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.840033 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.873274 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49564875-fe0f-45b7-b72f-f9583f4a80be-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"49564875-fe0f-45b7-b72f-f9583f4a80be\") " pod="openstack/nova-scheduler-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.873344 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91a66cfe-1f78-4bc2-854d-11003680f6ca-config-data\") pod \"nova-api-0\" (UID: \"91a66cfe-1f78-4bc2-854d-11003680f6ca\") " pod="openstack/nova-api-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.873378 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49564875-fe0f-45b7-b72f-f9583f4a80be-config-data\") pod \"nova-scheduler-0\" (UID: \"49564875-fe0f-45b7-b72f-f9583f4a80be\") " pod="openstack/nova-scheduler-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.873419 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltm95\" (UniqueName: \"kubernetes.io/projected/49564875-fe0f-45b7-b72f-f9583f4a80be-kube-api-access-ltm95\") pod \"nova-scheduler-0\" (UID: \"49564875-fe0f-45b7-b72f-f9583f4a80be\") " pod="openstack/nova-scheduler-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.873469 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91a66cfe-1f78-4bc2-854d-11003680f6ca-logs\") pod \"nova-api-0\" (UID: \"91a66cfe-1f78-4bc2-854d-11003680f6ca\") " pod="openstack/nova-api-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.873604 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v697h\" (UniqueName: \"kubernetes.io/projected/91a66cfe-1f78-4bc2-854d-11003680f6ca-kube-api-access-v697h\") pod \"nova-api-0\" (UID: \"91a66cfe-1f78-4bc2-854d-11003680f6ca\") " pod="openstack/nova-api-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.873847 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91a66cfe-1f78-4bc2-854d-11003680f6ca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"91a66cfe-1f78-4bc2-854d-11003680f6ca\") " pod="openstack/nova-api-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.879912 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.890813 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.901888 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.913643 4702 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.916887 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.920809 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.920919 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.930482 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.937588 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:33:20 crc kubenswrapper[4702]: E1203 11:33:20.938025 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.993685 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6120d0cd-f573-4f9c-af92-77580df0916c-logs\") pod \"nova-metadata-0\" (UID: \"6120d0cd-f573-4f9c-af92-77580df0916c\") " pod="openstack/nova-metadata-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.994224 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91a66cfe-1f78-4bc2-854d-11003680f6ca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"91a66cfe-1f78-4bc2-854d-11003680f6ca\") " pod="openstack/nova-api-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.994337 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6120d0cd-f573-4f9c-af92-77580df0916c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6120d0cd-f573-4f9c-af92-77580df0916c\") " pod="openstack/nova-metadata-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.994414 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6120d0cd-f573-4f9c-af92-77580df0916c-config-data\") pod \"nova-metadata-0\" (UID: \"6120d0cd-f573-4f9c-af92-77580df0916c\") " pod="openstack/nova-metadata-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.994486 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhpvm\" (UniqueName: \"kubernetes.io/projected/6120d0cd-f573-4f9c-af92-77580df0916c-kube-api-access-qhpvm\") pod \"nova-metadata-0\" (UID: \"6120d0cd-f573-4f9c-af92-77580df0916c\") " pod="openstack/nova-metadata-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.994684 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49564875-fe0f-45b7-b72f-f9583f4a80be-combined-ca-bundle\") pod 
\"nova-scheduler-0\" (UID: \"49564875-fe0f-45b7-b72f-f9583f4a80be\") " pod="openstack/nova-scheduler-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.994724 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91a66cfe-1f78-4bc2-854d-11003680f6ca-config-data\") pod \"nova-api-0\" (UID: \"91a66cfe-1f78-4bc2-854d-11003680f6ca\") " pod="openstack/nova-api-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.994776 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49564875-fe0f-45b7-b72f-f9583f4a80be-config-data\") pod \"nova-scheduler-0\" (UID: \"49564875-fe0f-45b7-b72f-f9583f4a80be\") " pod="openstack/nova-scheduler-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.994865 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltm95\" (UniqueName: \"kubernetes.io/projected/49564875-fe0f-45b7-b72f-f9583f4a80be-kube-api-access-ltm95\") pod \"nova-scheduler-0\" (UID: \"49564875-fe0f-45b7-b72f-f9583f4a80be\") " pod="openstack/nova-scheduler-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.994889 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6120d0cd-f573-4f9c-af92-77580df0916c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6120d0cd-f573-4f9c-af92-77580df0916c\") " pod="openstack/nova-metadata-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.994983 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91a66cfe-1f78-4bc2-854d-11003680f6ca-logs\") pod \"nova-api-0\" (UID: \"91a66cfe-1f78-4bc2-854d-11003680f6ca\") " pod="openstack/nova-api-0" Dec 03 11:33:20 crc kubenswrapper[4702]: I1203 11:33:20.995188 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v697h\" (UniqueName: \"kubernetes.io/projected/91a66cfe-1f78-4bc2-854d-11003680f6ca-kube-api-access-v697h\") pod \"nova-api-0\" (UID: \"91a66cfe-1f78-4bc2-854d-11003680f6ca\") " pod="openstack/nova-api-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.000386 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91a66cfe-1f78-4bc2-854d-11003680f6ca-logs\") pod \"nova-api-0\" (UID: \"91a66cfe-1f78-4bc2-854d-11003680f6ca\") " pod="openstack/nova-api-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.000890 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91a66cfe-1f78-4bc2-854d-11003680f6ca-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"91a66cfe-1f78-4bc2-854d-11003680f6ca\") " pod="openstack/nova-api-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.005539 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07f4cc75-4ffd-490a-914c-fb2b072db23b" path="/var/lib/kubelet/pods/07f4cc75-4ffd-490a-914c-fb2b072db23b/volumes" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.006638 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5984ffb3-b39a-47cc-bcc9-3f35fae00dcb" path="/var/lib/kubelet/pods/5984ffb3-b39a-47cc-bcc9-3f35fae00dcb/volumes" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.010695 4702 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea" path="/var/lib/kubelet/pods/5f1a2cc7-b3b5-4147-8cdd-e5c8820d5fea/volumes" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.017410 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91a66cfe-1f78-4bc2-854d-11003680f6ca-config-data\") pod \"nova-api-0\" (UID: \"91a66cfe-1f78-4bc2-854d-11003680f6ca\") " pod="openstack/nova-api-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.018520 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49564875-fe0f-45b7-b72f-f9583f4a80be-config-data\") pod \"nova-scheduler-0\" (UID: \"49564875-fe0f-45b7-b72f-f9583f4a80be\") " pod="openstack/nova-scheduler-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.024002 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v697h\" (UniqueName: \"kubernetes.io/projected/91a66cfe-1f78-4bc2-854d-11003680f6ca-kube-api-access-v697h\") pod \"nova-api-0\" (UID: \"91a66cfe-1f78-4bc2-854d-11003680f6ca\") " pod="openstack/nova-api-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.027668 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49564875-fe0f-45b7-b72f-f9583f4a80be-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"49564875-fe0f-45b7-b72f-f9583f4a80be\") " pod="openstack/nova-scheduler-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.030340 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltm95\" (UniqueName: \"kubernetes.io/projected/49564875-fe0f-45b7-b72f-f9583f4a80be-kube-api-access-ltm95\") pod \"nova-scheduler-0\" (UID: \"49564875-fe0f-45b7-b72f-f9583f4a80be\") " pod="openstack/nova-scheduler-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.073641 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fefa4548-587a-4fd5-b49a-727241aa7c24" (UID: "fefa4548-587a-4fd5-b49a-727241aa7c24"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.097391 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6120d0cd-f573-4f9c-af92-77580df0916c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6120d0cd-f573-4f9c-af92-77580df0916c\") " pod="openstack/nova-metadata-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.097455 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6120d0cd-f573-4f9c-af92-77580df0916c-config-data\") pod \"nova-metadata-0\" (UID: \"6120d0cd-f573-4f9c-af92-77580df0916c\") " pod="openstack/nova-metadata-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.097485 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhpvm\" (UniqueName: \"kubernetes.io/projected/6120d0cd-f573-4f9c-af92-77580df0916c-kube-api-access-qhpvm\") pod \"nova-metadata-0\" (UID: \"6120d0cd-f573-4f9c-af92-77580df0916c\") " pod="openstack/nova-metadata-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.097567 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6120d0cd-f573-4f9c-af92-77580df0916c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6120d0cd-f573-4f9c-af92-77580df0916c\") " pod="openstack/nova-metadata-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.097678 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6120d0cd-f573-4f9c-af92-77580df0916c-logs\") pod \"nova-metadata-0\" (UID: \"6120d0cd-f573-4f9c-af92-77580df0916c\") " pod="openstack/nova-metadata-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.097736 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.098160 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6120d0cd-f573-4f9c-af92-77580df0916c-logs\") pod \"nova-metadata-0\" (UID: \"6120d0cd-f573-4f9c-af92-77580df0916c\") " pod="openstack/nova-metadata-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.100048 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-config-data" (OuterVolumeSpecName: "config-data") pod "fefa4548-587a-4fd5-b49a-727241aa7c24" (UID: "fefa4548-587a-4fd5-b49a-727241aa7c24"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.102864 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6120d0cd-f573-4f9c-af92-77580df0916c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6120d0cd-f573-4f9c-af92-77580df0916c\") " pod="openstack/nova-metadata-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.103504 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6120d0cd-f573-4f9c-af92-77580df0916c-config-data\") pod \"nova-metadata-0\" (UID: \"6120d0cd-f573-4f9c-af92-77580df0916c\") " pod="openstack/nova-metadata-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.104616 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6120d0cd-f573-4f9c-af92-77580df0916c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6120d0cd-f573-4f9c-af92-77580df0916c\") " pod="openstack/nova-metadata-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.116060 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhpvm\" (UniqueName: \"kubernetes.io/projected/6120d0cd-f573-4f9c-af92-77580df0916c-kube-api-access-qhpvm\") pod \"nova-metadata-0\" (UID: \"6120d0cd-f573-4f9c-af92-77580df0916c\") " pod="openstack/nova-metadata-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.146461 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.168357 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.200545 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fefa4548-587a-4fd5-b49a-727241aa7c24-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.249543 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.318743 4702 scope.go:117] "RemoveContainer" containerID="4c330d989ea5b955240b297a185a991799d1a446b722d0749102bfd61ff6af67" Dec 03 11:33:21 crc kubenswrapper[4702]: E1203 11:33:21.319660 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c330d989ea5b955240b297a185a991799d1a446b722d0749102bfd61ff6af67\": container with ID starting with 4c330d989ea5b955240b297a185a991799d1a446b722d0749102bfd61ff6af67 not found: ID does not exist" containerID="4c330d989ea5b955240b297a185a991799d1a446b722d0749102bfd61ff6af67" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.319697 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c330d989ea5b955240b297a185a991799d1a446b722d0749102bfd61ff6af67"} err="failed to get container status \"4c330d989ea5b955240b297a185a991799d1a446b722d0749102bfd61ff6af67\": rpc error: code = NotFound desc = could not find container \"4c330d989ea5b955240b297a185a991799d1a446b722d0749102bfd61ff6af67\": container with ID starting with 4c330d989ea5b955240b297a185a991799d1a446b722d0749102bfd61ff6af67 not found: ID does not exist" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.319728 4702 scope.go:117] "RemoveContainer" containerID="5478c07571b53c73c5fe6bdf7d00f2cfd78d61d082db361ab7c6e614e8e00db9" Dec 03 11:33:21 crc kubenswrapper[4702]: E1203 11:33:21.320036 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5478c07571b53c73c5fe6bdf7d00f2cfd78d61d082db361ab7c6e614e8e00db9\": container with ID starting with 5478c07571b53c73c5fe6bdf7d00f2cfd78d61d082db361ab7c6e614e8e00db9 not found: ID does not exist" containerID="5478c07571b53c73c5fe6bdf7d00f2cfd78d61d082db361ab7c6e614e8e00db9" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.320069 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5478c07571b53c73c5fe6bdf7d00f2cfd78d61d082db361ab7c6e614e8e00db9"} err="failed to get container status \"5478c07571b53c73c5fe6bdf7d00f2cfd78d61d082db361ab7c6e614e8e00db9\": rpc error: code = NotFound desc = could not find container \"5478c07571b53c73c5fe6bdf7d00f2cfd78d61d082db361ab7c6e614e8e00db9\": container with ID starting with 5478c07571b53c73c5fe6bdf7d00f2cfd78d61d082db361ab7c6e614e8e00db9 not found: ID does not exist" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.320086 4702 scope.go:117] "RemoveContainer" containerID="4c330d989ea5b955240b297a185a991799d1a446b722d0749102bfd61ff6af67" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.320288 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c330d989ea5b955240b297a185a991799d1a446b722d0749102bfd61ff6af67"} err="failed to get container status \"4c330d989ea5b955240b297a185a991799d1a446b722d0749102bfd61ff6af67\": rpc error: code = NotFound desc = could not find container \"4c330d989ea5b955240b297a185a991799d1a446b722d0749102bfd61ff6af67\": container with ID starting with 4c330d989ea5b955240b297a185a991799d1a446b722d0749102bfd61ff6af67 not found: ID does not exist" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.320307 4702 scope.go:117] "RemoveContainer" containerID="5478c07571b53c73c5fe6bdf7d00f2cfd78d61d082db361ab7c6e614e8e00db9" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.320486 4702 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5478c07571b53c73c5fe6bdf7d00f2cfd78d61d082db361ab7c6e614e8e00db9"} err="failed to get container status \"5478c07571b53c73c5fe6bdf7d00f2cfd78d61d082db361ab7c6e614e8e00db9\": rpc error: code = NotFound desc = could not find container \"5478c07571b53c73c5fe6bdf7d00f2cfd78d61d082db361ab7c6e614e8e00db9\": container with ID starting with 5478c07571b53c73c5fe6bdf7d00f2cfd78d61d082db361ab7c6e614e8e00db9 not found: ID does not exist" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.386896 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b9d234e0-f641-485a-90aa-8440c5d00296","Type":"ContainerStarted","Data":"199722fdbf31157fb0e0a22b28d214f621138aa06d45a720cb0c6017826d6f3a"} Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.388544 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.395169 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.395201 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fefa4548-587a-4fd5-b49a-727241aa7c24","Type":"ContainerDied","Data":"7966f66f7bed970227eb6827adefcb630768830d991cc02d7abbec344ac201e0"} Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.405364 4702 scope.go:117] "RemoveContainer" containerID="dcf6785b83faf408fd535b940b429e547c1305962b06ff48f7d1da6665fca818" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.422123 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.42209281 podStartE2EDuration="3.42209281s" podCreationTimestamp="2025-12-03 11:33:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:33:21.409089129 +0000 UTC m=+1785.245017623" watchObservedRunningTime="2025-12-03 11:33:21.42209281 +0000 UTC m=+1785.258021294" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.578185 4702 scope.go:117] "RemoveContainer" containerID="0665a7640d7f6ac94d78200f0d1596a3c99e0d2eee6f6711b57d8d343371c74b" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.671375 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.697364 4702 scope.go:117] "RemoveContainer" containerID="8ce8524b216fff9e24b3ce10dab7a1c01ee118311442090011ae2771ac664f4b" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.702529 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.738522 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.750893 4702 util.go:30] "No sandbox for pod can be found. 
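
The "ContainerStatus from runtime service failed ... code = NotFound" errors just above are harmless races: the nova-metadata-0 containers were already removed, and the kubelet treats NotFound from the CRI runtime as "already gone". A Go sketch of that idempotent-delete pattern using the real google.golang.org/grpc/status and codes packages (removeContainer here is a stand-in that always fails NotFound, not an actual CRI call):

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // removeContainer stands in for a CRI RemoveContainer call; like the
    // races logged above, it reports the container as already missing.
    func removeContainer(id string) error {
        return status.Errorf(codes.NotFound, "could not find container %q", id)
    }

    // ensureRemoved treats NotFound as success: deleted now or already gone,
    // the end state is the same, so only other errors are surfaced.
    func ensureRemoved(id string) error {
        err := removeContainer(id)
        if err == nil {
            return nil
        }
        if s, ok := status.FromError(err); ok && s.Code() == codes.NotFound {
            return nil
        }
        return err
    }

    func main() {
        id := "4c330d989ea5b955240b297a185a991799d1a446b722d0749102bfd61ff6af67"
        if err := ensureRemoved(id); err != nil {
            fmt.Println("delete failed:", err)
            return
        }
        fmt.Println("container gone, either deleted now or already absent")
    }
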
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.754576 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.755297 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.803163 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.814066 4702 scope.go:117] "RemoveContainer" containerID="da2cf936030d0d275bb26af563c87689810ee45bb27e315e27cb64f9485a5345" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.847932 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-scripts\") pod \"ceilometer-0\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") " pod="openstack/ceilometer-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.848273 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") " pod="openstack/ceilometer-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.848476 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-config-data\") pod \"ceilometer-0\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") " pod="openstack/ceilometer-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.848666 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jdr5\" (UniqueName: \"kubernetes.io/projected/77610d2a-c3c8-4e57-8631-c56e39176cc7-kube-api-access-7jdr5\") pod \"ceilometer-0\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") " pod="openstack/ceilometer-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.848838 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") " pod="openstack/ceilometer-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.849519 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77610d2a-c3c8-4e57-8631-c56e39176cc7-log-httpd\") pod \"ceilometer-0\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") " pod="openstack/ceilometer-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.850988 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77610d2a-c3c8-4e57-8631-c56e39176cc7-run-httpd\") pod \"ceilometer-0\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") " pod="openstack/ceilometer-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.955140 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-config-data\") pod \"ceilometer-0\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") " pod="openstack/ceilometer-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.955218 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jdr5\" (UniqueName: \"kubernetes.io/projected/77610d2a-c3c8-4e57-8631-c56e39176cc7-kube-api-access-7jdr5\") pod \"ceilometer-0\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") " pod="openstack/ceilometer-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.957480 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") " pod="openstack/ceilometer-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.957584 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77610d2a-c3c8-4e57-8631-c56e39176cc7-log-httpd\") pod \"ceilometer-0\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") " pod="openstack/ceilometer-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.958069 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77610d2a-c3c8-4e57-8631-c56e39176cc7-run-httpd\") pod \"ceilometer-0\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") " pod="openstack/ceilometer-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.958264 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-scripts\") pod \"ceilometer-0\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") " pod="openstack/ceilometer-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.958306 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") " pod="openstack/ceilometer-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.959797 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77610d2a-c3c8-4e57-8631-c56e39176cc7-log-httpd\") pod \"ceilometer-0\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") " pod="openstack/ceilometer-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.965042 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-config-data\") pod \"ceilometer-0\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") " pod="openstack/ceilometer-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.965252 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") " pod="openstack/ceilometer-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.968082 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/77610d2a-c3c8-4e57-8631-c56e39176cc7-run-httpd\") pod \"ceilometer-0\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") " pod="openstack/ceilometer-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.970169 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") " pod="openstack/ceilometer-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.971466 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-scripts\") pod \"ceilometer-0\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") " pod="openstack/ceilometer-0" Dec 03 11:33:21 crc kubenswrapper[4702]: I1203 11:33:21.975493 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jdr5\" (UniqueName: \"kubernetes.io/projected/77610d2a-c3c8-4e57-8631-c56e39176cc7-kube-api-access-7jdr5\") pod \"ceilometer-0\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") " pod="openstack/ceilometer-0" Dec 03 11:33:22 crc kubenswrapper[4702]: I1203 11:33:22.118032 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:33:22 crc kubenswrapper[4702]: I1203 11:33:22.332862 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 11:33:22 crc kubenswrapper[4702]: W1203 11:33:22.368142 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6120d0cd_f573_4f9c_af92_77580df0916c.slice/crio-6916950e477fecf3db8ebfdce67d78c80246c6681772a6f1dbbbf88102636c34 WatchSource:0}: Error finding container 6916950e477fecf3db8ebfdce67d78c80246c6681772a6f1dbbbf88102636c34: Status 404 returned error can't find the container with id 6916950e477fecf3db8ebfdce67d78c80246c6681772a6f1dbbbf88102636c34 Dec 03 11:33:22 crc kubenswrapper[4702]: I1203 11:33:22.370489 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 11:33:22 crc kubenswrapper[4702]: I1203 11:33:22.394112 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 11:33:22 crc kubenswrapper[4702]: I1203 11:33:22.430022 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6120d0cd-f573-4f9c-af92-77580df0916c","Type":"ContainerStarted","Data":"6916950e477fecf3db8ebfdce67d78c80246c6681772a6f1dbbbf88102636c34"} Dec 03 11:33:22 crc kubenswrapper[4702]: I1203 11:33:22.452798 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6148de9b-9833-4042-ad7d-5e40b93369ea","Type":"ContainerStarted","Data":"66abc7f2627c67b2b75db9c1003426301da000aa5c77c32e254f0ecf3eddc1de"} Dec 03 11:33:22 crc kubenswrapper[4702]: I1203 11:33:22.452972 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6148de9b-9833-4042-ad7d-5e40b93369ea" containerName="aodh-api" containerID="cri-o://2d1833a0368e77342fa5f6686847efaa7708f9c2d38570bb1a7f9036e9c6a75d" gracePeriod=30 Dec 03 11:33:22 crc kubenswrapper[4702]: I1203 11:33:22.453016 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6148de9b-9833-4042-ad7d-5e40b93369ea" containerName="aodh-listener" 
containerID="cri-o://66abc7f2627c67b2b75db9c1003426301da000aa5c77c32e254f0ecf3eddc1de" gracePeriod=30
Dec 03 11:33:22 crc kubenswrapper[4702]: I1203 11:33:22.453034 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6148de9b-9833-4042-ad7d-5e40b93369ea" containerName="aodh-evaluator" containerID="cri-o://209908441ff64d788b841146150f0e1ed97b1c2aecebd4447c3e03f092dcbc46" gracePeriod=30
Dec 03 11:33:22 crc kubenswrapper[4702]: I1203 11:33:22.453043 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6148de9b-9833-4042-ad7d-5e40b93369ea" containerName="aodh-notifier" containerID="cri-o://d91e72e8f66c9bd3e87823654f464d7cf4409cb8f4d86bcf9adb2e3321b7c30b" gracePeriod=30
Dec 03 11:33:22 crc kubenswrapper[4702]: I1203 11:33:22.464667 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"49564875-fe0f-45b7-b72f-f9583f4a80be","Type":"ContainerStarted","Data":"2d0aa5a9ac732c1b0cbd1d05f82f8631e3ce0d2a728ddd5cdbcb8d69a07e652d"}
Dec 03 11:33:22 crc kubenswrapper[4702]: I1203 11:33:22.484612 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91a66cfe-1f78-4bc2-854d-11003680f6ca","Type":"ContainerStarted","Data":"2816f77ac620499039e217cb06956fa16c7864c9eda02bbc6d7b59cf14289a55"}
Dec 03 11:33:22 crc kubenswrapper[4702]: I1203 11:33:22.515789 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=3.074826465 podStartE2EDuration="12.515714828s" podCreationTimestamp="2025-12-03 11:33:10 +0000 UTC" firstStartedPulling="2025-12-03 11:33:12.00132904 +0000 UTC m=+1775.837257514" lastFinishedPulling="2025-12-03 11:33:21.442217413 +0000 UTC m=+1785.278145877" observedRunningTime="2025-12-03 11:33:22.488684698 +0000 UTC m=+1786.324613162" watchObservedRunningTime="2025-12-03 11:33:22.515714828 +0000 UTC m=+1786.351643292"
Dec 03 11:33:22 crc kubenswrapper[4702]: I1203 11:33:22.998593 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fefa4548-587a-4fd5-b49a-727241aa7c24" path="/var/lib/kubelet/pods/fefa4548-587a-4fd5-b49a-727241aa7c24/volumes"
Dec 03 11:33:23 crc kubenswrapper[4702]: I1203 11:33:23.033623 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 11:33:23 crc kubenswrapper[4702]: I1203 11:33:23.552745 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77610d2a-c3c8-4e57-8631-c56e39176cc7","Type":"ContainerStarted","Data":"365dff481f0f7685aaa3ff626ad5d39db168a512821a6720eb9a1592fb408219"}
Dec 03 11:33:23 crc kubenswrapper[4702]: I1203 11:33:23.558405 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91a66cfe-1f78-4bc2-854d-11003680f6ca","Type":"ContainerStarted","Data":"4e76021073dc4cde55830752f455be7fa7b9d5c261db44cce0d4efddd2d225bf"}
Dec 03 11:33:23 crc kubenswrapper[4702]: I1203 11:33:23.565504 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6120d0cd-f573-4f9c-af92-77580df0916c","Type":"ContainerStarted","Data":"8e3f3f4f83fc88492678fdf7759126e99646de8636e9965621cd1c930f76b3df"}
Dec 03 11:33:23 crc kubenswrapper[4702]: I1203 11:33:23.577532 4702 generic.go:334] "Generic (PLEG): container finished" podID="6148de9b-9833-4042-ad7d-5e40b93369ea" containerID="2d1833a0368e77342fa5f6686847efaa7708f9c2d38570bb1a7f9036e9c6a75d" exitCode=0
Dec 03 11:33:23 crc kubenswrapper[4702]: I1203 11:33:23.577682 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6148de9b-9833-4042-ad7d-5e40b93369ea","Type":"ContainerDied","Data":"2d1833a0368e77342fa5f6686847efaa7708f9c2d38570bb1a7f9036e9c6a75d"}
Dec 03 11:33:23 crc kubenswrapper[4702]: I1203 11:33:23.588201 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"49564875-fe0f-45b7-b72f-f9583f4a80be","Type":"ContainerStarted","Data":"ccb09db36dd0e53ec59db328ef4a9314eaf337ada7b0844c27bbd190046c42b5"}
Dec 03 11:33:23 crc kubenswrapper[4702]: I1203 11:33:23.624387 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.624362446 podStartE2EDuration="3.624362446s" podCreationTimestamp="2025-12-03 11:33:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:33:23.613313771 +0000 UTC m=+1787.449242235" watchObservedRunningTime="2025-12-03 11:33:23.624362446 +0000 UTC m=+1787.460290910"
Dec 03 11:33:24 crc kubenswrapper[4702]: I1203 11:33:24.473649 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 03 11:33:24 crc kubenswrapper[4702]: I1203 11:33:24.474551 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="d7e1497f-e194-429b-add6-ee8e886fed8b" containerName="kube-state-metrics" containerID="cri-o://9a23dba3e4129629423cd8076d76daa7f2d38ac156b44dd7bcb69f84f91ae987" gracePeriod=30
Dec 03 11:33:24 crc kubenswrapper[4702]: I1203 11:33:24.517331 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="d7e1497f-e194-429b-add6-ee8e886fed8b" containerName="kube-state-metrics" probeResult="failure" output="Get \"http://10.217.0.133:8081/readyz\": dial tcp 10.217.0.133:8081: connect: connection refused"
Dec 03 11:33:24 crc kubenswrapper[4702]: I1203 11:33:24.682821 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77610d2a-c3c8-4e57-8631-c56e39176cc7","Type":"ContainerStarted","Data":"bd15e94a3cba7557fd272305c022625d2ab1b922179db3403dfdb5e2aa806588"}
Dec 03 11:33:24 crc kubenswrapper[4702]: I1203 11:33:24.690366 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"]
Dec 03 11:33:24 crc kubenswrapper[4702]: I1203 11:33:24.690613 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="7b6e2ad9-6424-4857-8dc6-a67f7758151d" containerName="mysqld-exporter" containerID="cri-o://65e88a36ec60dd090abb29138f8855e6623630aecc65d8e73f2d3c3d6e0e0e09" gracePeriod=30
Dec 03 11:33:24 crc kubenswrapper[4702]: I1203 11:33:24.691528 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91a66cfe-1f78-4bc2-854d-11003680f6ca","Type":"ContainerStarted","Data":"908c6ae29e4601831cec06aa54d973978d1b4b264c09734f0ec062f753d8b827"}
Dec 03 11:33:24 crc kubenswrapper[4702]: I1203 11:33:24.695577 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6120d0cd-f573-4f9c-af92-77580df0916c","Type":"ContainerStarted","Data":"0a5f8c86f21f7382c9cfda399bb4e9ed6d6373d14edf307129a09f7ddfb05023"}
Dec 03 11:33:24 crc kubenswrapper[4702]: I1203 11:33:24.708117 4702 generic.go:334] "Generic (PLEG): container finished" podID="d7e1497f-e194-429b-add6-ee8e886fed8b" containerID="9a23dba3e4129629423cd8076d76daa7f2d38ac156b44dd7bcb69f84f91ae987" exitCode=2
Dec 03 11:33:24 crc kubenswrapper[4702]: I1203 11:33:24.708181 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d7e1497f-e194-429b-add6-ee8e886fed8b","Type":"ContainerDied","Data":"9a23dba3e4129629423cd8076d76daa7f2d38ac156b44dd7bcb69f84f91ae987"}
Dec 03 11:33:24 crc kubenswrapper[4702]: I1203 11:33:24.727837 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.727811035 podStartE2EDuration="4.727811035s" podCreationTimestamp="2025-12-03 11:33:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:33:24.720436305 +0000 UTC m=+1788.556364769" watchObservedRunningTime="2025-12-03 11:33:24.727811035 +0000 UTC m=+1788.563739499"
Dec 03 11:33:24 crc kubenswrapper[4702]: I1203 11:33:24.731870 4702 generic.go:334] "Generic (PLEG): container finished" podID="6148de9b-9833-4042-ad7d-5e40b93369ea" containerID="d91e72e8f66c9bd3e87823654f464d7cf4409cb8f4d86bcf9adb2e3321b7c30b" exitCode=0
Dec 03 11:33:24 crc kubenswrapper[4702]: I1203 11:33:24.732099 4702 generic.go:334] "Generic (PLEG): container finished" podID="6148de9b-9833-4042-ad7d-5e40b93369ea" containerID="209908441ff64d788b841146150f0e1ed97b1c2aecebd4447c3e03f092dcbc46" exitCode=0
Dec 03 11:33:24 crc kubenswrapper[4702]: I1203 11:33:24.733288 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6148de9b-9833-4042-ad7d-5e40b93369ea","Type":"ContainerDied","Data":"d91e72e8f66c9bd3e87823654f464d7cf4409cb8f4d86bcf9adb2e3321b7c30b"}
Dec 03 11:33:24 crc kubenswrapper[4702]: I1203 11:33:24.733372 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6148de9b-9833-4042-ad7d-5e40b93369ea","Type":"ContainerDied","Data":"209908441ff64d788b841146150f0e1ed97b1c2aecebd4447c3e03f092dcbc46"}
Dec 03 11:33:24 crc kubenswrapper[4702]: I1203 11:33:24.766443 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.766417555 podStartE2EDuration="4.766417555s" podCreationTimestamp="2025-12-03 11:33:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:33:24.740530928 +0000 UTC m=+1788.576459392" watchObservedRunningTime="2025-12-03 11:33:24.766417555 +0000 UTC m=+1788.602346019"
Dec 03 11:33:25 crc kubenswrapper[4702]: I1203 11:33:25.388923 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 03 11:33:25 crc kubenswrapper[4702]: I1203 11:33:25.587220 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h9k2\" (UniqueName: \"kubernetes.io/projected/d7e1497f-e194-429b-add6-ee8e886fed8b-kube-api-access-4h9k2\") pod \"d7e1497f-e194-429b-add6-ee8e886fed8b\" (UID: \"d7e1497f-e194-429b-add6-ee8e886fed8b\") "
Dec 03 11:33:25 crc kubenswrapper[4702]: I1203 11:33:25.602075 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7e1497f-e194-429b-add6-ee8e886fed8b-kube-api-access-4h9k2" (OuterVolumeSpecName: "kube-api-access-4h9k2") pod "d7e1497f-e194-429b-add6-ee8e886fed8b" (UID: "d7e1497f-e194-429b-add6-ee8e886fed8b"). InnerVolumeSpecName "kube-api-access-4h9k2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:33:25 crc kubenswrapper[4702]: I1203 11:33:25.690519 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h9k2\" (UniqueName: \"kubernetes.io/projected/d7e1497f-e194-429b-add6-ee8e886fed8b-kube-api-access-4h9k2\") on node \"crc\" DevicePath \"\""
Dec 03 11:33:25 crc kubenswrapper[4702]: I1203 11:33:25.758457 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Dec 03 11:33:25 crc kubenswrapper[4702]: I1203 11:33:25.784296 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77610d2a-c3c8-4e57-8631-c56e39176cc7","Type":"ContainerStarted","Data":"a301643852ec8c0f1f5ae92ce4402cd706498370c3646d5c6cb672540d2d2af8"}
Dec 03 11:33:25 crc kubenswrapper[4702]: I1203 11:33:25.827055 4702 generic.go:334] "Generic (PLEG): container finished" podID="7b6e2ad9-6424-4857-8dc6-a67f7758151d" containerID="65e88a36ec60dd090abb29138f8855e6623630aecc65d8e73f2d3c3d6e0e0e09" exitCode=2
Dec 03 11:33:25 crc kubenswrapper[4702]: I1203 11:33:25.827166 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"7b6e2ad9-6424-4857-8dc6-a67f7758151d","Type":"ContainerDied","Data":"65e88a36ec60dd090abb29138f8855e6623630aecc65d8e73f2d3c3d6e0e0e09"}
Dec 03 11:33:25 crc kubenswrapper[4702]: I1203 11:33:25.827202 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"7b6e2ad9-6424-4857-8dc6-a67f7758151d","Type":"ContainerDied","Data":"9fbe706efb965354fb4447398a77d679fdc2f5b4ae35dccea5ea498823352030"}
Dec 03 11:33:25 crc kubenswrapper[4702]: I1203 11:33:25.827242 4702 scope.go:117] "RemoveContainer" containerID="65e88a36ec60dd090abb29138f8855e6623630aecc65d8e73f2d3c3d6e0e0e09"
Dec 03 11:33:25 crc kubenswrapper[4702]: I1203 11:33:25.827477 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Dec 03 11:33:25 crc kubenswrapper[4702]: I1203 11:33:25.853167 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 03 11:33:25 crc kubenswrapper[4702]: I1203 11:33:25.857869 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d7e1497f-e194-429b-add6-ee8e886fed8b","Type":"ContainerDied","Data":"f93ae10486e3658f980820dffa535cc38d2156232b852825a26939702f9ce0dc"}
Dec 03 11:33:25 crc kubenswrapper[4702]: I1203 11:33:25.919126 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6e2ad9-6424-4857-8dc6-a67f7758151d-config-data\") pod \"7b6e2ad9-6424-4857-8dc6-a67f7758151d\" (UID: \"7b6e2ad9-6424-4857-8dc6-a67f7758151d\") "
Dec 03 11:33:25 crc kubenswrapper[4702]: I1203 11:33:25.919586 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd7jp\" (UniqueName: \"kubernetes.io/projected/7b6e2ad9-6424-4857-8dc6-a67f7758151d-kube-api-access-wd7jp\") pod \"7b6e2ad9-6424-4857-8dc6-a67f7758151d\" (UID: \"7b6e2ad9-6424-4857-8dc6-a67f7758151d\") "
Dec 03 11:33:25 crc kubenswrapper[4702]: I1203 11:33:25.919734 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6e2ad9-6424-4857-8dc6-a67f7758151d-combined-ca-bundle\") pod \"7b6e2ad9-6424-4857-8dc6-a67f7758151d\" (UID: \"7b6e2ad9-6424-4857-8dc6-a67f7758151d\") "
Dec 03 11:33:25 crc kubenswrapper[4702]: I1203 11:33:25.936986 4702 scope.go:117] "RemoveContainer" containerID="65e88a36ec60dd090abb29138f8855e6623630aecc65d8e73f2d3c3d6e0e0e09"
Dec 03 11:33:25 crc kubenswrapper[4702]: E1203 11:33:25.943192 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65e88a36ec60dd090abb29138f8855e6623630aecc65d8e73f2d3c3d6e0e0e09\": container with ID starting with 65e88a36ec60dd090abb29138f8855e6623630aecc65d8e73f2d3c3d6e0e0e09 not found: ID does not exist" containerID="65e88a36ec60dd090abb29138f8855e6623630aecc65d8e73f2d3c3d6e0e0e09"
Dec 03 11:33:25 crc kubenswrapper[4702]: I1203 11:33:25.943250 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65e88a36ec60dd090abb29138f8855e6623630aecc65d8e73f2d3c3d6e0e0e09"} err="failed to get container status \"65e88a36ec60dd090abb29138f8855e6623630aecc65d8e73f2d3c3d6e0e0e09\": rpc error: code = NotFound desc = could not find container \"65e88a36ec60dd090abb29138f8855e6623630aecc65d8e73f2d3c3d6e0e0e09\": container with ID starting with 65e88a36ec60dd090abb29138f8855e6623630aecc65d8e73f2d3c3d6e0e0e09 not found: ID does not exist"
Dec 03 11:33:25 crc kubenswrapper[4702]: I1203 11:33:25.943293 4702 scope.go:117] "RemoveContainer" containerID="9a23dba3e4129629423cd8076d76daa7f2d38ac156b44dd7bcb69f84f91ae987"
Dec 03 11:33:25 crc kubenswrapper[4702]: I1203 11:33:25.961869 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b6e2ad9-6424-4857-8dc6-a67f7758151d-kube-api-access-wd7jp" (OuterVolumeSpecName: "kube-api-access-wd7jp") pod "7b6e2ad9-6424-4857-8dc6-a67f7758151d" (UID: "7b6e2ad9-6424-4857-8dc6-a67f7758151d"). InnerVolumeSpecName "kube-api-access-wd7jp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:33:25 crc kubenswrapper[4702]: I1203 11:33:25.962257 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.008417 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.023641 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd7jp\" (UniqueName: \"kubernetes.io/projected/7b6e2ad9-6424-4857-8dc6-a67f7758151d-kube-api-access-wd7jp\") on node \"crc\" DevicePath \"\""
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.041192 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b6e2ad9-6424-4857-8dc6-a67f7758151d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b6e2ad9-6424-4857-8dc6-a67f7758151d" (UID: "7b6e2ad9-6424-4857-8dc6-a67f7758151d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.053313 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 03 11:33:26 crc kubenswrapper[4702]: E1203 11:33:26.054107 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7e1497f-e194-429b-add6-ee8e886fed8b" containerName="kube-state-metrics"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.054132 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7e1497f-e194-429b-add6-ee8e886fed8b" containerName="kube-state-metrics"
Dec 03 11:33:26 crc kubenswrapper[4702]: E1203 11:33:26.054172 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b6e2ad9-6424-4857-8dc6-a67f7758151d" containerName="mysqld-exporter"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.054180 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b6e2ad9-6424-4857-8dc6-a67f7758151d" containerName="mysqld-exporter"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.054482 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7e1497f-e194-429b-add6-ee8e886fed8b" containerName="kube-state-metrics"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.054502 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b6e2ad9-6424-4857-8dc6-a67f7758151d" containerName="mysqld-exporter"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.055722 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.061555 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.062029 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.062335 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b6e2ad9-6424-4857-8dc6-a67f7758151d-config-data" (OuterVolumeSpecName: "config-data") pod "7b6e2ad9-6424-4857-8dc6-a67f7758151d" (UID: "7b6e2ad9-6424-4857-8dc6-a67f7758151d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.080234 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.125656 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b6e2ad9-6424-4857-8dc6-a67f7758151d-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.126005 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b6e2ad9-6424-4857-8dc6-a67f7758151d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.169909 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.203734 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"]
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.227872 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/30059ea4-152f-420c-b8cc-234ebab96b47-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"30059ea4-152f-420c-b8cc-234ebab96b47\") " pod="openstack/kube-state-metrics-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.227957 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/30059ea4-152f-420c-b8cc-234ebab96b47-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"30059ea4-152f-420c-b8cc-234ebab96b47\") " pod="openstack/kube-state-metrics-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.228083 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30059ea4-152f-420c-b8cc-234ebab96b47-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"30059ea4-152f-420c-b8cc-234ebab96b47\") " pod="openstack/kube-state-metrics-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.228124 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v9zf\" (UniqueName: \"kubernetes.io/projected/30059ea4-152f-420c-b8cc-234ebab96b47-kube-api-access-5v9zf\") pod \"kube-state-metrics-0\" (UID: \"30059ea4-152f-420c-b8cc-234ebab96b47\") " pod="openstack/kube-state-metrics-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.236892 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"]
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.250931 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.251009 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.251541 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"]
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.255287 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.258527 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.266158 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"]
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.266417 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.334727 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/30059ea4-152f-420c-b8cc-234ebab96b47-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"30059ea4-152f-420c-b8cc-234ebab96b47\") " pod="openstack/kube-state-metrics-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.335062 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/30059ea4-152f-420c-b8cc-234ebab96b47-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"30059ea4-152f-420c-b8cc-234ebab96b47\") " pod="openstack/kube-state-metrics-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.338862 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30059ea4-152f-420c-b8cc-234ebab96b47-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"30059ea4-152f-420c-b8cc-234ebab96b47\") " pod="openstack/kube-state-metrics-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.338990 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v9zf\" (UniqueName: \"kubernetes.io/projected/30059ea4-152f-420c-b8cc-234ebab96b47-kube-api-access-5v9zf\") pod \"kube-state-metrics-0\" (UID: \"30059ea4-152f-420c-b8cc-234ebab96b47\") " pod="openstack/kube-state-metrics-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.339797 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/30059ea4-152f-420c-b8cc-234ebab96b47-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"30059ea4-152f-420c-b8cc-234ebab96b47\") " pod="openstack/kube-state-metrics-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.340740 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/30059ea4-152f-420c-b8cc-234ebab96b47-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"30059ea4-152f-420c-b8cc-234ebab96b47\") " pod="openstack/kube-state-metrics-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.354429 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30059ea4-152f-420c-b8cc-234ebab96b47-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"30059ea4-152f-420c-b8cc-234ebab96b47\") " pod="openstack/kube-state-metrics-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.367077 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v9zf\" (UniqueName: \"kubernetes.io/projected/30059ea4-152f-420c-b8cc-234ebab96b47-kube-api-access-5v9zf\") pod \"kube-state-metrics-0\" (UID: \"30059ea4-152f-420c-b8cc-234ebab96b47\") " pod="openstack/kube-state-metrics-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.390038 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.461247 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941c2f85-794d-4361-942f-1d264fb98b7d-config-data\") pod \"mysqld-exporter-0\" (UID: \"941c2f85-794d-4361-942f-1d264fb98b7d\") " pod="openstack/mysqld-exporter-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.461299 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8fpf\" (UniqueName: \"kubernetes.io/projected/941c2f85-794d-4361-942f-1d264fb98b7d-kube-api-access-j8fpf\") pod \"mysqld-exporter-0\" (UID: \"941c2f85-794d-4361-942f-1d264fb98b7d\") " pod="openstack/mysqld-exporter-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.461331 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/941c2f85-794d-4361-942f-1d264fb98b7d-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"941c2f85-794d-4361-942f-1d264fb98b7d\") " pod="openstack/mysqld-exporter-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.461416 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941c2f85-794d-4361-942f-1d264fb98b7d-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"941c2f85-794d-4361-942f-1d264fb98b7d\") " pod="openstack/mysqld-exporter-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.564438 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941c2f85-794d-4361-942f-1d264fb98b7d-config-data\") pod \"mysqld-exporter-0\" (UID: \"941c2f85-794d-4361-942f-1d264fb98b7d\") " pod="openstack/mysqld-exporter-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.564744 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8fpf\" (UniqueName: \"kubernetes.io/projected/941c2f85-794d-4361-942f-1d264fb98b7d-kube-api-access-j8fpf\") pod \"mysqld-exporter-0\" (UID: \"941c2f85-794d-4361-942f-1d264fb98b7d\") " pod="openstack/mysqld-exporter-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.564806 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/941c2f85-794d-4361-942f-1d264fb98b7d-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"941c2f85-794d-4361-942f-1d264fb98b7d\") " pod="openstack/mysqld-exporter-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.564911 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941c2f85-794d-4361-942f-1d264fb98b7d-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"941c2f85-794d-4361-942f-1d264fb98b7d\") " pod="openstack/mysqld-exporter-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.572531 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/941c2f85-794d-4361-942f-1d264fb98b7d-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"941c2f85-794d-4361-942f-1d264fb98b7d\") " pod="openstack/mysqld-exporter-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.578600 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941c2f85-794d-4361-942f-1d264fb98b7d-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"941c2f85-794d-4361-942f-1d264fb98b7d\") " pod="openstack/mysqld-exporter-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.591516 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941c2f85-794d-4361-942f-1d264fb98b7d-config-data\") pod \"mysqld-exporter-0\" (UID: \"941c2f85-794d-4361-942f-1d264fb98b7d\") " pod="openstack/mysqld-exporter-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.594090 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8fpf\" (UniqueName: \"kubernetes.io/projected/941c2f85-794d-4361-942f-1d264fb98b7d-kube-api-access-j8fpf\") pod \"mysqld-exporter-0\" (UID: \"941c2f85-794d-4361-942f-1d264fb98b7d\") " pod="openstack/mysqld-exporter-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.601122 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.883937 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77610d2a-c3c8-4e57-8631-c56e39176cc7","Type":"ContainerStarted","Data":"c0dfe92078ead359d24d0ef8006aa0c05a5784b60bbb3f612951b22c2e488593"}
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.957492 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b6e2ad9-6424-4857-8dc6-a67f7758151d" path="/var/lib/kubelet/pods/7b6e2ad9-6424-4857-8dc6-a67f7758151d/volumes"
Dec 03 11:33:26 crc kubenswrapper[4702]: I1203 11:33:26.958183 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7e1497f-e194-429b-add6-ee8e886fed8b" path="/var/lib/kubelet/pods/d7e1497f-e194-429b-add6-ee8e886fed8b/volumes"
Dec 03 11:33:27 crc kubenswrapper[4702]: I1203 11:33:27.018440 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 03 11:33:27 crc kubenswrapper[4702]: I1203 11:33:27.300645 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"]
Dec 03 11:33:27 crc kubenswrapper[4702]: W1203 11:33:27.309572 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod941c2f85_794d_4361_942f_1d264fb98b7d.slice/crio-ab36a7142552fc75c89f529505b133be97c2378d013517d704bdc62dd2742137 WatchSource:0}: Error finding container ab36a7142552fc75c89f529505b133be97c2378d013517d704bdc62dd2742137: Status 404 returned error can't find the container with id ab36a7142552fc75c89f529505b133be97c2378d013517d704bdc62dd2742137
Dec 03 11:33:27 crc kubenswrapper[4702]: I1203 11:33:27.932679 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"941c2f85-794d-4361-942f-1d264fb98b7d","Type":"ContainerStarted","Data":"ab36a7142552fc75c89f529505b133be97c2378d013517d704bdc62dd2742137"}
Dec 03 11:33:27 crc kubenswrapper[4702]: I1203 11:33:27.942797 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77610d2a-c3c8-4e57-8631-c56e39176cc7","Type":"ContainerStarted","Data":"8a1b96c750ed6a61e56256c7c34da4bbee68db91044150676a65861a6b502e1b"}
Dec 03 11:33:27 crc kubenswrapper[4702]: I1203 11:33:27.944709 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 03 11:33:27 crc kubenswrapper[4702]: I1203 11:33:27.951952 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"30059ea4-152f-420c-b8cc-234ebab96b47","Type":"ContainerStarted","Data":"cfa8941d3a69395a9cc1216ecb78f13f25c59fdf8000e25c72f011713b129335"}
Dec 03 11:33:27 crc kubenswrapper[4702]: I1203 11:33:27.952070 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Dec 03 11:33:27 crc kubenswrapper[4702]: I1203 11:33:27.952088 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"30059ea4-152f-420c-b8cc-234ebab96b47","Type":"ContainerStarted","Data":"5abc0b8093cff300a1aa43096fa28f67bb4ca36195eb52d6fb6fb2aea620d65b"}
Dec 03 11:33:28 crc kubenswrapper[4702]: I1203 11:33:28.061476 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.808739978 podStartE2EDuration="7.061440516s" podCreationTimestamp="2025-12-03 11:33:21 +0000 UTC" firstStartedPulling="2025-12-03 11:33:23.020031873 +0000 UTC m=+1786.855960337" lastFinishedPulling="2025-12-03 11:33:27.272732411 +0000 UTC m=+1791.108660875" observedRunningTime="2025-12-03 11:33:28.030241687 +0000 UTC m=+1791.866170171" watchObservedRunningTime="2025-12-03 11:33:28.061440516 +0000 UTC m=+1791.897368980"
Dec 03 11:33:28 crc kubenswrapper[4702]: I1203 11:33:28.072400 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.494372135 podStartE2EDuration="3.072377188s" podCreationTimestamp="2025-12-03 11:33:25 +0000 UTC" firstStartedPulling="2025-12-03 11:33:27.02200361 +0000 UTC m=+1790.857932074" lastFinishedPulling="2025-12-03 11:33:27.600008663 +0000 UTC m=+1791.435937127" observedRunningTime="2025-12-03 11:33:28.059707277 +0000 UTC m=+1791.895635741" watchObservedRunningTime="2025-12-03 11:33:28.072377188 +0000 UTC m=+1791.908305652"
Dec 03 11:33:28 crc kubenswrapper[4702]: I1203 11:33:28.530737 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 11:33:29 crc kubenswrapper[4702]: I1203 11:33:29.400097 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Dec 03 11:33:29 crc kubenswrapper[4702]: I1203 11:33:29.984137 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"941c2f85-794d-4361-942f-1d264fb98b7d","Type":"ContainerStarted","Data":"338794213f2a4d3999f6f9d6ba69d8fa6fa608e41ca6596a3008d33695eb8a3b"}
Dec 03 11:33:29 crc kubenswrapper[4702]: I1203 11:33:29.984461 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77610d2a-c3c8-4e57-8631-c56e39176cc7" containerName="sg-core" containerID="cri-o://c0dfe92078ead359d24d0ef8006aa0c05a5784b60bbb3f612951b22c2e488593" gracePeriod=30
Dec 03 11:33:29 crc kubenswrapper[4702]: I1203 11:33:29.984523 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77610d2a-c3c8-4e57-8631-c56e39176cc7" containerName="ceilometer-notification-agent" containerID="cri-o://a301643852ec8c0f1f5ae92ce4402cd706498370c3646d5c6cb672540d2d2af8" gracePeriod=30
Dec 03 11:33:29 crc kubenswrapper[4702]: I1203 11:33:29.984461 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77610d2a-c3c8-4e57-8631-c56e39176cc7" containerName="proxy-httpd" containerID="cri-o://8a1b96c750ed6a61e56256c7c34da4bbee68db91044150676a65861a6b502e1b" gracePeriod=30
Dec 03 11:33:29 crc kubenswrapper[4702]: I1203 11:33:29.984453 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="77610d2a-c3c8-4e57-8631-c56e39176cc7" containerName="ceilometer-central-agent" containerID="cri-o://bd15e94a3cba7557fd272305c022625d2ab1b922179db3403dfdb5e2aa806588" gracePeriod=30
Dec 03 11:33:30 crc kubenswrapper[4702]: I1203 11:33:30.014241 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=3.222235588 podStartE2EDuration="4.014215076s" podCreationTimestamp="2025-12-03 11:33:26 +0000 UTC" firstStartedPulling="2025-12-03 11:33:27.317877957 +0000 UTC m=+1791.153806421" lastFinishedPulling="2025-12-03 11:33:28.109857445 +0000 UTC m=+1791.945785909" observedRunningTime="2025-12-03 11:33:30.011381505 +0000 UTC m=+1793.847309969" watchObservedRunningTime="2025-12-03 11:33:30.014215076 +0000 UTC m=+1793.850143540"
Dec 03 11:33:31 crc kubenswrapper[4702]: I1203 11:33:31.003886 4702 generic.go:334] "Generic (PLEG): container finished" podID="77610d2a-c3c8-4e57-8631-c56e39176cc7" containerID="8a1b96c750ed6a61e56256c7c34da4bbee68db91044150676a65861a6b502e1b" exitCode=0
Dec 03 11:33:31 crc kubenswrapper[4702]: I1203 11:33:31.005098 4702 generic.go:334] "Generic (PLEG): container finished" podID="77610d2a-c3c8-4e57-8631-c56e39176cc7" containerID="c0dfe92078ead359d24d0ef8006aa0c05a5784b60bbb3f612951b22c2e488593" exitCode=2
Dec 03 11:33:31 crc kubenswrapper[4702]: I1203 11:33:31.005198 4702 generic.go:334] "Generic (PLEG): container finished" podID="77610d2a-c3c8-4e57-8631-c56e39176cc7" containerID="a301643852ec8c0f1f5ae92ce4402cd706498370c3646d5c6cb672540d2d2af8" exitCode=0
Dec 03 11:33:31 crc kubenswrapper[4702]: I1203 11:33:31.003990 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77610d2a-c3c8-4e57-8631-c56e39176cc7","Type":"ContainerDied","Data":"8a1b96c750ed6a61e56256c7c34da4bbee68db91044150676a65861a6b502e1b"}
Dec 03 11:33:31 crc kubenswrapper[4702]: I1203 11:33:31.005962 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77610d2a-c3c8-4e57-8631-c56e39176cc7","Type":"ContainerDied","Data":"c0dfe92078ead359d24d0ef8006aa0c05a5784b60bbb3f612951b22c2e488593"}
Dec 03 11:33:31 crc kubenswrapper[4702]: I1203 11:33:31.005998 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77610d2a-c3c8-4e57-8631-c56e39176cc7","Type":"ContainerDied","Data":"a301643852ec8c0f1f5ae92ce4402cd706498370c3646d5c6cb672540d2d2af8"}
Dec 03 11:33:31 crc kubenswrapper[4702]: I1203 11:33:31.146831 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 03 11:33:31 crc kubenswrapper[4702]: I1203 11:33:31.147225 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 03 11:33:31 crc kubenswrapper[4702]: I1203 11:33:31.169775 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Dec 03 11:33:31 crc kubenswrapper[4702]: I1203 11:33:31.207172 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Dec 03 11:33:31 crc kubenswrapper[4702]: I1203 11:33:31.251311 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 03 11:33:31 crc kubenswrapper[4702]: I1203 11:33:31.251386 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 03 11:33:32 crc kubenswrapper[4702]: I1203 11:33:32.067546 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Dec 03 11:33:32 crc kubenswrapper[4702]: I1203 11:33:32.229013 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="91a66cfe-1f78-4bc2-854d-11003680f6ca" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.247:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 11:33:32 crc kubenswrapper[4702]: I1203 11:33:32.229175 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="91a66cfe-1f78-4bc2-854d-11003680f6ca" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.247:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 11:33:32 crc kubenswrapper[4702]: I1203 11:33:32.265996 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6120d0cd-f573-4f9c-af92-77580df0916c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.249:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 03 11:33:32 crc kubenswrapper[4702]: I1203 11:33:32.265996 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="6120d0cd-f573-4f9c-af92-77580df0916c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.249:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 03 11:33:33 crc kubenswrapper[4702]: I1203 11:33:33.053328 4702 generic.go:334] "Generic (PLEG): container finished" podID="77610d2a-c3c8-4e57-8631-c56e39176cc7" containerID="bd15e94a3cba7557fd272305c022625d2ab1b922179db3403dfdb5e2aa806588" exitCode=0
Dec 03 11:33:33 crc kubenswrapper[4702]: I1203 11:33:33.054594 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77610d2a-c3c8-4e57-8631-c56e39176cc7","Type":"ContainerDied","Data":"bd15e94a3cba7557fd272305c022625d2ab1b922179db3403dfdb5e2aa806588"}
Dec 03 11:33:33 crc kubenswrapper[4702]: I1203 11:33:33.659714 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 11:33:33 crc kubenswrapper[4702]: I1203 11:33:33.846419 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-sg-core-conf-yaml\") pod \"77610d2a-c3c8-4e57-8631-c56e39176cc7\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") "
Dec 03 11:33:33 crc kubenswrapper[4702]: I1203 11:33:33.847014 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77610d2a-c3c8-4e57-8631-c56e39176cc7-log-httpd\") pod \"77610d2a-c3c8-4e57-8631-c56e39176cc7\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") "
Dec 03 11:33:33 crc kubenswrapper[4702]: I1203 11:33:33.847104 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jdr5\" (UniqueName: \"kubernetes.io/projected/77610d2a-c3c8-4e57-8631-c56e39176cc7-kube-api-access-7jdr5\") pod \"77610d2a-c3c8-4e57-8631-c56e39176cc7\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") "
Dec 03 11:33:33 crc kubenswrapper[4702]: I1203 11:33:33.847196 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-scripts\") pod \"77610d2a-c3c8-4e57-8631-c56e39176cc7\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") "
Dec 03 11:33:33 crc kubenswrapper[4702]: I1203 11:33:33.847329 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-config-data\") pod \"77610d2a-c3c8-4e57-8631-c56e39176cc7\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") "
Dec 03 11:33:33 crc kubenswrapper[4702]: I1203 11:33:33.847425 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-combined-ca-bundle\") pod \"77610d2a-c3c8-4e57-8631-c56e39176cc7\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") "
Dec 03 11:33:33 crc kubenswrapper[4702]: I1203 11:33:33.847510 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77610d2a-c3c8-4e57-8631-c56e39176cc7-run-httpd\") pod \"77610d2a-c3c8-4e57-8631-c56e39176cc7\" (UID: \"77610d2a-c3c8-4e57-8631-c56e39176cc7\") "
Dec 03 11:33:33 crc kubenswrapper[4702]: I1203 11:33:33.848631 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77610d2a-c3c8-4e57-8631-c56e39176cc7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "77610d2a-c3c8-4e57-8631-c56e39176cc7" (UID: "77610d2a-c3c8-4e57-8631-c56e39176cc7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 11:33:33 crc kubenswrapper[4702]: I1203 11:33:33.848681 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77610d2a-c3c8-4e57-8631-c56e39176cc7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "77610d2a-c3c8-4e57-8631-c56e39176cc7" (UID: "77610d2a-c3c8-4e57-8631-c56e39176cc7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 11:33:33 crc kubenswrapper[4702]: I1203 11:33:33.861811 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77610d2a-c3c8-4e57-8631-c56e39176cc7-kube-api-access-7jdr5" (OuterVolumeSpecName: "kube-api-access-7jdr5") pod "77610d2a-c3c8-4e57-8631-c56e39176cc7" (UID: "77610d2a-c3c8-4e57-8631-c56e39176cc7"). InnerVolumeSpecName "kube-api-access-7jdr5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:33:33 crc kubenswrapper[4702]: I1203 11:33:33.868006 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-scripts" (OuterVolumeSpecName: "scripts") pod "77610d2a-c3c8-4e57-8631-c56e39176cc7" (UID: "77610d2a-c3c8-4e57-8631-c56e39176cc7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:33:33 crc kubenswrapper[4702]: I1203 11:33:33.893876 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "77610d2a-c3c8-4e57-8631-c56e39176cc7" (UID: "77610d2a-c3c8-4e57-8631-c56e39176cc7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:33:33 crc kubenswrapper[4702]: I1203 11:33:33.951187 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 11:33:33 crc kubenswrapper[4702]: I1203 11:33:33.951228 4702 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77610d2a-c3c8-4e57-8631-c56e39176cc7-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 03 11:33:33 crc kubenswrapper[4702]: I1203 11:33:33.951243 4702 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 03 11:33:33 crc kubenswrapper[4702]: I1203 11:33:33.951257 4702 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77610d2a-c3c8-4e57-8631-c56e39176cc7-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 03 11:33:33 crc kubenswrapper[4702]: I1203 11:33:33.951269 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jdr5\" (UniqueName: \"kubernetes.io/projected/77610d2a-c3c8-4e57-8631-c56e39176cc7-kube-api-access-7jdr5\") on node \"crc\" DevicePath \"\""
Dec 03 11:33:33 crc kubenswrapper[4702]: I1203 11:33:33.973489 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "77610d2a-c3c8-4e57-8631-c56e39176cc7" (UID: "77610d2a-c3c8-4e57-8631-c56e39176cc7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.035903 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-config-data" (OuterVolumeSpecName: "config-data") pod "77610d2a-c3c8-4e57-8631-c56e39176cc7" (UID: "77610d2a-c3c8-4e57-8631-c56e39176cc7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.054266 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.054299 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77610d2a-c3c8-4e57-8631-c56e39176cc7-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.081429 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"77610d2a-c3c8-4e57-8631-c56e39176cc7","Type":"ContainerDied","Data":"365dff481f0f7685aaa3ff626ad5d39db168a512821a6720eb9a1592fb408219"}
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.081524 4702 scope.go:117] "RemoveContainer" containerID="8a1b96c750ed6a61e56256c7c34da4bbee68db91044150676a65861a6b502e1b"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.081815 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.185572 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.217847 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.240236 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 03 11:33:34 crc kubenswrapper[4702]: E1203 11:33:34.240944 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77610d2a-c3c8-4e57-8631-c56e39176cc7" containerName="sg-core"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.240967 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="77610d2a-c3c8-4e57-8631-c56e39176cc7" containerName="sg-core"
Dec 03 11:33:34 crc kubenswrapper[4702]: E1203 11:33:34.240985 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77610d2a-c3c8-4e57-8631-c56e39176cc7" containerName="proxy-httpd"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.240991 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="77610d2a-c3c8-4e57-8631-c56e39176cc7" containerName="proxy-httpd"
Dec 03 11:33:34 crc kubenswrapper[4702]: E1203 11:33:34.241010 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77610d2a-c3c8-4e57-8631-c56e39176cc7" containerName="ceilometer-notification-agent"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.241017 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="77610d2a-c3c8-4e57-8631-c56e39176cc7" containerName="ceilometer-notification-agent"
Dec 03 11:33:34 crc kubenswrapper[4702]: E1203 11:33:34.241039 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77610d2a-c3c8-4e57-8631-c56e39176cc7" containerName="ceilometer-central-agent"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.241045 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="77610d2a-c3c8-4e57-8631-c56e39176cc7" containerName="ceilometer-central-agent"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.241346 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="77610d2a-c3c8-4e57-8631-c56e39176cc7" containerName="proxy-httpd"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.241364 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="77610d2a-c3c8-4e57-8631-c56e39176cc7" containerName="sg-core"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.241380 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="77610d2a-c3c8-4e57-8631-c56e39176cc7" containerName="ceilometer-central-agent"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.241396 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="77610d2a-c3c8-4e57-8631-c56e39176cc7" containerName="ceilometer-notification-agent"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.244090 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.247312 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.313028 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-scripts\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.313544 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-config-data\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.313629 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.313659 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k566r\" (UniqueName: \"kubernetes.io/projected/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-kube-api-access-k566r\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.313877 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-run-httpd\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.313926 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-log-httpd\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.314200 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.314369 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.314411 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.315737 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.316017 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.335971 4702 scope.go:117] "RemoveContainer" containerID="c0dfe92078ead359d24d0ef8006aa0c05a5784b60bbb3f612951b22c2e488593"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.368456 4702 scope.go:117] "RemoveContainer" containerID="a301643852ec8c0f1f5ae92ce4402cd706498370c3646d5c6cb672540d2d2af8"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.416688 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-scripts\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.416811 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-config-data\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.416889 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.416922 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k566r\" (UniqueName: \"kubernetes.io/projected/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-kube-api-access-k566r\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.417019 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-run-httpd\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.417049 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-log-httpd\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.417126 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.417200 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.418083 4702 scope.go:117] "RemoveContainer" containerID="bd15e94a3cba7557fd272305c022625d2ab1b922179db3403dfdb5e2aa806588"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.419230 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-run-httpd\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.420683 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-log-httpd\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.435177 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.435188 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-config-data\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.436296 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-scripts\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.438662 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k566r\" (UniqueName: \"kubernetes.io/projected/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-kube-api-access-k566r\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.441880 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.444441 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.637853 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.928466 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03"
Dec 03 11:33:34 crc kubenswrapper[4702]: E1203 11:33:34.928816 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
Dec 03 11:33:34 crc kubenswrapper[4702]: I1203 11:33:34.944892 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77610d2a-c3c8-4e57-8631-c56e39176cc7" path="/var/lib/kubelet/pods/77610d2a-c3c8-4e57-8631-c56e39176cc7/volumes"
Dec 03 11:33:35 crc kubenswrapper[4702]: W1203 11:33:35.194883 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42c1ddbc_9b3d_4310_a6ec_dd5416411e40.slice/crio-9f5d99c55ae69153ca9f8897f7da6d040292d02e93865626259ffa85941ceb90 WatchSource:0}: Error finding container 9f5d99c55ae69153ca9f8897f7da6d040292d02e93865626259ffa85941ceb90: Status 404 returned error can't find the container with id 9f5d99c55ae69153ca9f8897f7da6d040292d02e93865626259ffa85941ceb90
Dec 03 11:33:35 crc kubenswrapper[4702]: I1203 11:33:35.203020 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 11:33:36 crc kubenswrapper[4702]: I1203 11:33:36.136662 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c1ddbc-9b3d-4310-a6ec-dd5416411e40","Type":"ContainerStarted","Data":"910080b47494f0dd52d11b09a3a5070f65dd603a4d6c20720f0e4ff979809a87"}
Dec 03 11:33:36 crc kubenswrapper[4702]: I1203 11:33:36.137053 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c1ddbc-9b3d-4310-a6ec-dd5416411e40","Type":"ContainerStarted","Data":"9f5d99c55ae69153ca9f8897f7da6d040292d02e93865626259ffa85941ceb90"}
Dec 03 11:33:36 crc kubenswrapper[4702]: I1203 11:33:36.407284 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Dec 03 11:33:37 crc kubenswrapper[4702]: I1203 11:33:37.170597 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c1ddbc-9b3d-4310-a6ec-dd5416411e40","Type":"ContainerStarted","Data":"7a9101dcea969ca73bc30f513e5a357f4fba936c1db39427c58951d34f7c4c62"}
Dec 03 11:33:38 crc kubenswrapper[4702]: I1203 11:33:38.185495 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c1ddbc-9b3d-4310-a6ec-dd5416411e40","Type":"ContainerStarted","Data":"572d9abc35efa525a77679300431c9690c187dee8d13c0bcd03970e95c31f82d"}
Dec 03 11:33:39 crc kubenswrapper[4702]: I1203 11:33:39.203436 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c1ddbc-9b3d-4310-a6ec-dd5416411e40","Type":"ContainerStarted","Data":"d1efc8168a5b99e4dd80ffa460d3781d6ed6e22d497ba17e38c2d4361557a7db"}
Dec 03 11:33:39 crc kubenswrapper[4702]: I1203 11:33:39.204284 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 03 11:33:39 crc kubenswrapper[4702]: I1203 11:33:39.238411 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.711920533 podStartE2EDuration="5.238383336s" podCreationTimestamp="2025-12-03 11:33:34 +0000 UTC" firstStartedPulling="2025-12-03 11:33:35.200197917 +0000 UTC m=+1799.036126381" lastFinishedPulling="2025-12-03 11:33:38.72666072 +0000 UTC m=+1802.562589184" observedRunningTime="2025-12-03 11:33:39.227965689 +0000 UTC m=+1803.063894153" watchObservedRunningTime="2025-12-03 11:33:39.238383336 +0000 UTC m=+1803.074311810"
Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.110644 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.219239 4702 generic.go:334] "Generic (PLEG): container finished" podID="f9c35933-cc2c-459b-a47f-90397aa62811" containerID="701501a6c82d46f881831fd896571879ad22f149e70585221ef2ebb51b3853b6" exitCode=137
Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.219322 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.219324 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f9c35933-cc2c-459b-a47f-90397aa62811","Type":"ContainerDied","Data":"701501a6c82d46f881831fd896571879ad22f149e70585221ef2ebb51b3853b6"}
Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.219519 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f9c35933-cc2c-459b-a47f-90397aa62811","Type":"ContainerDied","Data":"0346a8993c1a8581c0cff99c353599482bc262307bf9c5588294ea1f8c7a97d2"}
Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.219566 4702 scope.go:117] "RemoveContainer" containerID="701501a6c82d46f881831fd896571879ad22f149e70585221ef2ebb51b3853b6"
Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.244458 4702 scope.go:117] "RemoveContainer" containerID="701501a6c82d46f881831fd896571879ad22f149e70585221ef2ebb51b3853b6"
Dec 03 11:33:40 crc kubenswrapper[4702]: E1203 11:33:40.244937 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"701501a6c82d46f881831fd896571879ad22f149e70585221ef2ebb51b3853b6\": container with ID starting with 701501a6c82d46f881831fd896571879ad22f149e70585221ef2ebb51b3853b6 not found: ID does not exist" containerID="701501a6c82d46f881831fd896571879ad22f149e70585221ef2ebb51b3853b6"
Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.244972 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"701501a6c82d46f881831fd896571879ad22f149e70585221ef2ebb51b3853b6"} err="failed to get container status \"701501a6c82d46f881831fd896571879ad22f149e70585221ef2ebb51b3853b6\": rpc error: code = NotFound desc = could not find container \"701501a6c82d46f881831fd896571879ad22f149e70585221ef2ebb51b3853b6\": container with ID starting with 701501a6c82d46f881831fd896571879ad22f149e70585221ef2ebb51b3853b6 not found: ID does not exist"
Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.288939 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/f9c35933-cc2c-459b-a47f-90397aa62811-combined-ca-bundle\") pod \"f9c35933-cc2c-459b-a47f-90397aa62811\" (UID: \"f9c35933-cc2c-459b-a47f-90397aa62811\") " Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.289165 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np2fl\" (UniqueName: \"kubernetes.io/projected/f9c35933-cc2c-459b-a47f-90397aa62811-kube-api-access-np2fl\") pod \"f9c35933-cc2c-459b-a47f-90397aa62811\" (UID: \"f9c35933-cc2c-459b-a47f-90397aa62811\") " Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.289229 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c35933-cc2c-459b-a47f-90397aa62811-config-data\") pod \"f9c35933-cc2c-459b-a47f-90397aa62811\" (UID: \"f9c35933-cc2c-459b-a47f-90397aa62811\") " Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.295929 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9c35933-cc2c-459b-a47f-90397aa62811-kube-api-access-np2fl" (OuterVolumeSpecName: "kube-api-access-np2fl") pod "f9c35933-cc2c-459b-a47f-90397aa62811" (UID: "f9c35933-cc2c-459b-a47f-90397aa62811"). InnerVolumeSpecName "kube-api-access-np2fl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.322570 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c35933-cc2c-459b-a47f-90397aa62811-config-data" (OuterVolumeSpecName: "config-data") pod "f9c35933-cc2c-459b-a47f-90397aa62811" (UID: "f9c35933-cc2c-459b-a47f-90397aa62811"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.332354 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c35933-cc2c-459b-a47f-90397aa62811-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9c35933-cc2c-459b-a47f-90397aa62811" (UID: "f9c35933-cc2c-459b-a47f-90397aa62811"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.393342 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c35933-cc2c-459b-a47f-90397aa62811-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.393388 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np2fl\" (UniqueName: \"kubernetes.io/projected/f9c35933-cc2c-459b-a47f-90397aa62811-kube-api-access-np2fl\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.393404 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c35933-cc2c-459b-a47f-90397aa62811-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.562646 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.577608 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.606596 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 11:33:40 crc kubenswrapper[4702]: E1203 11:33:40.607816 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c35933-cc2c-459b-a47f-90397aa62811" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.607867 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c35933-cc2c-459b-a47f-90397aa62811" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.612306 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9c35933-cc2c-459b-a47f-90397aa62811" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.614458 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.626179 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.626567 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.626680 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.644022 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.712164 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee5b7802-3239-45f9-9b0b-99348615d8bd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ee5b7802-3239-45f9-9b0b-99348615d8bd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.712448 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee5b7802-3239-45f9-9b0b-99348615d8bd-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ee5b7802-3239-45f9-9b0b-99348615d8bd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.712595 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tr92\" (UniqueName: \"kubernetes.io/projected/ee5b7802-3239-45f9-9b0b-99348615d8bd-kube-api-access-2tr92\") pod \"nova-cell1-novncproxy-0\" (UID: \"ee5b7802-3239-45f9-9b0b-99348615d8bd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.712660 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee5b7802-3239-45f9-9b0b-99348615d8bd-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ee5b7802-3239-45f9-9b0b-99348615d8bd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.712698 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5b7802-3239-45f9-9b0b-99348615d8bd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ee5b7802-3239-45f9-9b0b-99348615d8bd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.814952 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee5b7802-3239-45f9-9b0b-99348615d8bd-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ee5b7802-3239-45f9-9b0b-99348615d8bd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.815011 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5b7802-3239-45f9-9b0b-99348615d8bd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ee5b7802-3239-45f9-9b0b-99348615d8bd\") " 
pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.815158 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee5b7802-3239-45f9-9b0b-99348615d8bd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ee5b7802-3239-45f9-9b0b-99348615d8bd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.815269 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee5b7802-3239-45f9-9b0b-99348615d8bd-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ee5b7802-3239-45f9-9b0b-99348615d8bd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.815364 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tr92\" (UniqueName: \"kubernetes.io/projected/ee5b7802-3239-45f9-9b0b-99348615d8bd-kube-api-access-2tr92\") pod \"nova-cell1-novncproxy-0\" (UID: \"ee5b7802-3239-45f9-9b0b-99348615d8bd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.821138 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee5b7802-3239-45f9-9b0b-99348615d8bd-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ee5b7802-3239-45f9-9b0b-99348615d8bd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.822095 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee5b7802-3239-45f9-9b0b-99348615d8bd-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ee5b7802-3239-45f9-9b0b-99348615d8bd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.823157 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee5b7802-3239-45f9-9b0b-99348615d8bd-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ee5b7802-3239-45f9-9b0b-99348615d8bd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.834729 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee5b7802-3239-45f9-9b0b-99348615d8bd-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ee5b7802-3239-45f9-9b0b-99348615d8bd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.840015 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tr92\" (UniqueName: \"kubernetes.io/projected/ee5b7802-3239-45f9-9b0b-99348615d8bd-kube-api-access-2tr92\") pod \"nova-cell1-novncproxy-0\" (UID: \"ee5b7802-3239-45f9-9b0b-99348615d8bd\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.942583 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9c35933-cc2c-459b-a47f-90397aa62811" path="/var/lib/kubelet/pods/f9c35933-cc2c-459b-a47f-90397aa62811/volumes" Dec 03 11:33:40 crc kubenswrapper[4702]: I1203 11:33:40.964200 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.158043 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.159350 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.159398 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.166193 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.240222 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.243809 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.257226 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.259222 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.276493 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.489690 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv"] Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.492254 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.521987 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv"] Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.555915 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.637927 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-config\") pod \"dnsmasq-dns-6b7bbf7cf9-5lrzv\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.638693 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-5lrzv\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.638939 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-5lrzv\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.639275 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw9bj\" (UniqueName: \"kubernetes.io/projected/da604ff7-8464-439e-aa94-29102f336add-kube-api-access-hw9bj\") pod \"dnsmasq-dns-6b7bbf7cf9-5lrzv\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.639442 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-5lrzv\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.639575 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-5lrzv\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.742703 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw9bj\" (UniqueName: \"kubernetes.io/projected/da604ff7-8464-439e-aa94-29102f336add-kube-api-access-hw9bj\") pod \"dnsmasq-dns-6b7bbf7cf9-5lrzv\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.742830 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-5lrzv\" (UID: 
\"da604ff7-8464-439e-aa94-29102f336add\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.742907 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-5lrzv\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.743010 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-config\") pod \"dnsmasq-dns-6b7bbf7cf9-5lrzv\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.743163 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-5lrzv\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.743222 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-5lrzv\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.744233 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-5lrzv\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.744372 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-5lrzv\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.744623 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-config\") pod \"dnsmasq-dns-6b7bbf7cf9-5lrzv\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.744661 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-5lrzv\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 11:33:41.744993 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-5lrzv\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:33:41 crc kubenswrapper[4702]: I1203 
11:33:41.767869 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw9bj\" (UniqueName: \"kubernetes.io/projected/da604ff7-8464-439e-aa94-29102f336add-kube-api-access-hw9bj\") pod \"dnsmasq-dns-6b7bbf7cf9-5lrzv\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:33:42 crc kubenswrapper[4702]: I1203 11:33:42.014121 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:33:42 crc kubenswrapper[4702]: I1203 11:33:42.267926 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ee5b7802-3239-45f9-9b0b-99348615d8bd","Type":"ContainerStarted","Data":"93d969c6f4571436e6b5de09a227ccc8a1a83732d86a1be63c01c124f8da2e53"} Dec 03 11:33:42 crc kubenswrapper[4702]: I1203 11:33:42.267982 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ee5b7802-3239-45f9-9b0b-99348615d8bd","Type":"ContainerStarted","Data":"68c488353931fc1cd97c36d181757bb7c4be79aab2f5960dd6fef29b55b62d80"} Dec 03 11:33:42 crc kubenswrapper[4702]: I1203 11:33:42.285869 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 11:33:42 crc kubenswrapper[4702]: I1203 11:33:42.300591 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.300557095 podStartE2EDuration="2.300557095s" podCreationTimestamp="2025-12-03 11:33:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:33:42.294702528 +0000 UTC m=+1806.130631002" watchObservedRunningTime="2025-12-03 11:33:42.300557095 +0000 UTC m=+1806.136485559" Dec 03 11:33:42 crc kubenswrapper[4702]: I1203 11:33:42.651875 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv"] Dec 03 11:33:43 crc kubenswrapper[4702]: I1203 11:33:43.288635 4702 generic.go:334] "Generic (PLEG): container finished" podID="da604ff7-8464-439e-aa94-29102f336add" containerID="99fd457c22499203cf340d5df7ae15f01cea3cefd4b6e64000df4d996eb41087" exitCode=0 Dec 03 11:33:43 crc kubenswrapper[4702]: I1203 11:33:43.291293 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" event={"ID":"da604ff7-8464-439e-aa94-29102f336add","Type":"ContainerDied","Data":"99fd457c22499203cf340d5df7ae15f01cea3cefd4b6e64000df4d996eb41087"} Dec 03 11:33:43 crc kubenswrapper[4702]: I1203 11:33:43.291339 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" event={"ID":"da604ff7-8464-439e-aa94-29102f336add","Type":"ContainerStarted","Data":"4fffbcaa959c66ae4a050c3b7ef0d782b7d2e299a7943a9328de6fb6c4156ad6"} Dec 03 11:33:44 crc kubenswrapper[4702]: I1203 11:33:44.310400 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" event={"ID":"da604ff7-8464-439e-aa94-29102f336add","Type":"ContainerStarted","Data":"1046b22c92fb7df7df83779c6c50340fcca3f5911225e5e30da75662ab6b3b4a"} Dec 03 11:33:44 crc kubenswrapper[4702]: I1203 11:33:44.312003 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:33:44 crc kubenswrapper[4702]: I1203 11:33:44.373295 4702 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" podStartSLOduration=3.373270271 podStartE2EDuration="3.373270271s" podCreationTimestamp="2025-12-03 11:33:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:33:44.362241576 +0000 UTC m=+1808.198170040" watchObservedRunningTime="2025-12-03 11:33:44.373270271 +0000 UTC m=+1808.209198725" Dec 03 11:33:44 crc kubenswrapper[4702]: I1203 11:33:44.604134 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 11:33:44 crc kubenswrapper[4702]: I1203 11:33:44.604819 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="91a66cfe-1f78-4bc2-854d-11003680f6ca" containerName="nova-api-log" containerID="cri-o://4e76021073dc4cde55830752f455be7fa7b9d5c261db44cce0d4efddd2d225bf" gracePeriod=30 Dec 03 11:33:44 crc kubenswrapper[4702]: I1203 11:33:44.604894 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="91a66cfe-1f78-4bc2-854d-11003680f6ca" containerName="nova-api-api" containerID="cri-o://908c6ae29e4601831cec06aa54d973978d1b4b264c09734f0ec062f753d8b827" gracePeriod=30 Dec 03 11:33:45 crc kubenswrapper[4702]: I1203 11:33:45.324045 4702 generic.go:334] "Generic (PLEG): container finished" podID="91a66cfe-1f78-4bc2-854d-11003680f6ca" containerID="4e76021073dc4cde55830752f455be7fa7b9d5c261db44cce0d4efddd2d225bf" exitCode=143 Dec 03 11:33:45 crc kubenswrapper[4702]: I1203 11:33:45.324077 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91a66cfe-1f78-4bc2-854d-11003680f6ca","Type":"ContainerDied","Data":"4e76021073dc4cde55830752f455be7fa7b9d5c261db44cce0d4efddd2d225bf"} Dec 03 11:33:45 crc kubenswrapper[4702]: I1203 11:33:45.801833 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:33:45 crc kubenswrapper[4702]: I1203 11:33:45.802506 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="42c1ddbc-9b3d-4310-a6ec-dd5416411e40" containerName="ceilometer-central-agent" containerID="cri-o://910080b47494f0dd52d11b09a3a5070f65dd603a4d6c20720f0e4ff979809a87" gracePeriod=30 Dec 03 11:33:45 crc kubenswrapper[4702]: I1203 11:33:45.802662 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="42c1ddbc-9b3d-4310-a6ec-dd5416411e40" containerName="proxy-httpd" containerID="cri-o://d1efc8168a5b99e4dd80ffa460d3781d6ed6e22d497ba17e38c2d4361557a7db" gracePeriod=30 Dec 03 11:33:45 crc kubenswrapper[4702]: I1203 11:33:45.802777 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="42c1ddbc-9b3d-4310-a6ec-dd5416411e40" containerName="ceilometer-notification-agent" containerID="cri-o://7a9101dcea969ca73bc30f513e5a357f4fba936c1db39427c58951d34f7c4c62" gracePeriod=30 Dec 03 11:33:45 crc kubenswrapper[4702]: I1203 11:33:45.802941 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="42c1ddbc-9b3d-4310-a6ec-dd5416411e40" containerName="sg-core" containerID="cri-o://572d9abc35efa525a77679300431c9690c187dee8d13c0bcd03970e95c31f82d" gracePeriod=30 Dec 03 11:33:45 crc kubenswrapper[4702]: I1203 11:33:45.964550 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 
11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.346705 4702 generic.go:334] "Generic (PLEG): container finished" podID="42c1ddbc-9b3d-4310-a6ec-dd5416411e40" containerID="d1efc8168a5b99e4dd80ffa460d3781d6ed6e22d497ba17e38c2d4361557a7db" exitCode=0 Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.346748 4702 generic.go:334] "Generic (PLEG): container finished" podID="42c1ddbc-9b3d-4310-a6ec-dd5416411e40" containerID="572d9abc35efa525a77679300431c9690c187dee8d13c0bcd03970e95c31f82d" exitCode=2 Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.346776 4702 generic.go:334] "Generic (PLEG): container finished" podID="42c1ddbc-9b3d-4310-a6ec-dd5416411e40" containerID="7a9101dcea969ca73bc30f513e5a357f4fba936c1db39427c58951d34f7c4c62" exitCode=0 Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.346784 4702 generic.go:334] "Generic (PLEG): container finished" podID="42c1ddbc-9b3d-4310-a6ec-dd5416411e40" containerID="910080b47494f0dd52d11b09a3a5070f65dd603a4d6c20720f0e4ff979809a87" exitCode=0 Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.346838 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c1ddbc-9b3d-4310-a6ec-dd5416411e40","Type":"ContainerDied","Data":"d1efc8168a5b99e4dd80ffa460d3781d6ed6e22d497ba17e38c2d4361557a7db"} Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.346921 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c1ddbc-9b3d-4310-a6ec-dd5416411e40","Type":"ContainerDied","Data":"572d9abc35efa525a77679300431c9690c187dee8d13c0bcd03970e95c31f82d"} Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.346936 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c1ddbc-9b3d-4310-a6ec-dd5416411e40","Type":"ContainerDied","Data":"7a9101dcea969ca73bc30f513e5a357f4fba936c1db39427c58951d34f7c4c62"} Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.346946 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c1ddbc-9b3d-4310-a6ec-dd5416411e40","Type":"ContainerDied","Data":"910080b47494f0dd52d11b09a3a5070f65dd603a4d6c20720f0e4ff979809a87"} Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.813882 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.905700 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-ceilometer-tls-certs\") pod \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.905904 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-run-httpd\") pod \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.905942 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-log-httpd\") pod \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.906061 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-config-data\") pod \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.906098 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-combined-ca-bundle\") pod \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.906244 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-sg-core-conf-yaml\") pod \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.906334 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k566r\" (UniqueName: \"kubernetes.io/projected/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-kube-api-access-k566r\") pod \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.906374 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-scripts\") pod \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\" (UID: \"42c1ddbc-9b3d-4310-a6ec-dd5416411e40\") " Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.906653 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "42c1ddbc-9b3d-4310-a6ec-dd5416411e40" (UID: "42c1ddbc-9b3d-4310-a6ec-dd5416411e40"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.906874 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "42c1ddbc-9b3d-4310-a6ec-dd5416411e40" (UID: "42c1ddbc-9b3d-4310-a6ec-dd5416411e40"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.907683 4702 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.907702 4702 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.913312 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-kube-api-access-k566r" (OuterVolumeSpecName: "kube-api-access-k566r") pod "42c1ddbc-9b3d-4310-a6ec-dd5416411e40" (UID: "42c1ddbc-9b3d-4310-a6ec-dd5416411e40"). InnerVolumeSpecName "kube-api-access-k566r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.913508 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-scripts" (OuterVolumeSpecName: "scripts") pod "42c1ddbc-9b3d-4310-a6ec-dd5416411e40" (UID: "42c1ddbc-9b3d-4310-a6ec-dd5416411e40"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:46 crc kubenswrapper[4702]: I1203 11:33:46.954116 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "42c1ddbc-9b3d-4310-a6ec-dd5416411e40" (UID: "42c1ddbc-9b3d-4310-a6ec-dd5416411e40"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.010793 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k566r\" (UniqueName: \"kubernetes.io/projected/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-kube-api-access-k566r\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.010834 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.010847 4702 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.018744 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42c1ddbc-9b3d-4310-a6ec-dd5416411e40" (UID: "42c1ddbc-9b3d-4310-a6ec-dd5416411e40"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.024835 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "42c1ddbc-9b3d-4310-a6ec-dd5416411e40" (UID: "42c1ddbc-9b3d-4310-a6ec-dd5416411e40"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.072966 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-config-data" (OuterVolumeSpecName: "config-data") pod "42c1ddbc-9b3d-4310-a6ec-dd5416411e40" (UID: "42c1ddbc-9b3d-4310-a6ec-dd5416411e40"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.113139 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.113189 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.113207 4702 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/42c1ddbc-9b3d-4310-a6ec-dd5416411e40-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.367034 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"42c1ddbc-9b3d-4310-a6ec-dd5416411e40","Type":"ContainerDied","Data":"9f5d99c55ae69153ca9f8897f7da6d040292d02e93865626259ffa85941ceb90"} Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.367112 4702 scope.go:117] "RemoveContainer" containerID="d1efc8168a5b99e4dd80ffa460d3781d6ed6e22d497ba17e38c2d4361557a7db" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.367345 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.431534 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.442413 4702 scope.go:117] "RemoveContainer" containerID="572d9abc35efa525a77679300431c9690c187dee8d13c0bcd03970e95c31f82d" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.448219 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.461135 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:33:47 crc kubenswrapper[4702]: E1203 11:33:47.461891 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c1ddbc-9b3d-4310-a6ec-dd5416411e40" containerName="ceilometer-central-agent" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.461916 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c1ddbc-9b3d-4310-a6ec-dd5416411e40" containerName="ceilometer-central-agent" Dec 03 11:33:47 crc kubenswrapper[4702]: E1203 11:33:47.461949 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c1ddbc-9b3d-4310-a6ec-dd5416411e40" containerName="sg-core" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.461958 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c1ddbc-9b3d-4310-a6ec-dd5416411e40" containerName="sg-core" Dec 03 11:33:47 crc kubenswrapper[4702]: E1203 11:33:47.462008 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c1ddbc-9b3d-4310-a6ec-dd5416411e40" containerName="proxy-httpd" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.462017 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c1ddbc-9b3d-4310-a6ec-dd5416411e40" containerName="proxy-httpd" Dec 03 11:33:47 crc kubenswrapper[4702]: E1203 11:33:47.462044 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c1ddbc-9b3d-4310-a6ec-dd5416411e40" containerName="ceilometer-notification-agent" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.462054 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c1ddbc-9b3d-4310-a6ec-dd5416411e40" containerName="ceilometer-notification-agent" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.462351 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c1ddbc-9b3d-4310-a6ec-dd5416411e40" containerName="ceilometer-notification-agent" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.462364 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c1ddbc-9b3d-4310-a6ec-dd5416411e40" containerName="proxy-httpd" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.462385 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c1ddbc-9b3d-4310-a6ec-dd5416411e40" containerName="ceilometer-central-agent" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.462419 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c1ddbc-9b3d-4310-a6ec-dd5416411e40" containerName="sg-core" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.466524 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.469488 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.473363 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.475029 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.477817 4702 scope.go:117] "RemoveContainer" containerID="7a9101dcea969ca73bc30f513e5a357f4fba936c1db39427c58951d34f7c4c62" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.481721 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.512041 4702 scope.go:117] "RemoveContainer" containerID="910080b47494f0dd52d11b09a3a5070f65dd603a4d6c20720f0e4ff979809a87" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.524066 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-scripts\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.524254 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ca15f5e-f93b-42fe-b243-1965f65121c4-run-httpd\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.524282 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.524324 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fs86\" (UniqueName: \"kubernetes.io/projected/7ca15f5e-f93b-42fe-b243-1965f65121c4-kube-api-access-7fs86\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.524363 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.524386 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ca15f5e-f93b-42fe-b243-1965f65121c4-log-httpd\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.524447 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-config-data\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.524502 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.626571 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-config-data\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.626673 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.626732 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-scripts\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.626827 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ca15f5e-f93b-42fe-b243-1965f65121c4-run-httpd\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.626849 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.626884 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fs86\" (UniqueName: \"kubernetes.io/projected/7ca15f5e-f93b-42fe-b243-1965f65121c4-kube-api-access-7fs86\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.626922 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.626946 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ca15f5e-f93b-42fe-b243-1965f65121c4-log-httpd\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.627517 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/7ca15f5e-f93b-42fe-b243-1965f65121c4-log-httpd\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.627838 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ca15f5e-f93b-42fe-b243-1965f65121c4-run-httpd\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.631468 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.632049 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.632707 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-config-data\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.635788 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.638938 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-scripts\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.646552 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fs86\" (UniqueName: \"kubernetes.io/projected/7ca15f5e-f93b-42fe-b243-1965f65121c4-kube-api-access-7fs86\") pod \"ceilometer-0\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " pod="openstack/ceilometer-0" Dec 03 11:33:47 crc kubenswrapper[4702]: I1203 11:33:47.800971 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:33:48 crc kubenswrapper[4702]: I1203 11:33:48.108492 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:33:48 crc kubenswrapper[4702]: I1203 11:33:48.383571 4702 generic.go:334] "Generic (PLEG): container finished" podID="91a66cfe-1f78-4bc2-854d-11003680f6ca" containerID="908c6ae29e4601831cec06aa54d973978d1b4b264c09734f0ec062f753d8b827" exitCode=0 Dec 03 11:33:48 crc kubenswrapper[4702]: I1203 11:33:48.383672 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91a66cfe-1f78-4bc2-854d-11003680f6ca","Type":"ContainerDied","Data":"908c6ae29e4601831cec06aa54d973978d1b4b264c09734f0ec062f753d8b827"} Dec 03 11:33:48 crc kubenswrapper[4702]: I1203 11:33:48.383739 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"91a66cfe-1f78-4bc2-854d-11003680f6ca","Type":"ContainerDied","Data":"2816f77ac620499039e217cb06956fa16c7864c9eda02bbc6d7b59cf14289a55"} Dec 03 11:33:48 crc kubenswrapper[4702]: I1203 11:33:48.383772 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2816f77ac620499039e217cb06956fa16c7864c9eda02bbc6d7b59cf14289a55" Dec 03 11:33:48 crc kubenswrapper[4702]: I1203 11:33:48.385666 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 11:33:48 crc kubenswrapper[4702]: I1203 11:33:48.450026 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91a66cfe-1f78-4bc2-854d-11003680f6ca-config-data\") pod \"91a66cfe-1f78-4bc2-854d-11003680f6ca\" (UID: \"91a66cfe-1f78-4bc2-854d-11003680f6ca\") " Dec 03 11:33:48 crc kubenswrapper[4702]: I1203 11:33:48.450488 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91a66cfe-1f78-4bc2-854d-11003680f6ca-logs\") pod \"91a66cfe-1f78-4bc2-854d-11003680f6ca\" (UID: \"91a66cfe-1f78-4bc2-854d-11003680f6ca\") " Dec 03 11:33:48 crc kubenswrapper[4702]: I1203 11:33:48.450530 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91a66cfe-1f78-4bc2-854d-11003680f6ca-combined-ca-bundle\") pod \"91a66cfe-1f78-4bc2-854d-11003680f6ca\" (UID: \"91a66cfe-1f78-4bc2-854d-11003680f6ca\") " Dec 03 11:33:48 crc kubenswrapper[4702]: I1203 11:33:48.450840 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v697h\" (UniqueName: \"kubernetes.io/projected/91a66cfe-1f78-4bc2-854d-11003680f6ca-kube-api-access-v697h\") pod \"91a66cfe-1f78-4bc2-854d-11003680f6ca\" (UID: \"91a66cfe-1f78-4bc2-854d-11003680f6ca\") " Dec 03 11:33:48 crc kubenswrapper[4702]: I1203 11:33:48.451782 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91a66cfe-1f78-4bc2-854d-11003680f6ca-logs" (OuterVolumeSpecName: "logs") pod "91a66cfe-1f78-4bc2-854d-11003680f6ca" (UID: "91a66cfe-1f78-4bc2-854d-11003680f6ca"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:33:48 crc kubenswrapper[4702]: I1203 11:33:48.461926 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91a66cfe-1f78-4bc2-854d-11003680f6ca-kube-api-access-v697h" (OuterVolumeSpecName: "kube-api-access-v697h") pod "91a66cfe-1f78-4bc2-854d-11003680f6ca" (UID: "91a66cfe-1f78-4bc2-854d-11003680f6ca"). InnerVolumeSpecName "kube-api-access-v697h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:33:48 crc kubenswrapper[4702]: I1203 11:33:48.485798 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:33:48 crc kubenswrapper[4702]: I1203 11:33:48.594050 4702 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91a66cfe-1f78-4bc2-854d-11003680f6ca-logs\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:48 crc kubenswrapper[4702]: I1203 11:33:48.594085 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v697h\" (UniqueName: \"kubernetes.io/projected/91a66cfe-1f78-4bc2-854d-11003680f6ca-kube-api-access-v697h\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:48 crc kubenswrapper[4702]: I1203 11:33:48.617690 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91a66cfe-1f78-4bc2-854d-11003680f6ca-config-data" (OuterVolumeSpecName: "config-data") pod "91a66cfe-1f78-4bc2-854d-11003680f6ca" (UID: "91a66cfe-1f78-4bc2-854d-11003680f6ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:48 crc kubenswrapper[4702]: I1203 11:33:48.634576 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91a66cfe-1f78-4bc2-854d-11003680f6ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91a66cfe-1f78-4bc2-854d-11003680f6ca" (UID: "91a66cfe-1f78-4bc2-854d-11003680f6ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:48 crc kubenswrapper[4702]: I1203 11:33:48.698825 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91a66cfe-1f78-4bc2-854d-11003680f6ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:48 crc kubenswrapper[4702]: I1203 11:33:48.698863 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91a66cfe-1f78-4bc2-854d-11003680f6ca-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:48 crc kubenswrapper[4702]: I1203 11:33:48.945580 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42c1ddbc-9b3d-4310-a6ec-dd5416411e40" path="/var/lib/kubelet/pods/42c1ddbc-9b3d-4310-a6ec-dd5416411e40/volumes" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.399451 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.399855 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ca15f5e-f93b-42fe-b243-1965f65121c4","Type":"ContainerStarted","Data":"781cc4c337d5c555c62d26e5e3951483486e65b45e54cda705858c6aa345013a"} Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.399926 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ca15f5e-f93b-42fe-b243-1965f65121c4","Type":"ContainerStarted","Data":"76075c8f884a06da1a9d922b74e6947442ccd08b07bb2e4fb3e0d8629621472e"} Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.436434 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.466571 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.486397 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 11:33:49 crc kubenswrapper[4702]: E1203 11:33:49.487144 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91a66cfe-1f78-4bc2-854d-11003680f6ca" containerName="nova-api-api" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.487173 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a66cfe-1f78-4bc2-854d-11003680f6ca" containerName="nova-api-api" Dec 03 11:33:49 crc kubenswrapper[4702]: E1203 11:33:49.487253 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91a66cfe-1f78-4bc2-854d-11003680f6ca" containerName="nova-api-log" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.487263 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a66cfe-1f78-4bc2-854d-11003680f6ca" containerName="nova-api-log" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.487605 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="91a66cfe-1f78-4bc2-854d-11003680f6ca" containerName="nova-api-log" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.487662 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="91a66cfe-1f78-4bc2-854d-11003680f6ca" containerName="nova-api-api" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.489507 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.493153 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.493356 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.495915 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.499985 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.628551 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-public-tls-certs\") pod \"nova-api-0\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " pod="openstack/nova-api-0" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.628650 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " pod="openstack/nova-api-0" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.628686 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e36431e8-e244-4c9f-92d4-df351ca9157a-logs\") pod \"nova-api-0\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " pod="openstack/nova-api-0" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.628865 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-config-data\") pod \"nova-api-0\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " pod="openstack/nova-api-0" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.628907 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs88x\" (UniqueName: \"kubernetes.io/projected/e36431e8-e244-4c9f-92d4-df351ca9157a-kube-api-access-hs88x\") pod \"nova-api-0\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " pod="openstack/nova-api-0" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.628939 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " pod="openstack/nova-api-0" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.757262 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-config-data\") pod \"nova-api-0\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " pod="openstack/nova-api-0" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.758033 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs88x\" (UniqueName: \"kubernetes.io/projected/e36431e8-e244-4c9f-92d4-df351ca9157a-kube-api-access-hs88x\") pod 
\"nova-api-0\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " pod="openstack/nova-api-0" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.758091 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " pod="openstack/nova-api-0" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.758315 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-public-tls-certs\") pod \"nova-api-0\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " pod="openstack/nova-api-0" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.758430 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " pod="openstack/nova-api-0" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.758510 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e36431e8-e244-4c9f-92d4-df351ca9157a-logs\") pod \"nova-api-0\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " pod="openstack/nova-api-0" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.759163 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e36431e8-e244-4c9f-92d4-df351ca9157a-logs\") pod \"nova-api-0\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " pod="openstack/nova-api-0" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.765331 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-config-data\") pod \"nova-api-0\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " pod="openstack/nova-api-0" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.767641 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-public-tls-certs\") pod \"nova-api-0\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " pod="openstack/nova-api-0" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.768332 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " pod="openstack/nova-api-0" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.781197 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " pod="openstack/nova-api-0" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.785319 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs88x\" (UniqueName: \"kubernetes.io/projected/e36431e8-e244-4c9f-92d4-df351ca9157a-kube-api-access-hs88x\") pod \"nova-api-0\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " 
pod="openstack/nova-api-0" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.809076 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 11:33:49 crc kubenswrapper[4702]: I1203 11:33:49.928506 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:33:49 crc kubenswrapper[4702]: E1203 11:33:49.928956 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:33:50 crc kubenswrapper[4702]: W1203 11:33:50.407169 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode36431e8_e244_4c9f_92d4_df351ca9157a.slice/crio-4a06719f8dec31549a2faad3fb04073c7b69dd53f902bb14541180a6f2dd0eab WatchSource:0}: Error finding container 4a06719f8dec31549a2faad3fb04073c7b69dd53f902bb14541180a6f2dd0eab: Status 404 returned error can't find the container with id 4a06719f8dec31549a2faad3fb04073c7b69dd53f902bb14541180a6f2dd0eab Dec 03 11:33:50 crc kubenswrapper[4702]: I1203 11:33:50.421871 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 11:33:50 crc kubenswrapper[4702]: I1203 11:33:50.421930 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ca15f5e-f93b-42fe-b243-1965f65121c4","Type":"ContainerStarted","Data":"b5f9a9bec3106303c78967ee3e60e9f1eb5178cfde0d005b59629278c6e881f7"} Dec 03 11:33:50 crc kubenswrapper[4702]: I1203 11:33:50.944099 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91a66cfe-1f78-4bc2-854d-11003680f6ca" path="/var/lib/kubelet/pods/91a66cfe-1f78-4bc2-854d-11003680f6ca/volumes" Dec 03 11:33:50 crc kubenswrapper[4702]: I1203 11:33:50.964645 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:33:50 crc kubenswrapper[4702]: I1203 11:33:50.990202 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.437697 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e36431e8-e244-4c9f-92d4-df351ca9157a","Type":"ContainerStarted","Data":"ceb64211b8aa47bffdd4015e1b443e622391819cab8943a0c92f10aea9746a81"} Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.437814 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e36431e8-e244-4c9f-92d4-df351ca9157a","Type":"ContainerStarted","Data":"579978377ef619f15f3295160d671c49b9c10b3f1949e28bb55c6f8d20b40691"} Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.437830 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e36431e8-e244-4c9f-92d4-df351ca9157a","Type":"ContainerStarted","Data":"4a06719f8dec31549a2faad3fb04073c7b69dd53f902bb14541180a6f2dd0eab"} Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.443207 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7ca15f5e-f93b-42fe-b243-1965f65121c4","Type":"ContainerStarted","Data":"7e38e290353bda13b7b3ccd4a2ca02379cec57a7c4fa497b86a34bfb54086848"} Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.471929 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.472101 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.472084164 podStartE2EDuration="2.472084164s" podCreationTimestamp="2025-12-03 11:33:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:33:51.460555996 +0000 UTC m=+1815.296484460" watchObservedRunningTime="2025-12-03 11:33:51.472084164 +0000 UTC m=+1815.308012638" Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.630600 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-24b4q"] Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.633244 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-24b4q" Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.636283 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.645678 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.652379 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-24b4q"] Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.825996 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbhml\" (UniqueName: \"kubernetes.io/projected/36eb0771-1605-478d-aad3-44f0e6f0932b-kube-api-access-sbhml\") pod \"nova-cell1-cell-mapping-24b4q\" (UID: \"36eb0771-1605-478d-aad3-44f0e6f0932b\") " pod="openstack/nova-cell1-cell-mapping-24b4q" Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.826179 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36eb0771-1605-478d-aad3-44f0e6f0932b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-24b4q\" (UID: \"36eb0771-1605-478d-aad3-44f0e6f0932b\") " pod="openstack/nova-cell1-cell-mapping-24b4q" Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.826258 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36eb0771-1605-478d-aad3-44f0e6f0932b-config-data\") pod \"nova-cell1-cell-mapping-24b4q\" (UID: \"36eb0771-1605-478d-aad3-44f0e6f0932b\") " pod="openstack/nova-cell1-cell-mapping-24b4q" Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.826312 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36eb0771-1605-478d-aad3-44f0e6f0932b-scripts\") pod \"nova-cell1-cell-mapping-24b4q\" (UID: \"36eb0771-1605-478d-aad3-44f0e6f0932b\") " pod="openstack/nova-cell1-cell-mapping-24b4q" Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.930675 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/36eb0771-1605-478d-aad3-44f0e6f0932b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-24b4q\" (UID: \"36eb0771-1605-478d-aad3-44f0e6f0932b\") " pod="openstack/nova-cell1-cell-mapping-24b4q" Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.930912 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36eb0771-1605-478d-aad3-44f0e6f0932b-config-data\") pod \"nova-cell1-cell-mapping-24b4q\" (UID: \"36eb0771-1605-478d-aad3-44f0e6f0932b\") " pod="openstack/nova-cell1-cell-mapping-24b4q" Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.930981 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36eb0771-1605-478d-aad3-44f0e6f0932b-scripts\") pod \"nova-cell1-cell-mapping-24b4q\" (UID: \"36eb0771-1605-478d-aad3-44f0e6f0932b\") " pod="openstack/nova-cell1-cell-mapping-24b4q" Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.931308 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbhml\" (UniqueName: \"kubernetes.io/projected/36eb0771-1605-478d-aad3-44f0e6f0932b-kube-api-access-sbhml\") pod \"nova-cell1-cell-mapping-24b4q\" (UID: \"36eb0771-1605-478d-aad3-44f0e6f0932b\") " pod="openstack/nova-cell1-cell-mapping-24b4q" Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.936791 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36eb0771-1605-478d-aad3-44f0e6f0932b-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-24b4q\" (UID: \"36eb0771-1605-478d-aad3-44f0e6f0932b\") " pod="openstack/nova-cell1-cell-mapping-24b4q" Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.936872 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36eb0771-1605-478d-aad3-44f0e6f0932b-config-data\") pod \"nova-cell1-cell-mapping-24b4q\" (UID: \"36eb0771-1605-478d-aad3-44f0e6f0932b\") " pod="openstack/nova-cell1-cell-mapping-24b4q" Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.940703 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36eb0771-1605-478d-aad3-44f0e6f0932b-scripts\") pod \"nova-cell1-cell-mapping-24b4q\" (UID: \"36eb0771-1605-478d-aad3-44f0e6f0932b\") " pod="openstack/nova-cell1-cell-mapping-24b4q" Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.950324 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbhml\" (UniqueName: \"kubernetes.io/projected/36eb0771-1605-478d-aad3-44f0e6f0932b-kube-api-access-sbhml\") pod \"nova-cell1-cell-mapping-24b4q\" (UID: \"36eb0771-1605-478d-aad3-44f0e6f0932b\") " pod="openstack/nova-cell1-cell-mapping-24b4q" Dec 03 11:33:51 crc kubenswrapper[4702]: I1203 11:33:51.961230 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-24b4q" Dec 03 11:33:52 crc kubenswrapper[4702]: I1203 11:33:52.118825 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:33:52 crc kubenswrapper[4702]: I1203 11:33:52.243028 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-4gpg6"] Dec 03 11:33:52 crc kubenswrapper[4702]: I1203 11:33:52.243677 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" podUID="45dcdd57-8d89-4094-a217-c6c58eeaba18" containerName="dnsmasq-dns" containerID="cri-o://ce2b5a886cfa45212e483b44b721c4d9a21de5eda8e5335141fbd27e4fdb44ab" gracePeriod=10 Dec 03 11:33:52 crc kubenswrapper[4702]: I1203 11:33:52.507394 4702 generic.go:334] "Generic (PLEG): container finished" podID="45dcdd57-8d89-4094-a217-c6c58eeaba18" containerID="ce2b5a886cfa45212e483b44b721c4d9a21de5eda8e5335141fbd27e4fdb44ab" exitCode=0 Dec 03 11:33:52 crc kubenswrapper[4702]: I1203 11:33:52.507453 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" event={"ID":"45dcdd57-8d89-4094-a217-c6c58eeaba18","Type":"ContainerDied","Data":"ce2b5a886cfa45212e483b44b721c4d9a21de5eda8e5335141fbd27e4fdb44ab"} Dec 03 11:33:52 crc kubenswrapper[4702]: I1203 11:33:52.821255 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-24b4q"] Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.180833 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.299148 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-dns-swift-storage-0\") pod \"45dcdd57-8d89-4094-a217-c6c58eeaba18\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.299215 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd4zs\" (UniqueName: \"kubernetes.io/projected/45dcdd57-8d89-4094-a217-c6c58eeaba18-kube-api-access-vd4zs\") pod \"45dcdd57-8d89-4094-a217-c6c58eeaba18\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.299246 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-ovsdbserver-nb\") pod \"45dcdd57-8d89-4094-a217-c6c58eeaba18\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.299375 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-ovsdbserver-sb\") pod \"45dcdd57-8d89-4094-a217-c6c58eeaba18\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.299506 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-dns-svc\") pod \"45dcdd57-8d89-4094-a217-c6c58eeaba18\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.299624 4702 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-config\") pod \"45dcdd57-8d89-4094-a217-c6c58eeaba18\" (UID: \"45dcdd57-8d89-4094-a217-c6c58eeaba18\") " Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.339070 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45dcdd57-8d89-4094-a217-c6c58eeaba18-kube-api-access-vd4zs" (OuterVolumeSpecName: "kube-api-access-vd4zs") pod "45dcdd57-8d89-4094-a217-c6c58eeaba18" (UID: "45dcdd57-8d89-4094-a217-c6c58eeaba18"). InnerVolumeSpecName "kube-api-access-vd4zs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.403580 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd4zs\" (UniqueName: \"kubernetes.io/projected/45dcdd57-8d89-4094-a217-c6c58eeaba18-kube-api-access-vd4zs\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.456001 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-config" (OuterVolumeSpecName: "config") pod "45dcdd57-8d89-4094-a217-c6c58eeaba18" (UID: "45dcdd57-8d89-4094-a217-c6c58eeaba18"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.460943 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "45dcdd57-8d89-4094-a217-c6c58eeaba18" (UID: "45dcdd57-8d89-4094-a217-c6c58eeaba18"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.461009 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "45dcdd57-8d89-4094-a217-c6c58eeaba18" (UID: "45dcdd57-8d89-4094-a217-c6c58eeaba18"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.479174 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "45dcdd57-8d89-4094-a217-c6c58eeaba18" (UID: "45dcdd57-8d89-4094-a217-c6c58eeaba18"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.506117 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.512930 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.512964 4702 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.512978 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.535945 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "45dcdd57-8d89-4094-a217-c6c58eeaba18" (UID: "45dcdd57-8d89-4094-a217-c6c58eeaba18"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.540168 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ca15f5e-f93b-42fe-b243-1965f65121c4","Type":"ContainerStarted","Data":"e0e2b9731c1b0dee5490090ec7f9544735670801c455095df15120ade1500b1e"} Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.540378 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ca15f5e-f93b-42fe-b243-1965f65121c4" containerName="ceilometer-central-agent" containerID="cri-o://781cc4c337d5c555c62d26e5e3951483486e65b45e54cda705858c6aa345013a" gracePeriod=30 Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.540691 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.540994 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ca15f5e-f93b-42fe-b243-1965f65121c4" containerName="proxy-httpd" containerID="cri-o://e0e2b9731c1b0dee5490090ec7f9544735670801c455095df15120ade1500b1e" gracePeriod=30 Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.541041 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ca15f5e-f93b-42fe-b243-1965f65121c4" containerName="sg-core" containerID="cri-o://7e38e290353bda13b7b3ccd4a2ca02379cec57a7c4fa497b86a34bfb54086848" gracePeriod=30 Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.541254 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7ca15f5e-f93b-42fe-b243-1965f65121c4" containerName="ceilometer-notification-agent" containerID="cri-o://b5f9a9bec3106303c78967ee3e60e9f1eb5178cfde0d005b59629278c6e881f7" gracePeriod=30 Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.549554 4702 generic.go:334] "Generic (PLEG): container finished" 
podID="6148de9b-9833-4042-ad7d-5e40b93369ea" containerID="66abc7f2627c67b2b75db9c1003426301da000aa5c77c32e254f0ecf3eddc1de" exitCode=137 Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.549646 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6148de9b-9833-4042-ad7d-5e40b93369ea","Type":"ContainerDied","Data":"66abc7f2627c67b2b75db9c1003426301da000aa5c77c32e254f0ecf3eddc1de"} Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.567094 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-24b4q" event={"ID":"36eb0771-1605-478d-aad3-44f0e6f0932b","Type":"ContainerStarted","Data":"03a377a86e254cfdced08b0b9895fb8194a6a6b0393febbb1ba40e496277b4d3"} Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.567166 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-24b4q" event={"ID":"36eb0771-1605-478d-aad3-44f0e6f0932b","Type":"ContainerStarted","Data":"52af0d7fb8c6f703494273a8bfc4bb74c5991a7a4c903328f943fee54b83f98b"} Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.585059 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.299043273 podStartE2EDuration="6.585033177s" podCreationTimestamp="2025-12-03 11:33:47 +0000 UTC" firstStartedPulling="2025-12-03 11:33:48.598122996 +0000 UTC m=+1812.434051460" lastFinishedPulling="2025-12-03 11:33:51.8841129 +0000 UTC m=+1815.720041364" observedRunningTime="2025-12-03 11:33:53.572083638 +0000 UTC m=+1817.408012112" watchObservedRunningTime="2025-12-03 11:33:53.585033177 +0000 UTC m=+1817.420961641" Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.586886 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" event={"ID":"45dcdd57-8d89-4094-a217-c6c58eeaba18","Type":"ContainerDied","Data":"e92fe5918b532cb212bce0662f2e764c14b09c4b4180906870ec1370235ec064"} Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.586952 4702 scope.go:117] "RemoveContainer" containerID="ce2b5a886cfa45212e483b44b721c4d9a21de5eda8e5335141fbd27e4fdb44ab" Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.586952 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-4gpg6" Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.593311 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-24b4q" podStartSLOduration=2.593288412 podStartE2EDuration="2.593288412s" podCreationTimestamp="2025-12-03 11:33:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:33:53.592288504 +0000 UTC m=+1817.428216968" watchObservedRunningTime="2025-12-03 11:33:53.593288412 +0000 UTC m=+1817.429216896" Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.618228 4702 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45dcdd57-8d89-4094-a217-c6c58eeaba18-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.643964 4702 scope.go:117] "RemoveContainer" containerID="3502f437825d4954b2c823b4cc99bac5917ce9e678ef3e0aaff65a4726fff3d4" Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.806125 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-4gpg6"] Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.834554 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-4gpg6"] Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.858310 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.988991 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6148de9b-9833-4042-ad7d-5e40b93369ea-combined-ca-bundle\") pod \"6148de9b-9833-4042-ad7d-5e40b93369ea\" (UID: \"6148de9b-9833-4042-ad7d-5e40b93369ea\") " Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.989690 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6148de9b-9833-4042-ad7d-5e40b93369ea-config-data\") pod \"6148de9b-9833-4042-ad7d-5e40b93369ea\" (UID: \"6148de9b-9833-4042-ad7d-5e40b93369ea\") " Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.989746 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x4xv\" (UniqueName: \"kubernetes.io/projected/6148de9b-9833-4042-ad7d-5e40b93369ea-kube-api-access-6x4xv\") pod \"6148de9b-9833-4042-ad7d-5e40b93369ea\" (UID: \"6148de9b-9833-4042-ad7d-5e40b93369ea\") " Dec 03 11:33:53 crc kubenswrapper[4702]: I1203 11:33:53.989906 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6148de9b-9833-4042-ad7d-5e40b93369ea-scripts\") pod \"6148de9b-9833-4042-ad7d-5e40b93369ea\" (UID: \"6148de9b-9833-4042-ad7d-5e40b93369ea\") " Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.025793 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6148de9b-9833-4042-ad7d-5e40b93369ea-scripts" (OuterVolumeSpecName: "scripts") pod "6148de9b-9833-4042-ad7d-5e40b93369ea" (UID: "6148de9b-9833-4042-ad7d-5e40b93369ea"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.056125 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6148de9b-9833-4042-ad7d-5e40b93369ea-kube-api-access-6x4xv" (OuterVolumeSpecName: "kube-api-access-6x4xv") pod "6148de9b-9833-4042-ad7d-5e40b93369ea" (UID: "6148de9b-9833-4042-ad7d-5e40b93369ea"). InnerVolumeSpecName "kube-api-access-6x4xv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.108064 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6148de9b-9833-4042-ad7d-5e40b93369ea-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.108104 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x4xv\" (UniqueName: \"kubernetes.io/projected/6148de9b-9833-4042-ad7d-5e40b93369ea-kube-api-access-6x4xv\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.269984 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6148de9b-9833-4042-ad7d-5e40b93369ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6148de9b-9833-4042-ad7d-5e40b93369ea" (UID: "6148de9b-9833-4042-ad7d-5e40b93369ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.297192 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6148de9b-9833-4042-ad7d-5e40b93369ea-config-data" (OuterVolumeSpecName: "config-data") pod "6148de9b-9833-4042-ad7d-5e40b93369ea" (UID: "6148de9b-9833-4042-ad7d-5e40b93369ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.312392 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6148de9b-9833-4042-ad7d-5e40b93369ea-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.312452 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6148de9b-9833-4042-ad7d-5e40b93369ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.602780 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6148de9b-9833-4042-ad7d-5e40b93369ea","Type":"ContainerDied","Data":"431c3ad555728bd0adfac4ebc137449a7d0f0b85f1406b302fa4f816b8bb5cf5"} Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.602849 4702 scope.go:117] "RemoveContainer" containerID="66abc7f2627c67b2b75db9c1003426301da000aa5c77c32e254f0ecf3eddc1de" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.603063 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.617800 4702 generic.go:334] "Generic (PLEG): container finished" podID="7ca15f5e-f93b-42fe-b243-1965f65121c4" containerID="7e38e290353bda13b7b3ccd4a2ca02379cec57a7c4fa497b86a34bfb54086848" exitCode=2 Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.617836 4702 generic.go:334] "Generic (PLEG): container finished" podID="7ca15f5e-f93b-42fe-b243-1965f65121c4" containerID="b5f9a9bec3106303c78967ee3e60e9f1eb5178cfde0d005b59629278c6e881f7" exitCode=0 Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.617899 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ca15f5e-f93b-42fe-b243-1965f65121c4","Type":"ContainerDied","Data":"7e38e290353bda13b7b3ccd4a2ca02379cec57a7c4fa497b86a34bfb54086848"} Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.618013 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ca15f5e-f93b-42fe-b243-1965f65121c4","Type":"ContainerDied","Data":"b5f9a9bec3106303c78967ee3e60e9f1eb5178cfde0d005b59629278c6e881f7"} Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.649273 4702 scope.go:117] "RemoveContainer" containerID="d91e72e8f66c9bd3e87823654f464d7cf4409cb8f4d86bcf9adb2e3321b7c30b" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.650404 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.667944 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.689337 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 03 11:33:54 crc kubenswrapper[4702]: E1203 11:33:54.690056 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6148de9b-9833-4042-ad7d-5e40b93369ea" containerName="aodh-evaluator" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.690079 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="6148de9b-9833-4042-ad7d-5e40b93369ea" containerName="aodh-evaluator" Dec 03 11:33:54 crc kubenswrapper[4702]: E1203 11:33:54.690096 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6148de9b-9833-4042-ad7d-5e40b93369ea" containerName="aodh-api" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.690103 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="6148de9b-9833-4042-ad7d-5e40b93369ea" containerName="aodh-api" Dec 03 11:33:54 crc kubenswrapper[4702]: E1203 11:33:54.690118 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6148de9b-9833-4042-ad7d-5e40b93369ea" containerName="aodh-listener" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.690124 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="6148de9b-9833-4042-ad7d-5e40b93369ea" containerName="aodh-listener" Dec 03 11:33:54 crc kubenswrapper[4702]: E1203 11:33:54.690143 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45dcdd57-8d89-4094-a217-c6c58eeaba18" containerName="dnsmasq-dns" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.690148 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="45dcdd57-8d89-4094-a217-c6c58eeaba18" containerName="dnsmasq-dns" Dec 03 11:33:54 crc kubenswrapper[4702]: E1203 11:33:54.690176 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45dcdd57-8d89-4094-a217-c6c58eeaba18" containerName="init" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 
11:33:54.690183 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="45dcdd57-8d89-4094-a217-c6c58eeaba18" containerName="init" Dec 03 11:33:54 crc kubenswrapper[4702]: E1203 11:33:54.690196 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6148de9b-9833-4042-ad7d-5e40b93369ea" containerName="aodh-notifier" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.690202 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="6148de9b-9833-4042-ad7d-5e40b93369ea" containerName="aodh-notifier" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.690421 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="6148de9b-9833-4042-ad7d-5e40b93369ea" containerName="aodh-api" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.690435 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="6148de9b-9833-4042-ad7d-5e40b93369ea" containerName="aodh-listener" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.690461 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="45dcdd57-8d89-4094-a217-c6c58eeaba18" containerName="dnsmasq-dns" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.690469 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="6148de9b-9833-4042-ad7d-5e40b93369ea" containerName="aodh-evaluator" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.690485 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="6148de9b-9833-4042-ad7d-5e40b93369ea" containerName="aodh-notifier" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.692985 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.696281 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.696536 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.696610 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.696721 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.696939 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-swrfb" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.717249 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.727088 4702 scope.go:117] "RemoveContainer" containerID="209908441ff64d788b841146150f0e1ed97b1c2aecebd4447c3e03f092dcbc46" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.752148 4702 scope.go:117] "RemoveContainer" containerID="2d1833a0368e77342fa5f6686847efaa7708f9c2d38570bb1a7f9036e9c6a75d" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.767033 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-internal-tls-certs\") pod \"aodh-0\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " pod="openstack/aodh-0" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.767922 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-config-data\") pod \"aodh-0\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " pod="openstack/aodh-0" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.768160 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " pod="openstack/aodh-0" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.768368 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvtzc\" (UniqueName: \"kubernetes.io/projected/a7698d4e-71ca-4e83-995e-48f9a42c0490-kube-api-access-gvtzc\") pod \"aodh-0\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " pod="openstack/aodh-0" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.768486 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-public-tls-certs\") pod \"aodh-0\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " pod="openstack/aodh-0" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.768586 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-scripts\") pod \"aodh-0\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " pod="openstack/aodh-0" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.872250 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " pod="openstack/aodh-0" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.872501 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvtzc\" (UniqueName: \"kubernetes.io/projected/a7698d4e-71ca-4e83-995e-48f9a42c0490-kube-api-access-gvtzc\") pod \"aodh-0\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " pod="openstack/aodh-0" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.872601 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-public-tls-certs\") pod \"aodh-0\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " pod="openstack/aodh-0" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.872662 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-scripts\") pod \"aodh-0\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " pod="openstack/aodh-0" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.873040 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-internal-tls-certs\") pod \"aodh-0\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " pod="openstack/aodh-0" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.874342 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-config-data\") pod \"aodh-0\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " pod="openstack/aodh-0" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.878334 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-public-tls-certs\") pod \"aodh-0\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " pod="openstack/aodh-0" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.878535 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-internal-tls-certs\") pod \"aodh-0\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " pod="openstack/aodh-0" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.879692 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-config-data\") pod \"aodh-0\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " pod="openstack/aodh-0" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.879741 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-combined-ca-bundle\") pod \"aodh-0\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " pod="openstack/aodh-0" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.881135 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-scripts\") pod \"aodh-0\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " pod="openstack/aodh-0" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.896508 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvtzc\" (UniqueName: \"kubernetes.io/projected/a7698d4e-71ca-4e83-995e-48f9a42c0490-kube-api-access-gvtzc\") pod \"aodh-0\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " pod="openstack/aodh-0" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.943542 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45dcdd57-8d89-4094-a217-c6c58eeaba18" path="/var/lib/kubelet/pods/45dcdd57-8d89-4094-a217-c6c58eeaba18/volumes" Dec 03 11:33:54 crc kubenswrapper[4702]: I1203 11:33:54.944298 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6148de9b-9833-4042-ad7d-5e40b93369ea" path="/var/lib/kubelet/pods/6148de9b-9833-4042-ad7d-5e40b93369ea/volumes" Dec 03 11:33:55 crc kubenswrapper[4702]: I1203 11:33:55.029290 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 03 11:33:55 crc kubenswrapper[4702]: I1203 11:33:55.641863 4702 generic.go:334] "Generic (PLEG): container finished" podID="7ca15f5e-f93b-42fe-b243-1965f65121c4" containerID="e0e2b9731c1b0dee5490090ec7f9544735670801c455095df15120ade1500b1e" exitCode=0 Dec 03 11:33:55 crc kubenswrapper[4702]: I1203 11:33:55.641945 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ca15f5e-f93b-42fe-b243-1965f65121c4","Type":"ContainerDied","Data":"e0e2b9731c1b0dee5490090ec7f9544735670801c455095df15120ade1500b1e"} Dec 03 11:33:55 crc kubenswrapper[4702]: I1203 11:33:55.703725 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 03 11:33:55 crc kubenswrapper[4702]: W1203 11:33:55.705979 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7698d4e_71ca_4e83_995e_48f9a42c0490.slice/crio-ced38c645dd8c7362f0e33c9d2777a25b8a0633f5c20ec9000fa0263fd957340 WatchSource:0}: Error finding container ced38c645dd8c7362f0e33c9d2777a25b8a0633f5c20ec9000fa0263fd957340: Status 404 returned error can't find the container with id ced38c645dd8c7362f0e33c9d2777a25b8a0633f5c20ec9000fa0263fd957340 Dec 03 11:33:56 crc kubenswrapper[4702]: I1203 11:33:56.689441 4702 generic.go:334] "Generic (PLEG): container finished" podID="7ca15f5e-f93b-42fe-b243-1965f65121c4" containerID="781cc4c337d5c555c62d26e5e3951483486e65b45e54cda705858c6aa345013a" exitCode=0 Dec 03 11:33:56 crc kubenswrapper[4702]: I1203 11:33:56.689539 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ca15f5e-f93b-42fe-b243-1965f65121c4","Type":"ContainerDied","Data":"781cc4c337d5c555c62d26e5e3951483486e65b45e54cda705858c6aa345013a"} Dec 03 11:33:56 crc kubenswrapper[4702]: I1203 11:33:56.692222 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a7698d4e-71ca-4e83-995e-48f9a42c0490","Type":"ContainerStarted","Data":"ced38c645dd8c7362f0e33c9d2777a25b8a0633f5c20ec9000fa0263fd957340"} Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.241848 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.336849 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ca15f5e-f93b-42fe-b243-1965f65121c4-log-httpd\") pod \"7ca15f5e-f93b-42fe-b243-1965f65121c4\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.336999 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fs86\" (UniqueName: \"kubernetes.io/projected/7ca15f5e-f93b-42fe-b243-1965f65121c4-kube-api-access-7fs86\") pod \"7ca15f5e-f93b-42fe-b243-1965f65121c4\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.337037 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ca15f5e-f93b-42fe-b243-1965f65121c4-run-httpd\") pod \"7ca15f5e-f93b-42fe-b243-1965f65121c4\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.337103 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-combined-ca-bundle\") pod \"7ca15f5e-f93b-42fe-b243-1965f65121c4\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.337176 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-ceilometer-tls-certs\") pod \"7ca15f5e-f93b-42fe-b243-1965f65121c4\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.337310 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-scripts\") pod \"7ca15f5e-f93b-42fe-b243-1965f65121c4\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.337406 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-sg-core-conf-yaml\") pod \"7ca15f5e-f93b-42fe-b243-1965f65121c4\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.337424 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-config-data\") pod \"7ca15f5e-f93b-42fe-b243-1965f65121c4\" (UID: \"7ca15f5e-f93b-42fe-b243-1965f65121c4\") " Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.337399 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ca15f5e-f93b-42fe-b243-1965f65121c4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7ca15f5e-f93b-42fe-b243-1965f65121c4" (UID: "7ca15f5e-f93b-42fe-b243-1965f65121c4"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.338821 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ca15f5e-f93b-42fe-b243-1965f65121c4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7ca15f5e-f93b-42fe-b243-1965f65121c4" (UID: "7ca15f5e-f93b-42fe-b243-1965f65121c4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.339044 4702 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ca15f5e-f93b-42fe-b243-1965f65121c4-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.339066 4702 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7ca15f5e-f93b-42fe-b243-1965f65121c4-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.342950 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-scripts" (OuterVolumeSpecName: "scripts") pod "7ca15f5e-f93b-42fe-b243-1965f65121c4" (UID: "7ca15f5e-f93b-42fe-b243-1965f65121c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.343136 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ca15f5e-f93b-42fe-b243-1965f65121c4-kube-api-access-7fs86" (OuterVolumeSpecName: "kube-api-access-7fs86") pod "7ca15f5e-f93b-42fe-b243-1965f65121c4" (UID: "7ca15f5e-f93b-42fe-b243-1965f65121c4"). InnerVolumeSpecName "kube-api-access-7fs86". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.375861 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7ca15f5e-f93b-42fe-b243-1965f65121c4" (UID: "7ca15f5e-f93b-42fe-b243-1965f65121c4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.416430 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7ca15f5e-f93b-42fe-b243-1965f65121c4" (UID: "7ca15f5e-f93b-42fe-b243-1965f65121c4"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.440702 4702 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.440734 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.440744 4702 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.440770 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fs86\" (UniqueName: \"kubernetes.io/projected/7ca15f5e-f93b-42fe-b243-1965f65121c4-kube-api-access-7fs86\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.441939 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ca15f5e-f93b-42fe-b243-1965f65121c4" (UID: "7ca15f5e-f93b-42fe-b243-1965f65121c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.471559 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-config-data" (OuterVolumeSpecName: "config-data") pod "7ca15f5e-f93b-42fe-b243-1965f65121c4" (UID: "7ca15f5e-f93b-42fe-b243-1965f65121c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.544189 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.544271 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ca15f5e-f93b-42fe-b243-1965f65121c4-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.826137 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a7698d4e-71ca-4e83-995e-48f9a42c0490","Type":"ContainerStarted","Data":"bd1bcaf26d2ead545e130cc9d4c95a9e7103ff967e024524dad0ef257a2c9943"} Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.839441 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7ca15f5e-f93b-42fe-b243-1965f65121c4","Type":"ContainerDied","Data":"76075c8f884a06da1a9d922b74e6947442ccd08b07bb2e4fb3e0d8629621472e"} Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.839747 4702 scope.go:117] "RemoveContainer" containerID="e0e2b9731c1b0dee5490090ec7f9544735670801c455095df15120ade1500b1e" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.839819 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.896220 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.897372 4702 scope.go:117] "RemoveContainer" containerID="7e38e290353bda13b7b3ccd4a2ca02379cec57a7c4fa497b86a34bfb54086848" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.925112 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.935584 4702 scope.go:117] "RemoveContainer" containerID="b5f9a9bec3106303c78967ee3e60e9f1eb5178cfde0d005b59629278c6e881f7" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.947914 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:33:57 crc kubenswrapper[4702]: E1203 11:33:57.948579 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca15f5e-f93b-42fe-b243-1965f65121c4" containerName="ceilometer-central-agent" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.948596 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca15f5e-f93b-42fe-b243-1965f65121c4" containerName="ceilometer-central-agent" Dec 03 11:33:57 crc kubenswrapper[4702]: E1203 11:33:57.948612 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca15f5e-f93b-42fe-b243-1965f65121c4" containerName="proxy-httpd" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.948618 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca15f5e-f93b-42fe-b243-1965f65121c4" containerName="proxy-httpd" Dec 03 11:33:57 crc kubenswrapper[4702]: E1203 11:33:57.948649 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca15f5e-f93b-42fe-b243-1965f65121c4" containerName="sg-core" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.948655 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca15f5e-f93b-42fe-b243-1965f65121c4" containerName="sg-core" Dec 03 11:33:57 crc kubenswrapper[4702]: E1203 11:33:57.948700 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ca15f5e-f93b-42fe-b243-1965f65121c4" containerName="ceilometer-notification-agent" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.948710 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ca15f5e-f93b-42fe-b243-1965f65121c4" containerName="ceilometer-notification-agent" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.949020 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ca15f5e-f93b-42fe-b243-1965f65121c4" containerName="proxy-httpd" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.949039 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ca15f5e-f93b-42fe-b243-1965f65121c4" containerName="sg-core" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.949064 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ca15f5e-f93b-42fe-b243-1965f65121c4" containerName="ceilometer-central-agent" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.949075 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ca15f5e-f93b-42fe-b243-1965f65121c4" containerName="ceilometer-notification-agent" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.953483 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.965135 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.965265 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.965425 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 11:33:57 crc kubenswrapper[4702]: I1203 11:33:57.966925 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.001579 4702 scope.go:117] "RemoveContainer" containerID="781cc4c337d5c555c62d26e5e3951483486e65b45e54cda705858c6aa345013a" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.026335 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-config-data\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.026415 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-run-httpd\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.026449 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-log-httpd\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.026531 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.026563 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.026597 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.026618 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg2cb\" (UniqueName: \"kubernetes.io/projected/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-kube-api-access-rg2cb\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc 
kubenswrapper[4702]: I1203 11:33:58.026634 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-scripts\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.129335 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-run-httpd\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.129790 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-log-httpd\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.129910 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.130001 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.130067 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.130100 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg2cb\" (UniqueName: \"kubernetes.io/projected/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-kube-api-access-rg2cb\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.130132 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-scripts\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.130191 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-run-httpd\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.130239 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-log-httpd\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: 
I1203 11:33:58.130735 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-config-data\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.139087 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.139592 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-scripts\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.140020 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-config-data\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.140381 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.146629 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.155489 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg2cb\" (UniqueName: \"kubernetes.io/projected/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-kube-api-access-rg2cb\") pod \"ceilometer-0\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.293834 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.863711 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a7698d4e-71ca-4e83-995e-48f9a42c0490","Type":"ContainerStarted","Data":"e1e147ca5c6fac1e409b4d80454cce890cac6b5141a7f89c254b9ffabc663129"} Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.864192 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a7698d4e-71ca-4e83-995e-48f9a42c0490","Type":"ContainerStarted","Data":"c3aa67bc5e3f890c3cbddafa0097a3f9392a9adf35e512a96fe688d023d3d4fb"} Dec 03 11:33:58 crc kubenswrapper[4702]: W1203 11:33:58.931727 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa73e1e2_6ead_4a19_a9a7_446f3dce5e51.slice/crio-bff9e5c8ba29fc68244c72be26afdc8d3528588ec4a8483b501a77236d2b6b33 WatchSource:0}: Error finding container bff9e5c8ba29fc68244c72be26afdc8d3528588ec4a8483b501a77236d2b6b33: Status 404 returned error can't find the container with id bff9e5c8ba29fc68244c72be26afdc8d3528588ec4a8483b501a77236d2b6b33 Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.954352 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ca15f5e-f93b-42fe-b243-1965f65121c4" path="/var/lib/kubelet/pods/7ca15f5e-f93b-42fe-b243-1965f65121c4/volumes" Dec 03 11:33:58 crc kubenswrapper[4702]: I1203 11:33:58.955604 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:34:00 crc kubenswrapper[4702]: I1203 11:34:00.113227 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 11:34:00 crc kubenswrapper[4702]: I1203 11:34:00.113586 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 11:34:00 crc kubenswrapper[4702]: I1203 11:34:00.147629 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51","Type":"ContainerStarted","Data":"bff9e5c8ba29fc68244c72be26afdc8d3528588ec4a8483b501a77236d2b6b33"} Dec 03 11:34:00 crc kubenswrapper[4702]: I1203 11:34:00.929139 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:34:00 crc kubenswrapper[4702]: E1203 11:34:00.929811 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:34:01 crc kubenswrapper[4702]: I1203 11:34:01.302322 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e36431e8-e244-4c9f-92d4-df351ca9157a" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.1:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 11:34:01 crc kubenswrapper[4702]: I1203 11:34:01.312228 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e36431e8-e244-4c9f-92d4-df351ca9157a" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.1:8774/\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" Dec 03 11:34:01 crc kubenswrapper[4702]: I1203 11:34:01.374727 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a7698d4e-71ca-4e83-995e-48f9a42c0490","Type":"ContainerStarted","Data":"54e4e28ef74e6dca85884bc5a31be0fd325934bec4268d6feece7f5d0484c56b"} Dec 03 11:34:01 crc kubenswrapper[4702]: I1203 11:34:01.405398 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51","Type":"ContainerStarted","Data":"6b161e8cfcbcec9fcf2d70af85019391d37b62c61afcd6edb67b11f5cf9b795a"} Dec 03 11:34:05 crc kubenswrapper[4702]: I1203 11:34:05.337929 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51","Type":"ContainerStarted","Data":"e60b69678a5459e7554a2e3a9ea25afa31fe05bafcdef2b429ff0761434b8dec"} Dec 03 11:34:06 crc kubenswrapper[4702]: I1203 11:34:06.288640 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51","Type":"ContainerStarted","Data":"c61e0bdeacef5ab9d3748b3eddaff5993c6ebd0d7e0c4d83697ffb90a320f86d"} Dec 03 11:34:07 crc kubenswrapper[4702]: I1203 11:34:07.201442 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=9.37988287 podStartE2EDuration="13.201382667s" podCreationTimestamp="2025-12-03 11:33:54 +0000 UTC" firstStartedPulling="2025-12-03 11:33:55.715780626 +0000 UTC m=+1819.551709090" lastFinishedPulling="2025-12-03 11:33:59.537280423 +0000 UTC m=+1823.373208887" observedRunningTime="2025-12-03 11:34:01.405771233 +0000 UTC m=+1825.241699697" watchObservedRunningTime="2025-12-03 11:34:07.201382667 +0000 UTC m=+1831.037311131" Dec 03 11:34:07 crc kubenswrapper[4702]: I1203 11:34:07.320308 4702 generic.go:334] "Generic (PLEG): container finished" podID="36eb0771-1605-478d-aad3-44f0e6f0932b" containerID="03a377a86e254cfdced08b0b9895fb8194a6a6b0393febbb1ba40e496277b4d3" exitCode=0 Dec 03 11:34:07 crc kubenswrapper[4702]: I1203 11:34:07.320398 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-24b4q" event={"ID":"36eb0771-1605-478d-aad3-44f0e6f0932b","Type":"ContainerDied","Data":"03a377a86e254cfdced08b0b9895fb8194a6a6b0393febbb1ba40e496277b4d3"} Dec 03 11:34:08 crc kubenswrapper[4702]: I1203 11:34:08.337029 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51","Type":"ContainerStarted","Data":"10b6764393e624d2c417eb58f663001da9509c823d90e97a17e2dbf7658fd428"} Dec 03 11:34:08 crc kubenswrapper[4702]: I1203 11:34:08.337348 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 11:34:08 crc kubenswrapper[4702]: I1203 11:34:08.382677 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.545663752 podStartE2EDuration="11.382643283s" podCreationTimestamp="2025-12-03 11:33:57 +0000 UTC" firstStartedPulling="2025-12-03 11:33:58.93470463 +0000 UTC m=+1822.770633094" lastFinishedPulling="2025-12-03 11:34:07.771684161 +0000 UTC m=+1831.607612625" observedRunningTime="2025-12-03 11:34:08.361866282 +0000 UTC m=+1832.197794746" watchObservedRunningTime="2025-12-03 11:34:08.382643283 +0000 UTC m=+1832.218571767" Dec 03 11:34:09 crc kubenswrapper[4702]: I1203 11:34:09.806754 4702 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7" podUID="4b90477f-d1b5-4f03-ab08-2476d44a9cff" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 11:34:09 crc kubenswrapper[4702]: I1203 11:34:09.865251 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 11:34:09 crc kubenswrapper[4702]: I1203 11:34:09.865491 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 11:34:09 crc kubenswrapper[4702]: I1203 11:34:09.867249 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 11:34:09 crc kubenswrapper[4702]: I1203 11:34:09.869089 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 11:34:09 crc kubenswrapper[4702]: I1203 11:34:09.891454 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 11:34:09 crc kubenswrapper[4702]: I1203 11:34:09.891683 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 11:34:10 crc kubenswrapper[4702]: I1203 11:34:10.491193 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-24b4q" Dec 03 11:34:10 crc kubenswrapper[4702]: I1203 11:34:10.574662 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbhml\" (UniqueName: \"kubernetes.io/projected/36eb0771-1605-478d-aad3-44f0e6f0932b-kube-api-access-sbhml\") pod \"36eb0771-1605-478d-aad3-44f0e6f0932b\" (UID: \"36eb0771-1605-478d-aad3-44f0e6f0932b\") " Dec 03 11:34:10 crc kubenswrapper[4702]: I1203 11:34:10.574850 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36eb0771-1605-478d-aad3-44f0e6f0932b-combined-ca-bundle\") pod \"36eb0771-1605-478d-aad3-44f0e6f0932b\" (UID: \"36eb0771-1605-478d-aad3-44f0e6f0932b\") " Dec 03 11:34:10 crc kubenswrapper[4702]: I1203 11:34:10.574905 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36eb0771-1605-478d-aad3-44f0e6f0932b-config-data\") pod \"36eb0771-1605-478d-aad3-44f0e6f0932b\" (UID: \"36eb0771-1605-478d-aad3-44f0e6f0932b\") " Dec 03 11:34:10 crc kubenswrapper[4702]: I1203 11:34:10.575077 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36eb0771-1605-478d-aad3-44f0e6f0932b-scripts\") pod \"36eb0771-1605-478d-aad3-44f0e6f0932b\" (UID: \"36eb0771-1605-478d-aad3-44f0e6f0932b\") " Dec 03 11:34:10 crc kubenswrapper[4702]: I1203 11:34:10.581799 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36eb0771-1605-478d-aad3-44f0e6f0932b-scripts" (OuterVolumeSpecName: "scripts") pod "36eb0771-1605-478d-aad3-44f0e6f0932b" (UID: "36eb0771-1605-478d-aad3-44f0e6f0932b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:34:10 crc kubenswrapper[4702]: I1203 11:34:10.583315 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36eb0771-1605-478d-aad3-44f0e6f0932b-kube-api-access-sbhml" (OuterVolumeSpecName: "kube-api-access-sbhml") pod "36eb0771-1605-478d-aad3-44f0e6f0932b" (UID: "36eb0771-1605-478d-aad3-44f0e6f0932b"). InnerVolumeSpecName "kube-api-access-sbhml". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:34:10 crc kubenswrapper[4702]: I1203 11:34:10.621105 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36eb0771-1605-478d-aad3-44f0e6f0932b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36eb0771-1605-478d-aad3-44f0e6f0932b" (UID: "36eb0771-1605-478d-aad3-44f0e6f0932b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:34:10 crc kubenswrapper[4702]: I1203 11:34:10.637905 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36eb0771-1605-478d-aad3-44f0e6f0932b-config-data" (OuterVolumeSpecName: "config-data") pod "36eb0771-1605-478d-aad3-44f0e6f0932b" (UID: "36eb0771-1605-478d-aad3-44f0e6f0932b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:34:10 crc kubenswrapper[4702]: I1203 11:34:10.680527 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbhml\" (UniqueName: \"kubernetes.io/projected/36eb0771-1605-478d-aad3-44f0e6f0932b-kube-api-access-sbhml\") on node \"crc\" DevicePath \"\"" Dec 03 11:34:10 crc kubenswrapper[4702]: I1203 11:34:10.680569 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36eb0771-1605-478d-aad3-44f0e6f0932b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:34:10 crc kubenswrapper[4702]: I1203 11:34:10.680580 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36eb0771-1605-478d-aad3-44f0e6f0932b-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:34:10 crc kubenswrapper[4702]: I1203 11:34:10.680589 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36eb0771-1605-478d-aad3-44f0e6f0932b-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:34:10 crc kubenswrapper[4702]: I1203 11:34:10.869790 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-24b4q" event={"ID":"36eb0771-1605-478d-aad3-44f0e6f0932b","Type":"ContainerDied","Data":"52af0d7fb8c6f703494273a8bfc4bb74c5991a7a4c903328f943fee54b83f98b"} Dec 03 11:34:10 crc kubenswrapper[4702]: I1203 11:34:10.870176 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52af0d7fb8c6f703494273a8bfc4bb74c5991a7a4c903328f943fee54b83f98b" Dec 03 11:34:10 crc kubenswrapper[4702]: I1203 11:34:10.869834 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-24b4q" Dec 03 11:34:11 crc kubenswrapper[4702]: I1203 11:34:11.634308 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 11:34:11 crc kubenswrapper[4702]: I1203 11:34:11.656225 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 11:34:11 crc kubenswrapper[4702]: I1203 11:34:11.656869 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="49564875-fe0f-45b7-b72f-f9583f4a80be" containerName="nova-scheduler-scheduler" containerID="cri-o://ccb09db36dd0e53ec59db328ef4a9314eaf337ada7b0844c27bbd190046c42b5" gracePeriod=30 Dec 03 11:34:11 crc kubenswrapper[4702]: I1203 11:34:11.689459 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 11:34:11 crc kubenswrapper[4702]: I1203 11:34:11.690103 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6120d0cd-f573-4f9c-af92-77580df0916c" containerName="nova-metadata-log" containerID="cri-o://8e3f3f4f83fc88492678fdf7759126e99646de8636e9965621cd1c930f76b3df" gracePeriod=30 Dec 03 11:34:11 crc kubenswrapper[4702]: I1203 11:34:11.690125 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6120d0cd-f573-4f9c-af92-77580df0916c" containerName="nova-metadata-metadata" containerID="cri-o://0a5f8c86f21f7382c9cfda399bb4e9ed6d6373d14edf307129a09f7ddfb05023" gracePeriod=30 Dec 03 11:34:11 crc kubenswrapper[4702]: I1203 11:34:11.885913 4702 generic.go:334] "Generic (PLEG): container finished" podID="6120d0cd-f573-4f9c-af92-77580df0916c" containerID="8e3f3f4f83fc88492678fdf7759126e99646de8636e9965621cd1c930f76b3df" exitCode=143 Dec 03 11:34:11 crc kubenswrapper[4702]: I1203 11:34:11.886027 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6120d0cd-f573-4f9c-af92-77580df0916c","Type":"ContainerDied","Data":"8e3f3f4f83fc88492678fdf7759126e99646de8636e9965621cd1c930f76b3df"} Dec 03 11:34:12 crc kubenswrapper[4702]: I1203 11:34:12.898546 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e36431e8-e244-4c9f-92d4-df351ca9157a" containerName="nova-api-log" containerID="cri-o://579978377ef619f15f3295160d671c49b9c10b3f1949e28bb55c6f8d20b40691" gracePeriod=30 Dec 03 11:34:12 crc kubenswrapper[4702]: I1203 11:34:12.898648 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e36431e8-e244-4c9f-92d4-df351ca9157a" containerName="nova-api-api" containerID="cri-o://ceb64211b8aa47bffdd4015e1b443e622391819cab8943a0c92f10aea9746a81" gracePeriod=30 Dec 03 11:34:13 crc kubenswrapper[4702]: I1203 11:34:13.914782 4702 generic.go:334] "Generic (PLEG): container finished" podID="e36431e8-e244-4c9f-92d4-df351ca9157a" containerID="579978377ef619f15f3295160d671c49b9c10b3f1949e28bb55c6f8d20b40691" exitCode=143 Dec 03 11:34:13 crc kubenswrapper[4702]: I1203 11:34:13.915090 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e36431e8-e244-4c9f-92d4-df351ca9157a","Type":"ContainerDied","Data":"579978377ef619f15f3295160d671c49b9c10b3f1949e28bb55c6f8d20b40691"} Dec 03 11:34:14 crc kubenswrapper[4702]: I1203 11:34:14.621419 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 11:34:14 crc kubenswrapper[4702]: I1203 11:34:14.985657 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:34:14 crc kubenswrapper[4702]: E1203 11:34:14.986145 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:34:14 crc kubenswrapper[4702]: I1203 11:34:14.990453 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltm95\" (UniqueName: \"kubernetes.io/projected/49564875-fe0f-45b7-b72f-f9583f4a80be-kube-api-access-ltm95\") pod \"49564875-fe0f-45b7-b72f-f9583f4a80be\" (UID: \"49564875-fe0f-45b7-b72f-f9583f4a80be\") " Dec 03 11:34:14 crc kubenswrapper[4702]: I1203 11:34:14.990558 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49564875-fe0f-45b7-b72f-f9583f4a80be-combined-ca-bundle\") pod \"49564875-fe0f-45b7-b72f-f9583f4a80be\" (UID: \"49564875-fe0f-45b7-b72f-f9583f4a80be\") " Dec 03 11:34:14 crc kubenswrapper[4702]: I1203 11:34:14.990871 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49564875-fe0f-45b7-b72f-f9583f4a80be-config-data\") pod \"49564875-fe0f-45b7-b72f-f9583f4a80be\" (UID: \"49564875-fe0f-45b7-b72f-f9583f4a80be\") " Dec 03 11:34:14 crc kubenswrapper[4702]: I1203 11:34:14.999346 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49564875-fe0f-45b7-b72f-f9583f4a80be-kube-api-access-ltm95" (OuterVolumeSpecName: "kube-api-access-ltm95") pod "49564875-fe0f-45b7-b72f-f9583f4a80be" (UID: "49564875-fe0f-45b7-b72f-f9583f4a80be"). InnerVolumeSpecName "kube-api-access-ltm95". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.019062 4702 generic.go:334] "Generic (PLEG): container finished" podID="49564875-fe0f-45b7-b72f-f9583f4a80be" containerID="ccb09db36dd0e53ec59db328ef4a9314eaf337ada7b0844c27bbd190046c42b5" exitCode=0 Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.019224 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.055005 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49564875-fe0f-45b7-b72f-f9583f4a80be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49564875-fe0f-45b7-b72f-f9583f4a80be" (UID: "49564875-fe0f-45b7-b72f-f9583f4a80be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.068966 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49564875-fe0f-45b7-b72f-f9583f4a80be-config-data" (OuterVolumeSpecName: "config-data") pod "49564875-fe0f-45b7-b72f-f9583f4a80be" (UID: "49564875-fe0f-45b7-b72f-f9583f4a80be"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.097194 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49564875-fe0f-45b7-b72f-f9583f4a80be-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.100091 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltm95\" (UniqueName: \"kubernetes.io/projected/49564875-fe0f-45b7-b72f-f9583f4a80be-kube-api-access-ltm95\") on node \"crc\" DevicePath \"\"" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.100543 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49564875-fe0f-45b7-b72f-f9583f4a80be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.262658 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"49564875-fe0f-45b7-b72f-f9583f4a80be","Type":"ContainerDied","Data":"ccb09db36dd0e53ec59db328ef4a9314eaf337ada7b0844c27bbd190046c42b5"} Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.262729 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"49564875-fe0f-45b7-b72f-f9583f4a80be","Type":"ContainerDied","Data":"2d0aa5a9ac732c1b0cbd1d05f82f8631e3ce0d2a728ddd5cdbcb8d69a07e652d"} Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.262850 4702 scope.go:117] "RemoveContainer" containerID="ccb09db36dd0e53ec59db328ef4a9314eaf337ada7b0844c27bbd190046c42b5" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.302337 4702 scope.go:117] "RemoveContainer" containerID="ccb09db36dd0e53ec59db328ef4a9314eaf337ada7b0844c27bbd190046c42b5" Dec 03 11:34:15 crc kubenswrapper[4702]: E1203 11:34:15.303287 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccb09db36dd0e53ec59db328ef4a9314eaf337ada7b0844c27bbd190046c42b5\": container with ID starting with ccb09db36dd0e53ec59db328ef4a9314eaf337ada7b0844c27bbd190046c42b5 not found: ID does not exist" containerID="ccb09db36dd0e53ec59db328ef4a9314eaf337ada7b0844c27bbd190046c42b5" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.303346 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccb09db36dd0e53ec59db328ef4a9314eaf337ada7b0844c27bbd190046c42b5"} err="failed to get container status \"ccb09db36dd0e53ec59db328ef4a9314eaf337ada7b0844c27bbd190046c42b5\": rpc error: code = NotFound desc = could not find container \"ccb09db36dd0e53ec59db328ef4a9314eaf337ada7b0844c27bbd190046c42b5\": container with ID starting with ccb09db36dd0e53ec59db328ef4a9314eaf337ada7b0844c27bbd190046c42b5 not found: ID does not exist" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.389630 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.433690 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.459022 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 11:34:15 crc kubenswrapper[4702]: E1203 11:34:15.466270 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49564875-fe0f-45b7-b72f-f9583f4a80be" containerName="nova-scheduler-scheduler" Dec 
03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.466379 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="49564875-fe0f-45b7-b72f-f9583f4a80be" containerName="nova-scheduler-scheduler" Dec 03 11:34:15 crc kubenswrapper[4702]: E1203 11:34:15.466474 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36eb0771-1605-478d-aad3-44f0e6f0932b" containerName="nova-manage" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.466487 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="36eb0771-1605-478d-aad3-44f0e6f0932b" containerName="nova-manage" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.467232 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="36eb0771-1605-478d-aad3-44f0e6f0932b" containerName="nova-manage" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.467265 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="49564875-fe0f-45b7-b72f-f9583f4a80be" containerName="nova-scheduler-scheduler" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.468594 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.472943 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.482615 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.539996 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810e51fd-c7d4-4d56-9baf-a1abcad8b348-config-data\") pod \"nova-scheduler-0\" (UID: \"810e51fd-c7d4-4d56-9baf-a1abcad8b348\") " pod="openstack/nova-scheduler-0" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.540089 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810e51fd-c7d4-4d56-9baf-a1abcad8b348-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"810e51fd-c7d4-4d56-9baf-a1abcad8b348\") " pod="openstack/nova-scheduler-0" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.540700 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwbgm\" (UniqueName: \"kubernetes.io/projected/810e51fd-c7d4-4d56-9baf-a1abcad8b348-kube-api-access-dwbgm\") pod \"nova-scheduler-0\" (UID: \"810e51fd-c7d4-4d56-9baf-a1abcad8b348\") " pod="openstack/nova-scheduler-0" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.633824 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.645028 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810e51fd-c7d4-4d56-9baf-a1abcad8b348-config-data\") pod \"nova-scheduler-0\" (UID: \"810e51fd-c7d4-4d56-9baf-a1abcad8b348\") " pod="openstack/nova-scheduler-0" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.645091 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810e51fd-c7d4-4d56-9baf-a1abcad8b348-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"810e51fd-c7d4-4d56-9baf-a1abcad8b348\") " pod="openstack/nova-scheduler-0" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.645325 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwbgm\" (UniqueName: \"kubernetes.io/projected/810e51fd-c7d4-4d56-9baf-a1abcad8b348-kube-api-access-dwbgm\") pod \"nova-scheduler-0\" (UID: \"810e51fd-c7d4-4d56-9baf-a1abcad8b348\") " pod="openstack/nova-scheduler-0" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.654069 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/810e51fd-c7d4-4d56-9baf-a1abcad8b348-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"810e51fd-c7d4-4d56-9baf-a1abcad8b348\") " pod="openstack/nova-scheduler-0" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.654655 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/810e51fd-c7d4-4d56-9baf-a1abcad8b348-config-data\") pod \"nova-scheduler-0\" (UID: \"810e51fd-c7d4-4d56-9baf-a1abcad8b348\") " pod="openstack/nova-scheduler-0" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.688455 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwbgm\" (UniqueName: \"kubernetes.io/projected/810e51fd-c7d4-4d56-9baf-a1abcad8b348-kube-api-access-dwbgm\") pod \"nova-scheduler-0\" (UID: \"810e51fd-c7d4-4d56-9baf-a1abcad8b348\") " pod="openstack/nova-scheduler-0" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.861349 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6120d0cd-f573-4f9c-af92-77580df0916c-logs\") pod \"6120d0cd-f573-4f9c-af92-77580df0916c\" (UID: \"6120d0cd-f573-4f9c-af92-77580df0916c\") " Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.861411 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6120d0cd-f573-4f9c-af92-77580df0916c-combined-ca-bundle\") pod \"6120d0cd-f573-4f9c-af92-77580df0916c\" (UID: \"6120d0cd-f573-4f9c-af92-77580df0916c\") " Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.861445 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhpvm\" (UniqueName: \"kubernetes.io/projected/6120d0cd-f573-4f9c-af92-77580df0916c-kube-api-access-qhpvm\") pod \"6120d0cd-f573-4f9c-af92-77580df0916c\" (UID: \"6120d0cd-f573-4f9c-af92-77580df0916c\") " Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.861691 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6120d0cd-f573-4f9c-af92-77580df0916c-config-data\") 
pod \"6120d0cd-f573-4f9c-af92-77580df0916c\" (UID: \"6120d0cd-f573-4f9c-af92-77580df0916c\") " Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.861805 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6120d0cd-f573-4f9c-af92-77580df0916c-nova-metadata-tls-certs\") pod \"6120d0cd-f573-4f9c-af92-77580df0916c\" (UID: \"6120d0cd-f573-4f9c-af92-77580df0916c\") " Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.863234 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6120d0cd-f573-4f9c-af92-77580df0916c-logs" (OuterVolumeSpecName: "logs") pod "6120d0cd-f573-4f9c-af92-77580df0916c" (UID: "6120d0cd-f573-4f9c-af92-77580df0916c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.873947 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6120d0cd-f573-4f9c-af92-77580df0916c-kube-api-access-qhpvm" (OuterVolumeSpecName: "kube-api-access-qhpvm") pod "6120d0cd-f573-4f9c-af92-77580df0916c" (UID: "6120d0cd-f573-4f9c-af92-77580df0916c"). InnerVolumeSpecName "kube-api-access-qhpvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.933105 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.976673 4702 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6120d0cd-f573-4f9c-af92-77580df0916c-logs\") on node \"crc\" DevicePath \"\"" Dec 03 11:34:15 crc kubenswrapper[4702]: I1203 11:34:15.976915 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhpvm\" (UniqueName: \"kubernetes.io/projected/6120d0cd-f573-4f9c-af92-77580df0916c-kube-api-access-qhpvm\") on node \"crc\" DevicePath \"\"" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.005935 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6120d0cd-f573-4f9c-af92-77580df0916c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6120d0cd-f573-4f9c-af92-77580df0916c" (UID: "6120d0cd-f573-4f9c-af92-77580df0916c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.014955 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6120d0cd-f573-4f9c-af92-77580df0916c-config-data" (OuterVolumeSpecName: "config-data") pod "6120d0cd-f573-4f9c-af92-77580df0916c" (UID: "6120d0cd-f573-4f9c-af92-77580df0916c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.071422 4702 generic.go:334] "Generic (PLEG): container finished" podID="6120d0cd-f573-4f9c-af92-77580df0916c" containerID="0a5f8c86f21f7382c9cfda399bb4e9ed6d6373d14edf307129a09f7ddfb05023" exitCode=0 Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.071518 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6120d0cd-f573-4f9c-af92-77580df0916c","Type":"ContainerDied","Data":"0a5f8c86f21f7382c9cfda399bb4e9ed6d6373d14edf307129a09f7ddfb05023"} Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.071558 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6120d0cd-f573-4f9c-af92-77580df0916c","Type":"ContainerDied","Data":"6916950e477fecf3db8ebfdce67d78c80246c6681772a6f1dbbbf88102636c34"} Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.071579 4702 scope.go:117] "RemoveContainer" containerID="0a5f8c86f21f7382c9cfda399bb4e9ed6d6373d14edf307129a09f7ddfb05023" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.071783 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.078747 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6120d0cd-f573-4f9c-af92-77580df0916c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.079674 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6120d0cd-f573-4f9c-af92-77580df0916c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.083845 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6120d0cd-f573-4f9c-af92-77580df0916c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6120d0cd-f573-4f9c-af92-77580df0916c" (UID: "6120d0cd-f573-4f9c-af92-77580df0916c"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.104548 4702 scope.go:117] "RemoveContainer" containerID="8e3f3f4f83fc88492678fdf7759126e99646de8636e9965621cd1c930f76b3df" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.130169 4702 scope.go:117] "RemoveContainer" containerID="0a5f8c86f21f7382c9cfda399bb4e9ed6d6373d14edf307129a09f7ddfb05023" Dec 03 11:34:16 crc kubenswrapper[4702]: E1203 11:34:16.130813 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a5f8c86f21f7382c9cfda399bb4e9ed6d6373d14edf307129a09f7ddfb05023\": container with ID starting with 0a5f8c86f21f7382c9cfda399bb4e9ed6d6373d14edf307129a09f7ddfb05023 not found: ID does not exist" containerID="0a5f8c86f21f7382c9cfda399bb4e9ed6d6373d14edf307129a09f7ddfb05023" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.130855 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a5f8c86f21f7382c9cfda399bb4e9ed6d6373d14edf307129a09f7ddfb05023"} err="failed to get container status \"0a5f8c86f21f7382c9cfda399bb4e9ed6d6373d14edf307129a09f7ddfb05023\": rpc error: code = NotFound desc = could not find container \"0a5f8c86f21f7382c9cfda399bb4e9ed6d6373d14edf307129a09f7ddfb05023\": container with ID starting with 0a5f8c86f21f7382c9cfda399bb4e9ed6d6373d14edf307129a09f7ddfb05023 not found: ID does not exist" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.130888 4702 scope.go:117] "RemoveContainer" containerID="8e3f3f4f83fc88492678fdf7759126e99646de8636e9965621cd1c930f76b3df" Dec 03 11:34:16 crc kubenswrapper[4702]: E1203 11:34:16.131363 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e3f3f4f83fc88492678fdf7759126e99646de8636e9965621cd1c930f76b3df\": container with ID starting with 8e3f3f4f83fc88492678fdf7759126e99646de8636e9965621cd1c930f76b3df not found: ID does not exist" containerID="8e3f3f4f83fc88492678fdf7759126e99646de8636e9965621cd1c930f76b3df" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.131383 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3f3f4f83fc88492678fdf7759126e99646de8636e9965621cd1c930f76b3df"} err="failed to get container status \"8e3f3f4f83fc88492678fdf7759126e99646de8636e9965621cd1c930f76b3df\": rpc error: code = NotFound desc = could not find container \"8e3f3f4f83fc88492678fdf7759126e99646de8636e9965621cd1c930f76b3df\": container with ID starting with 8e3f3f4f83fc88492678fdf7759126e99646de8636e9965621cd1c930f76b3df not found: ID does not exist" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.182929 4702 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6120d0cd-f573-4f9c-af92-77580df0916c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.439824 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.451721 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.466161 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 11:34:16 crc kubenswrapper[4702]: E1203 11:34:16.466776 4702 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="6120d0cd-f573-4f9c-af92-77580df0916c" containerName="nova-metadata-metadata" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.466798 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="6120d0cd-f573-4f9c-af92-77580df0916c" containerName="nova-metadata-metadata" Dec 03 11:34:16 crc kubenswrapper[4702]: E1203 11:34:16.466848 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6120d0cd-f573-4f9c-af92-77580df0916c" containerName="nova-metadata-log" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.467166 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="6120d0cd-f573-4f9c-af92-77580df0916c" containerName="nova-metadata-log" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.467465 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="6120d0cd-f573-4f9c-af92-77580df0916c" containerName="nova-metadata-log" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.467489 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="6120d0cd-f573-4f9c-af92-77580df0916c" containerName="nova-metadata-metadata" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.469208 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.474718 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.475037 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.479588 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.496777 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eef79cba-d523-4825-9524-26f53553b618-logs\") pod \"nova-metadata-0\" (UID: \"eef79cba-d523-4825-9524-26f53553b618\") " pod="openstack/nova-metadata-0" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.496882 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dlf4\" (UniqueName: \"kubernetes.io/projected/eef79cba-d523-4825-9524-26f53553b618-kube-api-access-7dlf4\") pod \"nova-metadata-0\" (UID: \"eef79cba-d523-4825-9524-26f53553b618\") " pod="openstack/nova-metadata-0" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.497627 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef79cba-d523-4825-9524-26f53553b618-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eef79cba-d523-4825-9524-26f53553b618\") " pod="openstack/nova-metadata-0" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.499058 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef79cba-d523-4825-9524-26f53553b618-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eef79cba-d523-4825-9524-26f53553b618\") " pod="openstack/nova-metadata-0" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.499388 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/eef79cba-d523-4825-9524-26f53553b618-config-data\") pod \"nova-metadata-0\" (UID: \"eef79cba-d523-4825-9524-26f53553b618\") " pod="openstack/nova-metadata-0" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.962688 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef79cba-d523-4825-9524-26f53553b618-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eef79cba-d523-4825-9524-26f53553b618\") " pod="openstack/nova-metadata-0" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.962725 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef79cba-d523-4825-9524-26f53553b618-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eef79cba-d523-4825-9524-26f53553b618\") " pod="openstack/nova-metadata-0" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.962966 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef79cba-d523-4825-9524-26f53553b618-config-data\") pod \"nova-metadata-0\" (UID: \"eef79cba-d523-4825-9524-26f53553b618\") " pod="openstack/nova-metadata-0" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.963042 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eef79cba-d523-4825-9524-26f53553b618-logs\") pod \"nova-metadata-0\" (UID: \"eef79cba-d523-4825-9524-26f53553b618\") " pod="openstack/nova-metadata-0" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.963099 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dlf4\" (UniqueName: \"kubernetes.io/projected/eef79cba-d523-4825-9524-26f53553b618-kube-api-access-7dlf4\") pod \"nova-metadata-0\" (UID: \"eef79cba-d523-4825-9524-26f53553b618\") " pod="openstack/nova-metadata-0" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.968304 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eef79cba-d523-4825-9524-26f53553b618-logs\") pod \"nova-metadata-0\" (UID: \"eef79cba-d523-4825-9524-26f53553b618\") " pod="openstack/nova-metadata-0" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.981231 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eef79cba-d523-4825-9524-26f53553b618-config-data\") pod \"nova-metadata-0\" (UID: \"eef79cba-d523-4825-9524-26f53553b618\") " pod="openstack/nova-metadata-0" Dec 03 11:34:16 crc kubenswrapper[4702]: I1203 11:34:16.989066 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/eef79cba-d523-4825-9524-26f53553b618-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"eef79cba-d523-4825-9524-26f53553b618\") " pod="openstack/nova-metadata-0" Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.002618 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dlf4\" (UniqueName: \"kubernetes.io/projected/eef79cba-d523-4825-9524-26f53553b618-kube-api-access-7dlf4\") pod \"nova-metadata-0\" (UID: \"eef79cba-d523-4825-9524-26f53553b618\") " pod="openstack/nova-metadata-0" Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.004938 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef79cba-d523-4825-9524-26f53553b618-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"eef79cba-d523-4825-9524-26f53553b618\") " pod="openstack/nova-metadata-0" Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.009302 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49564875-fe0f-45b7-b72f-f9583f4a80be" path="/var/lib/kubelet/pods/49564875-fe0f-45b7-b72f-f9583f4a80be/volumes" Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.010049 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6120d0cd-f573-4f9c-af92-77580df0916c" path="/var/lib/kubelet/pods/6120d0cd-f573-4f9c-af92-77580df0916c/volumes" Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.071293 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.099235 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.119319 4702 generic.go:334] "Generic (PLEG): container finished" podID="e36431e8-e244-4c9f-92d4-df351ca9157a" containerID="ceb64211b8aa47bffdd4015e1b443e622391819cab8943a0c92f10aea9746a81" exitCode=0 Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.119379 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e36431e8-e244-4c9f-92d4-df351ca9157a","Type":"ContainerDied","Data":"ceb64211b8aa47bffdd4015e1b443e622391819cab8943a0c92f10aea9746a81"} Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.121319 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"810e51fd-c7d4-4d56-9baf-a1abcad8b348","Type":"ContainerStarted","Data":"7f2f1a2a4eadc0aa45d10f35f229fe4d685703a417306a43c7b2c7e405323edb"} Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.352446 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.488356 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-internal-tls-certs\") pod \"e36431e8-e244-4c9f-92d4-df351ca9157a\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.489098 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-public-tls-certs\") pod \"e36431e8-e244-4c9f-92d4-df351ca9157a\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.489161 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e36431e8-e244-4c9f-92d4-df351ca9157a-logs\") pod \"e36431e8-e244-4c9f-92d4-df351ca9157a\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.489193 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs88x\" (UniqueName: \"kubernetes.io/projected/e36431e8-e244-4c9f-92d4-df351ca9157a-kube-api-access-hs88x\") pod \"e36431e8-e244-4c9f-92d4-df351ca9157a\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.489239 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-combined-ca-bundle\") pod \"e36431e8-e244-4c9f-92d4-df351ca9157a\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.489433 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-config-data\") pod \"e36431e8-e244-4c9f-92d4-df351ca9157a\" (UID: \"e36431e8-e244-4c9f-92d4-df351ca9157a\") " Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.492209 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e36431e8-e244-4c9f-92d4-df351ca9157a-logs" (OuterVolumeSpecName: "logs") pod "e36431e8-e244-4c9f-92d4-df351ca9157a" (UID: "e36431e8-e244-4c9f-92d4-df351ca9157a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.496116 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e36431e8-e244-4c9f-92d4-df351ca9157a-kube-api-access-hs88x" (OuterVolumeSpecName: "kube-api-access-hs88x") pod "e36431e8-e244-4c9f-92d4-df351ca9157a" (UID: "e36431e8-e244-4c9f-92d4-df351ca9157a"). InnerVolumeSpecName "kube-api-access-hs88x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.530587 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e36431e8-e244-4c9f-92d4-df351ca9157a" (UID: "e36431e8-e244-4c9f-92d4-df351ca9157a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.531522 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-config-data" (OuterVolumeSpecName: "config-data") pod "e36431e8-e244-4c9f-92d4-df351ca9157a" (UID: "e36431e8-e244-4c9f-92d4-df351ca9157a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.583197 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e36431e8-e244-4c9f-92d4-df351ca9157a" (UID: "e36431e8-e244-4c9f-92d4-df351ca9157a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.594256 4702 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e36431e8-e244-4c9f-92d4-df351ca9157a-logs\") on node \"crc\" DevicePath \"\"" Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.594299 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs88x\" (UniqueName: \"kubernetes.io/projected/e36431e8-e244-4c9f-92d4-df351ca9157a-kube-api-access-hs88x\") on node \"crc\" DevicePath \"\"" Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.594316 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.594328 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.594340 4702 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.595284 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e36431e8-e244-4c9f-92d4-df351ca9157a" (UID: "e36431e8-e244-4c9f-92d4-df351ca9157a"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.696699 4702 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e36431e8-e244-4c9f-92d4-df351ca9157a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 11:34:17 crc kubenswrapper[4702]: I1203 11:34:17.773310 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 11:34:18 crc kubenswrapper[4702]: I1203 11:34:18.149621 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e36431e8-e244-4c9f-92d4-df351ca9157a","Type":"ContainerDied","Data":"4a06719f8dec31549a2faad3fb04073c7b69dd53f902bb14541180a6f2dd0eab"} Dec 03 11:34:18 crc kubenswrapper[4702]: I1203 11:34:18.149695 4702 scope.go:117] "RemoveContainer" containerID="ceb64211b8aa47bffdd4015e1b443e622391819cab8943a0c92f10aea9746a81" Dec 03 11:34:18 crc kubenswrapper[4702]: I1203 11:34:18.150033 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 11:34:18 crc kubenswrapper[4702]: I1203 11:34:18.156471 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"810e51fd-c7d4-4d56-9baf-a1abcad8b348","Type":"ContainerStarted","Data":"213e903ed27110aef226ce11748fa181e1a022807a9c5a477f7c8344a2391ae1"} Dec 03 11:34:18 crc kubenswrapper[4702]: I1203 11:34:18.163473 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eef79cba-d523-4825-9524-26f53553b618","Type":"ContainerStarted","Data":"9f28e90184cde6b8ba799371ebdb9b55a8e3e1d45f5c6cb54657d543a2f19609"} Dec 03 11:34:18 crc kubenswrapper[4702]: I1203 11:34:18.163559 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eef79cba-d523-4825-9524-26f53553b618","Type":"ContainerStarted","Data":"543c1ae8d104ccc0fcd447e8efb264219d7d753a02cd5923355ff5a4d3806db8"} Dec 03 11:34:18 crc kubenswrapper[4702]: I1203 11:34:18.194080 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.194054248 podStartE2EDuration="3.194054248s" podCreationTimestamp="2025-12-03 11:34:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:34:18.182500499 +0000 UTC m=+1842.018428963" watchObservedRunningTime="2025-12-03 11:34:18.194054248 +0000 UTC m=+1842.029982712" Dec 03 11:34:18 crc kubenswrapper[4702]: I1203 11:34:18.214335 4702 scope.go:117] "RemoveContainer" containerID="579978377ef619f15f3295160d671c49b9c10b3f1949e28bb55c6f8d20b40691" Dec 03 11:34:18 crc kubenswrapper[4702]: I1203 11:34:18.320959 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 11:34:18 crc kubenswrapper[4702]: I1203 11:34:18.826071 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 11:34:18 crc kubenswrapper[4702]: I1203 11:34:18.873374 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 11:34:18 crc kubenswrapper[4702]: E1203 11:34:18.874142 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e36431e8-e244-4c9f-92d4-df351ca9157a" containerName="nova-api-api" Dec 03 11:34:18 crc kubenswrapper[4702]: I1203 11:34:18.874166 4702 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e36431e8-e244-4c9f-92d4-df351ca9157a" containerName="nova-api-api" Dec 03 11:34:18 crc kubenswrapper[4702]: E1203 11:34:18.874200 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e36431e8-e244-4c9f-92d4-df351ca9157a" containerName="nova-api-log" Dec 03 11:34:18 crc kubenswrapper[4702]: I1203 11:34:18.874207 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="e36431e8-e244-4c9f-92d4-df351ca9157a" containerName="nova-api-log" Dec 03 11:34:18 crc kubenswrapper[4702]: I1203 11:34:18.874499 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="e36431e8-e244-4c9f-92d4-df351ca9157a" containerName="nova-api-api" Dec 03 11:34:18 crc kubenswrapper[4702]: I1203 11:34:18.874524 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="e36431e8-e244-4c9f-92d4-df351ca9157a" containerName="nova-api-log" Dec 03 11:34:18 crc kubenswrapper[4702]: I1203 11:34:18.876880 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 11:34:18 crc kubenswrapper[4702]: I1203 11:34:18.882091 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 03 11:34:18 crc kubenswrapper[4702]: I1203 11:34:18.882287 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 11:34:18 crc kubenswrapper[4702]: I1203 11:34:18.882353 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 03 11:34:18 crc kubenswrapper[4702]: I1203 11:34:18.886451 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 11:34:18 crc kubenswrapper[4702]: I1203 11:34:18.943583 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e36431e8-e244-4c9f-92d4-df351ca9157a" path="/var/lib/kubelet/pods/e36431e8-e244-4c9f-92d4-df351ca9157a/volumes" Dec 03 11:34:19 crc kubenswrapper[4702]: I1203 11:34:19.019833 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7455976c-e312-4b2a-963f-6e75d428c41c-public-tls-certs\") pod \"nova-api-0\" (UID: \"7455976c-e312-4b2a-963f-6e75d428c41c\") " pod="openstack/nova-api-0" Dec 03 11:34:19 crc kubenswrapper[4702]: I1203 11:34:19.020366 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7455976c-e312-4b2a-963f-6e75d428c41c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7455976c-e312-4b2a-963f-6e75d428c41c\") " pod="openstack/nova-api-0" Dec 03 11:34:19 crc kubenswrapper[4702]: I1203 11:34:19.020605 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kgst\" (UniqueName: \"kubernetes.io/projected/7455976c-e312-4b2a-963f-6e75d428c41c-kube-api-access-6kgst\") pod \"nova-api-0\" (UID: \"7455976c-e312-4b2a-963f-6e75d428c41c\") " pod="openstack/nova-api-0" Dec 03 11:34:19 crc kubenswrapper[4702]: I1203 11:34:19.021147 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7455976c-e312-4b2a-963f-6e75d428c41c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7455976c-e312-4b2a-963f-6e75d428c41c\") " pod="openstack/nova-api-0" Dec 03 11:34:19 crc kubenswrapper[4702]: I1203 11:34:19.021189 4702 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7455976c-e312-4b2a-963f-6e75d428c41c-config-data\") pod \"nova-api-0\" (UID: \"7455976c-e312-4b2a-963f-6e75d428c41c\") " pod="openstack/nova-api-0" Dec 03 11:34:19 crc kubenswrapper[4702]: I1203 11:34:19.021394 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7455976c-e312-4b2a-963f-6e75d428c41c-logs\") pod \"nova-api-0\" (UID: \"7455976c-e312-4b2a-963f-6e75d428c41c\") " pod="openstack/nova-api-0" Dec 03 11:34:19 crc kubenswrapper[4702]: I1203 11:34:19.123274 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7455976c-e312-4b2a-963f-6e75d428c41c-logs\") pod \"nova-api-0\" (UID: \"7455976c-e312-4b2a-963f-6e75d428c41c\") " pod="openstack/nova-api-0" Dec 03 11:34:19 crc kubenswrapper[4702]: I1203 11:34:19.123485 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7455976c-e312-4b2a-963f-6e75d428c41c-public-tls-certs\") pod \"nova-api-0\" (UID: \"7455976c-e312-4b2a-963f-6e75d428c41c\") " pod="openstack/nova-api-0" Dec 03 11:34:19 crc kubenswrapper[4702]: I1203 11:34:19.123577 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7455976c-e312-4b2a-963f-6e75d428c41c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7455976c-e312-4b2a-963f-6e75d428c41c\") " pod="openstack/nova-api-0" Dec 03 11:34:19 crc kubenswrapper[4702]: I1203 11:34:19.123631 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kgst\" (UniqueName: \"kubernetes.io/projected/7455976c-e312-4b2a-963f-6e75d428c41c-kube-api-access-6kgst\") pod \"nova-api-0\" (UID: \"7455976c-e312-4b2a-963f-6e75d428c41c\") " pod="openstack/nova-api-0" Dec 03 11:34:19 crc kubenswrapper[4702]: I1203 11:34:19.123678 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7455976c-e312-4b2a-963f-6e75d428c41c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7455976c-e312-4b2a-963f-6e75d428c41c\") " pod="openstack/nova-api-0" Dec 03 11:34:19 crc kubenswrapper[4702]: I1203 11:34:19.123732 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7455976c-e312-4b2a-963f-6e75d428c41c-config-data\") pod \"nova-api-0\" (UID: \"7455976c-e312-4b2a-963f-6e75d428c41c\") " pod="openstack/nova-api-0" Dec 03 11:34:19 crc kubenswrapper[4702]: I1203 11:34:19.124656 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7455976c-e312-4b2a-963f-6e75d428c41c-logs\") pod \"nova-api-0\" (UID: \"7455976c-e312-4b2a-963f-6e75d428c41c\") " pod="openstack/nova-api-0" Dec 03 11:34:19 crc kubenswrapper[4702]: I1203 11:34:19.129028 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7455976c-e312-4b2a-963f-6e75d428c41c-internal-tls-certs\") pod \"nova-api-0\" (UID: \"7455976c-e312-4b2a-963f-6e75d428c41c\") " pod="openstack/nova-api-0" Dec 03 11:34:19 crc kubenswrapper[4702]: I1203 11:34:19.129413 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/7455976c-e312-4b2a-963f-6e75d428c41c-config-data\") pod \"nova-api-0\" (UID: \"7455976c-e312-4b2a-963f-6e75d428c41c\") " pod="openstack/nova-api-0" Dec 03 11:34:19 crc kubenswrapper[4702]: I1203 11:34:19.130200 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7455976c-e312-4b2a-963f-6e75d428c41c-public-tls-certs\") pod \"nova-api-0\" (UID: \"7455976c-e312-4b2a-963f-6e75d428c41c\") " pod="openstack/nova-api-0" Dec 03 11:34:19 crc kubenswrapper[4702]: I1203 11:34:19.140686 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7455976c-e312-4b2a-963f-6e75d428c41c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7455976c-e312-4b2a-963f-6e75d428c41c\") " pod="openstack/nova-api-0" Dec 03 11:34:19 crc kubenswrapper[4702]: I1203 11:34:19.148717 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kgst\" (UniqueName: \"kubernetes.io/projected/7455976c-e312-4b2a-963f-6e75d428c41c-kube-api-access-6kgst\") pod \"nova-api-0\" (UID: \"7455976c-e312-4b2a-963f-6e75d428c41c\") " pod="openstack/nova-api-0" Dec 03 11:34:19 crc kubenswrapper[4702]: I1203 11:34:19.199854 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"eef79cba-d523-4825-9524-26f53553b618","Type":"ContainerStarted","Data":"51c9024ad8170b3a111d5f11769c537f0b639f7c1a243d215a1393ef7848f2d0"} Dec 03 11:34:19 crc kubenswrapper[4702]: I1203 11:34:19.205741 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 11:34:19 crc kubenswrapper[4702]: I1203 11:34:19.234257 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.234234076 podStartE2EDuration="3.234234076s" podCreationTimestamp="2025-12-03 11:34:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:34:19.223918032 +0000 UTC m=+1843.059846496" watchObservedRunningTime="2025-12-03 11:34:19.234234076 +0000 UTC m=+1843.070162540" Dec 03 11:34:20 crc kubenswrapper[4702]: I1203 11:34:20.140818 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 11:34:20 crc kubenswrapper[4702]: W1203 11:34:20.149295 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7455976c_e312_4b2a_963f_6e75d428c41c.slice/crio-ebed6c2b04c63239e67a8a4f0f55c5da0795287a81225115c6f61f24eececb7c WatchSource:0}: Error finding container ebed6c2b04c63239e67a8a4f0f55c5da0795287a81225115c6f61f24eececb7c: Status 404 returned error can't find the container with id ebed6c2b04c63239e67a8a4f0f55c5da0795287a81225115c6f61f24eececb7c Dec 03 11:34:20 crc kubenswrapper[4702]: I1203 11:34:20.214672 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7455976c-e312-4b2a-963f-6e75d428c41c","Type":"ContainerStarted","Data":"ebed6c2b04c63239e67a8a4f0f55c5da0795287a81225115c6f61f24eececb7c"} Dec 03 11:34:21 crc kubenswrapper[4702]: I1203 11:34:21.227139 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 11:34:22 crc kubenswrapper[4702]: I1203 11:34:22.180037 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/nova-metadata-0" Dec 03 11:34:22 crc kubenswrapper[4702]: I1203 11:34:22.180412 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 11:34:22 crc kubenswrapper[4702]: I1203 11:34:22.247811 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7455976c-e312-4b2a-963f-6e75d428c41c","Type":"ContainerStarted","Data":"d4829193705359cd1676ce79a2c7cd6aed67325069f1da2aac6eb52e74a06632"} Dec 03 11:34:22 crc kubenswrapper[4702]: I1203 11:34:22.248167 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7455976c-e312-4b2a-963f-6e75d428c41c","Type":"ContainerStarted","Data":"ea0d5ca59edab0195d1598af0b5edb670d076f545c254f4f3729aa32bcb1e428"} Dec 03 11:34:22 crc kubenswrapper[4702]: I1203 11:34:22.291201 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.291147315 podStartE2EDuration="4.291147315s" podCreationTimestamp="2025-12-03 11:34:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:34:22.268608523 +0000 UTC m=+1846.104536987" watchObservedRunningTime="2025-12-03 11:34:22.291147315 +0000 UTC m=+1846.127075779" Dec 03 11:34:25 crc kubenswrapper[4702]: I1203 11:34:25.929215 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:34:25 crc kubenswrapper[4702]: E1203 11:34:25.930235 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:34:25 crc kubenswrapper[4702]: I1203 11:34:25.936322 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 11:34:25 crc kubenswrapper[4702]: I1203 11:34:25.970454 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 11:34:26 crc kubenswrapper[4702]: I1203 11:34:26.326540 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 11:34:27 crc kubenswrapper[4702]: I1203 11:34:27.102094 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 11:34:27 crc kubenswrapper[4702]: I1203 11:34:27.103625 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 11:34:28 crc kubenswrapper[4702]: I1203 11:34:28.121054 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="eef79cba-d523-4825-9524-26f53553b618" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.6:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 11:34:28 crc kubenswrapper[4702]: I1203 11:34:28.121270 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="eef79cba-d523-4825-9524-26f53553b618" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.6:8775/\": 
net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 11:34:28 crc kubenswrapper[4702]: I1203 11:34:28.317195 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 11:34:29 crc kubenswrapper[4702]: I1203 11:34:29.206271 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 11:34:29 crc kubenswrapper[4702]: I1203 11:34:29.206342 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 11:34:30 crc kubenswrapper[4702]: I1203 11:34:30.218998 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7455976c-e312-4b2a-963f-6e75d428c41c" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.7:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 11:34:30 crc kubenswrapper[4702]: I1203 11:34:30.219023 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7455976c-e312-4b2a-963f-6e75d428c41c" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.7:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 11:34:37 crc kubenswrapper[4702]: I1203 11:34:37.108049 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 11:34:37 crc kubenswrapper[4702]: I1203 11:34:37.108695 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 11:34:37 crc kubenswrapper[4702]: I1203 11:34:37.114698 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 11:34:37 crc kubenswrapper[4702]: I1203 11:34:37.115113 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 11:34:39 crc kubenswrapper[4702]: I1203 11:34:39.214216 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 11:34:39 crc kubenswrapper[4702]: I1203 11:34:39.214840 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 11:34:39 crc kubenswrapper[4702]: I1203 11:34:39.218342 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 11:34:39 crc kubenswrapper[4702]: I1203 11:34:39.219584 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 11:34:39 crc kubenswrapper[4702]: I1203 11:34:39.452801 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 11:34:39 crc kubenswrapper[4702]: I1203 11:34:39.461094 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 11:34:39 crc kubenswrapper[4702]: I1203 11:34:39.928553 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:34:39 crc kubenswrapper[4702]: E1203 11:34:39.928929 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:34:50 crc kubenswrapper[4702]: I1203 11:34:50.622423 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-wjjt9"] Dec 03 11:34:50 crc kubenswrapper[4702]: I1203 11:34:50.635722 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-wjjt9"] Dec 03 11:34:50 crc kubenswrapper[4702]: I1203 11:34:50.714972 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-gq8tr"] Dec 03 11:34:50 crc kubenswrapper[4702]: I1203 11:34:50.717299 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-gq8tr" Dec 03 11:34:50 crc kubenswrapper[4702]: I1203 11:34:50.729548 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-gq8tr"] Dec 03 11:34:50 crc kubenswrapper[4702]: I1203 11:34:50.749705 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf7ffe2-0a74-42cd-ab56-1a65d1317f54-config-data\") pod \"heat-db-sync-gq8tr\" (UID: \"8cf7ffe2-0a74-42cd-ab56-1a65d1317f54\") " pod="openstack/heat-db-sync-gq8tr" Dec 03 11:34:50 crc kubenswrapper[4702]: I1203 11:34:50.749812 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf7ffe2-0a74-42cd-ab56-1a65d1317f54-combined-ca-bundle\") pod \"heat-db-sync-gq8tr\" (UID: \"8cf7ffe2-0a74-42cd-ab56-1a65d1317f54\") " pod="openstack/heat-db-sync-gq8tr" Dec 03 11:34:50 crc kubenswrapper[4702]: I1203 11:34:50.749880 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scwrv\" (UniqueName: \"kubernetes.io/projected/8cf7ffe2-0a74-42cd-ab56-1a65d1317f54-kube-api-access-scwrv\") pod \"heat-db-sync-gq8tr\" (UID: \"8cf7ffe2-0a74-42cd-ab56-1a65d1317f54\") " pod="openstack/heat-db-sync-gq8tr" Dec 03 11:34:50 crc kubenswrapper[4702]: I1203 11:34:50.899444 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf7ffe2-0a74-42cd-ab56-1a65d1317f54-config-data\") pod \"heat-db-sync-gq8tr\" (UID: \"8cf7ffe2-0a74-42cd-ab56-1a65d1317f54\") " pod="openstack/heat-db-sync-gq8tr" Dec 03 11:34:50 crc kubenswrapper[4702]: I1203 11:34:50.899653 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf7ffe2-0a74-42cd-ab56-1a65d1317f54-combined-ca-bundle\") pod \"heat-db-sync-gq8tr\" (UID: \"8cf7ffe2-0a74-42cd-ab56-1a65d1317f54\") " pod="openstack/heat-db-sync-gq8tr" Dec 03 11:34:50 crc kubenswrapper[4702]: I1203 11:34:50.899820 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scwrv\" (UniqueName: \"kubernetes.io/projected/8cf7ffe2-0a74-42cd-ab56-1a65d1317f54-kube-api-access-scwrv\") pod \"heat-db-sync-gq8tr\" (UID: \"8cf7ffe2-0a74-42cd-ab56-1a65d1317f54\") " pod="openstack/heat-db-sync-gq8tr" Dec 03 11:34:50 crc kubenswrapper[4702]: I1203 11:34:50.911747 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf7ffe2-0a74-42cd-ab56-1a65d1317f54-combined-ca-bundle\") pod \"heat-db-sync-gq8tr\" (UID: \"8cf7ffe2-0a74-42cd-ab56-1a65d1317f54\") " 
pod="openstack/heat-db-sync-gq8tr" Dec 03 11:34:50 crc kubenswrapper[4702]: I1203 11:34:50.922830 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf7ffe2-0a74-42cd-ab56-1a65d1317f54-config-data\") pod \"heat-db-sync-gq8tr\" (UID: \"8cf7ffe2-0a74-42cd-ab56-1a65d1317f54\") " pod="openstack/heat-db-sync-gq8tr" Dec 03 11:34:50 crc kubenswrapper[4702]: I1203 11:34:50.938358 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scwrv\" (UniqueName: \"kubernetes.io/projected/8cf7ffe2-0a74-42cd-ab56-1a65d1317f54-kube-api-access-scwrv\") pod \"heat-db-sync-gq8tr\" (UID: \"8cf7ffe2-0a74-42cd-ab56-1a65d1317f54\") " pod="openstack/heat-db-sync-gq8tr" Dec 03 11:34:50 crc kubenswrapper[4702]: I1203 11:34:50.992671 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c60c306-2c56-44e4-8482-e5a72eccd765" path="/var/lib/kubelet/pods/6c60c306-2c56-44e4-8482-e5a72eccd765/volumes" Dec 03 11:34:51 crc kubenswrapper[4702]: I1203 11:34:51.053393 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-gq8tr" Dec 03 11:34:51 crc kubenswrapper[4702]: I1203 11:34:51.559639 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-gq8tr"] Dec 03 11:34:51 crc kubenswrapper[4702]: I1203 11:34:51.972817 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-gq8tr" event={"ID":"8cf7ffe2-0a74-42cd-ab56-1a65d1317f54","Type":"ContainerStarted","Data":"cf7859403e9b77fbc6b05044e5b53b2b658b958a3f5d6a634ad3dbee3cb03e89"} Dec 03 11:34:52 crc kubenswrapper[4702]: I1203 11:34:52.138431 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 11:34:53 crc kubenswrapper[4702]: I1203 11:34:53.769270 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 11:34:54 crc kubenswrapper[4702]: I1203 11:34:54.186405 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:34:54 crc kubenswrapper[4702]: I1203 11:34:54.186743 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" containerName="ceilometer-central-agent" containerID="cri-o://6b161e8cfcbcec9fcf2d70af85019391d37b62c61afcd6edb67b11f5cf9b795a" gracePeriod=30 Dec 03 11:34:54 crc kubenswrapper[4702]: I1203 11:34:54.187274 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" containerName="proxy-httpd" containerID="cri-o://10b6764393e624d2c417eb58f663001da9509c823d90e97a17e2dbf7658fd428" gracePeriod=30 Dec 03 11:34:54 crc kubenswrapper[4702]: I1203 11:34:54.187324 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" containerName="sg-core" containerID="cri-o://c61e0bdeacef5ab9d3748b3eddaff5993c6ebd0d7e0c4d83697ffb90a320f86d" gracePeriod=30 Dec 03 11:34:54 crc kubenswrapper[4702]: I1203 11:34:54.187361 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" containerName="ceilometer-notification-agent" containerID="cri-o://e60b69678a5459e7554a2e3a9ea25afa31fe05bafcdef2b429ff0761434b8dec" gracePeriod=30 Dec 03 11:34:54 crc kubenswrapper[4702]: I1203 
11:34:54.932415 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:34:54 crc kubenswrapper[4702]: E1203 11:34:54.940843 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:34:55 crc kubenswrapper[4702]: I1203 11:34:55.195349 4702 generic.go:334] "Generic (PLEG): container finished" podID="aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" containerID="10b6764393e624d2c417eb58f663001da9509c823d90e97a17e2dbf7658fd428" exitCode=0 Dec 03 11:34:55 crc kubenswrapper[4702]: I1203 11:34:55.195666 4702 generic.go:334] "Generic (PLEG): container finished" podID="aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" containerID="c61e0bdeacef5ab9d3748b3eddaff5993c6ebd0d7e0c4d83697ffb90a320f86d" exitCode=2 Dec 03 11:34:55 crc kubenswrapper[4702]: I1203 11:34:55.195681 4702 generic.go:334] "Generic (PLEG): container finished" podID="aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" containerID="6b161e8cfcbcec9fcf2d70af85019391d37b62c61afcd6edb67b11f5cf9b795a" exitCode=0 Dec 03 11:34:55 crc kubenswrapper[4702]: I1203 11:34:55.195712 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51","Type":"ContainerDied","Data":"10b6764393e624d2c417eb58f663001da9509c823d90e97a17e2dbf7658fd428"} Dec 03 11:34:55 crc kubenswrapper[4702]: I1203 11:34:55.195742 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51","Type":"ContainerDied","Data":"c61e0bdeacef5ab9d3748b3eddaff5993c6ebd0d7e0c4d83697ffb90a320f86d"} Dec 03 11:34:55 crc kubenswrapper[4702]: I1203 11:34:55.195751 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51","Type":"ContainerDied","Data":"6b161e8cfcbcec9fcf2d70af85019391d37b62c61afcd6edb67b11f5cf9b795a"} Dec 03 11:34:55 crc kubenswrapper[4702]: I1203 11:34:55.606193 4702 scope.go:117] "RemoveContainer" containerID="8a4ee33f948589467c0af31fc62697afcbea2ea853b560f7e022894353a8c1c2" Dec 03 11:34:55 crc kubenswrapper[4702]: I1203 11:34:55.682958 4702 scope.go:117] "RemoveContainer" containerID="d4b436d57fea09c21957601613ce4d8daa58090505dc302743cb7d66267f683d" Dec 03 11:34:58 crc kubenswrapper[4702]: I1203 11:34:58.327843 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.1.4:3000/\": dial tcp 10.217.1.4:3000: connect: connection refused" Dec 03 11:34:58 crc kubenswrapper[4702]: I1203 11:34:58.376511 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="85f53e1b-50d1-4249-ba44-5b2e5982ae36" containerName="rabbitmq" containerID="cri-o://ede35561174b293853d63b2d4a6b0b2d7cbdbdf59506eb2c648bf64fa62ebb67" gracePeriod=604794 Dec 03 11:34:58 crc kubenswrapper[4702]: I1203 11:34:58.928296 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="85f53e1b-50d1-4249-ba44-5b2e5982ae36" containerName="rabbitmq" 
probeResult="failure" output="dial tcp 10.217.0.128:5671: connect: connection refused" Dec 03 11:34:59 crc kubenswrapper[4702]: I1203 11:34:59.259564 4702 generic.go:334] "Generic (PLEG): container finished" podID="aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" containerID="e60b69678a5459e7554a2e3a9ea25afa31fe05bafcdef2b429ff0761434b8dec" exitCode=0 Dec 03 11:34:59 crc kubenswrapper[4702]: I1203 11:34:59.259611 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51","Type":"ContainerDied","Data":"e60b69678a5459e7554a2e3a9ea25afa31fe05bafcdef2b429ff0761434b8dec"} Dec 03 11:34:59 crc kubenswrapper[4702]: I1203 11:34:59.413519 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" containerName="rabbitmq" containerID="cri-o://8f3a164c405e3bcb963ecdafe18b9c7d08e2a67a14287fd0dd6a702abcc9f3d8" gracePeriod=604795 Dec 03 11:34:59 crc kubenswrapper[4702]: I1203 11:34:59.526962 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.129:5671: connect: connection refused" Dec 03 11:35:05 crc kubenswrapper[4702]: I1203 11:35:05.751453 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:35:05 crc kubenswrapper[4702]: I1203 11:35:05.813981 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-scripts\") pod \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " Dec 03 11:35:05 crc kubenswrapper[4702]: I1203 11:35:05.814796 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-log-httpd\") pod \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " Dec 03 11:35:05 crc kubenswrapper[4702]: I1203 11:35:05.814917 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-combined-ca-bundle\") pod \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " Dec 03 11:35:05 crc kubenswrapper[4702]: I1203 11:35:05.815068 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg2cb\" (UniqueName: \"kubernetes.io/projected/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-kube-api-access-rg2cb\") pod \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " Dec 03 11:35:05 crc kubenswrapper[4702]: I1203 11:35:05.815174 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-sg-core-conf-yaml\") pod \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " Dec 03 11:35:05 crc kubenswrapper[4702]: I1203 11:35:05.815250 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-config-data\") pod \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\" (UID: 
\"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " Dec 03 11:35:05 crc kubenswrapper[4702]: I1203 11:35:05.815339 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-ceilometer-tls-certs\") pod \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " Dec 03 11:35:05 crc kubenswrapper[4702]: I1203 11:35:05.815802 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-run-httpd\") pod \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\" (UID: \"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51\") " Dec 03 11:35:05 crc kubenswrapper[4702]: I1203 11:35:05.815359 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" (UID: "aa73e1e2-6ead-4a19-a9a7-446f3dce5e51"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:35:05 crc kubenswrapper[4702]: I1203 11:35:05.816906 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" (UID: "aa73e1e2-6ead-4a19-a9a7-446f3dce5e51"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:35:05 crc kubenswrapper[4702]: I1203 11:35:05.825005 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-scripts" (OuterVolumeSpecName: "scripts") pod "aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" (UID: "aa73e1e2-6ead-4a19-a9a7-446f3dce5e51"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:35:05 crc kubenswrapper[4702]: I1203 11:35:05.839464 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-kube-api-access-rg2cb" (OuterVolumeSpecName: "kube-api-access-rg2cb") pod "aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" (UID: "aa73e1e2-6ead-4a19-a9a7-446f3dce5e51"). InnerVolumeSpecName "kube-api-access-rg2cb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:35:05 crc kubenswrapper[4702]: I1203 11:35:05.897607 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" (UID: "aa73e1e2-6ead-4a19-a9a7-446f3dce5e51"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:35:05 crc kubenswrapper[4702]: I1203 11:35:05.919620 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:05 crc kubenswrapper[4702]: I1203 11:35:05.919658 4702 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:05 crc kubenswrapper[4702]: I1203 11:35:05.919675 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg2cb\" (UniqueName: \"kubernetes.io/projected/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-kube-api-access-rg2cb\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:05 crc kubenswrapper[4702]: I1203 11:35:05.919689 4702 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:05 crc kubenswrapper[4702]: I1203 11:35:05.919705 4702 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:05 crc kubenswrapper[4702]: I1203 11:35:05.956325 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" (UID: "aa73e1e2-6ead-4a19-a9a7-446f3dce5e51"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.009503 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" (UID: "aa73e1e2-6ead-4a19-a9a7-446f3dce5e51"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.022409 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.022456 4702 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.074493 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-config-data" (OuterVolumeSpecName: "config-data") pod "aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" (UID: "aa73e1e2-6ead-4a19-a9a7-446f3dce5e51"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.125839 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.387262 4702 generic.go:334] "Generic (PLEG): container finished" podID="85f53e1b-50d1-4249-ba44-5b2e5982ae36" containerID="ede35561174b293853d63b2d4a6b0b2d7cbdbdf59506eb2c648bf64fa62ebb67" exitCode=0 Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.387385 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"85f53e1b-50d1-4249-ba44-5b2e5982ae36","Type":"ContainerDied","Data":"ede35561174b293853d63b2d4a6b0b2d7cbdbdf59506eb2c648bf64fa62ebb67"} Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.391996 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"aa73e1e2-6ead-4a19-a9a7-446f3dce5e51","Type":"ContainerDied","Data":"bff9e5c8ba29fc68244c72be26afdc8d3528588ec4a8483b501a77236d2b6b33"} Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.392076 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.392128 4702 scope.go:117] "RemoveContainer" containerID="10b6764393e624d2c417eb58f663001da9509c823d90e97a17e2dbf7658fd428" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.447618 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.469964 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.483634 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:35:06 crc kubenswrapper[4702]: E1203 11:35:06.484209 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" containerName="sg-core" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.484228 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" containerName="sg-core" Dec 03 11:35:06 crc kubenswrapper[4702]: E1203 11:35:06.484271 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" containerName="proxy-httpd" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.484278 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" containerName="proxy-httpd" Dec 03 11:35:06 crc kubenswrapper[4702]: E1203 11:35:06.484301 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" containerName="ceilometer-central-agent" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.484307 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" containerName="ceilometer-central-agent" Dec 03 11:35:06 crc kubenswrapper[4702]: E1203 11:35:06.484326 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" containerName="ceilometer-notification-agent" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.484332 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" 
containerName="ceilometer-notification-agent" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.484571 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" containerName="ceilometer-notification-agent" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.484593 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" containerName="proxy-httpd" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.484616 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" containerName="sg-core" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.484629 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" containerName="ceilometer-central-agent" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.488291 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.491502 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.491798 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.491959 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.507905 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.670814 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-scripts\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.671089 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-config-data\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.671152 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dd6e551-e7d9-4f55-a878-bd36db9707e8-run-httpd\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.671274 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dd6e551-e7d9-4f55-a878-bd36db9707e8-log-httpd\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.671332 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl745\" (UniqueName: \"kubernetes.io/projected/4dd6e551-e7d9-4f55-a878-bd36db9707e8-kube-api-access-zl745\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 
crc kubenswrapper[4702]: I1203 11:35:06.671371 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.671410 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.671571 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.774397 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.774971 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-scripts\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.775088 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-config-data\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.775133 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dd6e551-e7d9-4f55-a878-bd36db9707e8-run-httpd\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.775203 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dd6e551-e7d9-4f55-a878-bd36db9707e8-log-httpd\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.775236 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl745\" (UniqueName: \"kubernetes.io/projected/4dd6e551-e7d9-4f55-a878-bd36db9707e8-kube-api-access-zl745\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.775288 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.775314 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.775747 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dd6e551-e7d9-4f55-a878-bd36db9707e8-run-httpd\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.775821 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dd6e551-e7d9-4f55-a878-bd36db9707e8-log-httpd\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.780612 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.781107 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.786973 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-scripts\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.787199 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.787829 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-config-data\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.794722 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl745\" (UniqueName: \"kubernetes.io/projected/4dd6e551-e7d9-4f55-a878-bd36db9707e8-kube-api-access-zl745\") pod \"ceilometer-0\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.817861 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 11:35:06 crc kubenswrapper[4702]: I1203 11:35:06.945361 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa73e1e2-6ead-4a19-a9a7-446f3dce5e51" path="/var/lib/kubelet/pods/aa73e1e2-6ead-4a19-a9a7-446f3dce5e51/volumes" Dec 03 11:35:07 crc kubenswrapper[4702]: I1203 11:35:07.419453 4702 generic.go:334] "Generic (PLEG): container finished" podID="1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" containerID="8f3a164c405e3bcb963ecdafe18b9c7d08e2a67a14287fd0dd6a702abcc9f3d8" exitCode=0 Dec 03 11:35:07 crc kubenswrapper[4702]: I1203 11:35:07.419502 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c","Type":"ContainerDied","Data":"8f3a164c405e3bcb963ecdafe18b9c7d08e2a67a14287fd0dd6a702abcc9f3d8"} Dec 03 11:35:08 crc kubenswrapper[4702]: I1203 11:35:08.927829 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="85f53e1b-50d1-4249-ba44-5b2e5982ae36" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.128:5671: connect: connection refused" Dec 03 11:35:09 crc kubenswrapper[4702]: I1203 11:35:09.928145 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:35:09 crc kubenswrapper[4702]: E1203 11:35:09.929095 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.599877 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-7vwqc"] Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.605663 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.609979 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.633019 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-7vwqc"] Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.772916 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-7vwqc\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.773081 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-7vwqc\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.773121 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-7vwqc\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.773338 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-7vwqc\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.773478 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-7vwqc\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.773847 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmsf2\" (UniqueName: \"kubernetes.io/projected/a47f12ae-2287-4da5-b66d-4f8cd64abd69-kube-api-access-vmsf2\") pod \"dnsmasq-dns-7d84b4d45c-7vwqc\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.774144 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-config\") pod \"dnsmasq-dns-7d84b4d45c-7vwqc\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.876867 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-openstack-edpm-ipam\") pod 
\"dnsmasq-dns-7d84b4d45c-7vwqc\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.876934 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-7vwqc\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.877013 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-7vwqc\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.877064 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-7vwqc\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.877090 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmsf2\" (UniqueName: \"kubernetes.io/projected/a47f12ae-2287-4da5-b66d-4f8cd64abd69-kube-api-access-vmsf2\") pod \"dnsmasq-dns-7d84b4d45c-7vwqc\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.877113 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-config\") pod \"dnsmasq-dns-7d84b4d45c-7vwqc\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.877303 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-7vwqc\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.878455 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-7vwqc\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.878607 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-7vwqc\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.879040 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-7vwqc\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " 
pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.879271 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-config\") pod \"dnsmasq-dns-7d84b4d45c-7vwqc\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.879356 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-7vwqc\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:10 crc kubenswrapper[4702]: I1203 11:35:10.879847 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-7vwqc\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:11 crc kubenswrapper[4702]: I1203 11:35:11.049196 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmsf2\" (UniqueName: \"kubernetes.io/projected/a47f12ae-2287-4da5-b66d-4f8cd64abd69-kube-api-access-vmsf2\") pod \"dnsmasq-dns-7d84b4d45c-7vwqc\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:11 crc kubenswrapper[4702]: I1203 11:35:11.261305 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:14 crc kubenswrapper[4702]: I1203 11:35:14.526984 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.129:5671: i/o timeout" Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.706513 4702 scope.go:117] "RemoveContainer" containerID="c61e0bdeacef5ab9d3748b3eddaff5993c6ebd0d7e0c4d83697ffb90a320f86d" Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.897116 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.918357 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.997583 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/85f53e1b-50d1-4249-ba44-5b2e5982ae36-plugins-conf\") pod \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.997643 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85f53e1b-50d1-4249-ba44-5b2e5982ae36-config-data\") pod \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.997679 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jfm9\" (UniqueName: \"kubernetes.io/projected/85f53e1b-50d1-4249-ba44-5b2e5982ae36-kube-api-access-2jfm9\") pod \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.997722 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-confd\") pod \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.997779 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-erlang-cookie-secret\") pod \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.997834 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-erlang-cookie\") pod \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.997869 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-erlang-cookie\") pod \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.997892 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-config-data\") pod \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.997951 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/85f53e1b-50d1-4249-ba44-5b2e5982ae36-pod-info\") pod \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.998010 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-tls\") pod 
\"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.998046 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-plugins\") pod \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.998122 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-tls\") pod \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.998196 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-plugins\") pod \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.998224 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.998245 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-plugins-conf\") pod \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.998267 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-confd\") pod \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.998288 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.998315 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/85f53e1b-50d1-4249-ba44-5b2e5982ae36-server-conf\") pod \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.998348 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m265r\" (UniqueName: \"kubernetes.io/projected/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-kube-api-access-m265r\") pod \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.998392 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-pod-info\") pod \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\" (UID: 
\"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.998426 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-server-conf\") pod \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\" (UID: \"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c\") " Dec 03 11:35:16 crc kubenswrapper[4702]: I1203 11:35:16.998463 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/85f53e1b-50d1-4249-ba44-5b2e5982ae36-erlang-cookie-secret\") pod \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\" (UID: \"85f53e1b-50d1-4249-ba44-5b2e5982ae36\") " Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.000347 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "85f53e1b-50d1-4249-ba44-5b2e5982ae36" (UID: "85f53e1b-50d1-4249-ba44-5b2e5982ae36"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.000856 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85f53e1b-50d1-4249-ba44-5b2e5982ae36-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "85f53e1b-50d1-4249-ba44-5b2e5982ae36" (UID: "85f53e1b-50d1-4249-ba44-5b2e5982ae36"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.002997 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "85f53e1b-50d1-4249-ba44-5b2e5982ae36" (UID: "85f53e1b-50d1-4249-ba44-5b2e5982ae36"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.017525 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" (UID: "1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.025646 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" (UID: "1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.035674 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" (UID: "1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.046442 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-pod-info" (OuterVolumeSpecName: "pod-info") pod "1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" (UID: "1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.049444 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "85f53e1b-50d1-4249-ba44-5b2e5982ae36" (UID: "85f53e1b-50d1-4249-ba44-5b2e5982ae36"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.049550 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85f53e1b-50d1-4249-ba44-5b2e5982ae36-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "85f53e1b-50d1-4249-ba44-5b2e5982ae36" (UID: "85f53e1b-50d1-4249-ba44-5b2e5982ae36"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.050093 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/85f53e1b-50d1-4249-ba44-5b2e5982ae36-pod-info" (OuterVolumeSpecName: "pod-info") pod "85f53e1b-50d1-4249-ba44-5b2e5982ae36" (UID: "85f53e1b-50d1-4249-ba44-5b2e5982ae36"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.050608 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" (UID: "1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.050772 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" (UID: "1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.070914 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" (UID: "1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.080921 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "85f53e1b-50d1-4249-ba44-5b2e5982ae36" (UID: "85f53e1b-50d1-4249-ba44-5b2e5982ae36"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.081695 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85f53e1b-50d1-4249-ba44-5b2e5982ae36-kube-api-access-2jfm9" (OuterVolumeSpecName: "kube-api-access-2jfm9") pod "85f53e1b-50d1-4249-ba44-5b2e5982ae36" (UID: "85f53e1b-50d1-4249-ba44-5b2e5982ae36"). InnerVolumeSpecName "kube-api-access-2jfm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.091675 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-kube-api-access-m265r" (OuterVolumeSpecName: "kube-api-access-m265r") pod "1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" (UID: "1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c"). InnerVolumeSpecName "kube-api-access-m265r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.102330 4702 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/85f53e1b-50d1-4249-ba44-5b2e5982ae36-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.102367 4702 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.102378 4702 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.102386 4702 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.102395 4702 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.102418 4702 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.102426 4702 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.102440 4702 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.102449 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m265r\" (UniqueName: \"kubernetes.io/projected/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-kube-api-access-m265r\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.102459 4702 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.102469 4702 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/85f53e1b-50d1-4249-ba44-5b2e5982ae36-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.102478 4702 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/85f53e1b-50d1-4249-ba44-5b2e5982ae36-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.102487 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jfm9\" (UniqueName: \"kubernetes.io/projected/85f53e1b-50d1-4249-ba44-5b2e5982ae36-kube-api-access-2jfm9\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.102495 4702 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.102504 4702 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.102515 4702 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.125962 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-config-data" (OuterVolumeSpecName: "config-data") pod "1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" (UID: "1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.139580 4702 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.145944 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85f53e1b-50d1-4249-ba44-5b2e5982ae36-config-data" (OuterVolumeSpecName: "config-data") pod "85f53e1b-50d1-4249-ba44-5b2e5982ae36" (UID: "85f53e1b-50d1-4249-ba44-5b2e5982ae36"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.180260 4702 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.221086 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-server-conf" (OuterVolumeSpecName: "server-conf") pod "1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" (UID: "1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.239559 4702 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.239614 4702 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.239638 4702 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.239654 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/85f53e1b-50d1-4249-ba44-5b2e5982ae36-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.239667 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.312292 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85f53e1b-50d1-4249-ba44-5b2e5982ae36-server-conf" (OuterVolumeSpecName: "server-conf") pod "85f53e1b-50d1-4249-ba44-5b2e5982ae36" (UID: "85f53e1b-50d1-4249-ba44-5b2e5982ae36"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.369086 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "85f53e1b-50d1-4249-ba44-5b2e5982ae36" (UID: "85f53e1b-50d1-4249-ba44-5b2e5982ae36"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.374841 4702 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/85f53e1b-50d1-4249-ba44-5b2e5982ae36-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.374887 4702 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/85f53e1b-50d1-4249-ba44-5b2e5982ae36-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.592317 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" (UID: "1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.624997 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c","Type":"ContainerDied","Data":"ed0628de87c8aa18d782d459a2c06c2ae5cce05314fbcb1a857c3cacea0b55dd"} Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.625028 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.633885 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"85f53e1b-50d1-4249-ba44-5b2e5982ae36","Type":"ContainerDied","Data":"6b236d9838f722c8db01c3541bac607ddcf4d6120c77c7a90e3c770958beabef"} Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.634002 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.683390 4702 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.683448 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.724947 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.747657 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.770232 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.780420 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 11:35:17 crc kubenswrapper[4702]: E1203 11:35:17.781141 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f53e1b-50d1-4249-ba44-5b2e5982ae36" containerName="rabbitmq" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.781172 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="85f53e1b-50d1-4249-ba44-5b2e5982ae36" containerName="rabbitmq" Dec 03 11:35:17 crc kubenswrapper[4702]: E1203 11:35:17.781210 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" containerName="rabbitmq" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.781218 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" containerName="rabbitmq" Dec 03 11:35:17 crc kubenswrapper[4702]: E1203 11:35:17.781272 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" containerName="setup-container" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.781281 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" containerName="setup-container" Dec 03 11:35:17 crc kubenswrapper[4702]: E1203 11:35:17.781292 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85f53e1b-50d1-4249-ba44-5b2e5982ae36" containerName="setup-container" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.781301 4702 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="85f53e1b-50d1-4249-ba44-5b2e5982ae36" containerName="setup-container" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.781614 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="85f53e1b-50d1-4249-ba44-5b2e5982ae36" containerName="rabbitmq" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.781653 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" containerName="rabbitmq" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.784408 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.788106 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.788423 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.788692 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.788853 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.789033 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mq44d" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.789148 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.791345 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.812969 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.815474 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.819278 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.819426 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.819549 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-wmbwc" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.819657 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.819750 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.819993 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.820093 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.829500 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.847216 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.887317 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/33f03183-33e1-4aa1-8a4c-11f8b75297cd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.888354 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/33f03183-33e1-4aa1-8a4c-11f8b75297cd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.888458 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.888624 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/33f03183-33e1-4aa1-8a4c-11f8b75297cd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.888714 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdb24\" (UniqueName: \"kubernetes.io/projected/33f03183-33e1-4aa1-8a4c-11f8b75297cd-kube-api-access-pdb24\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.888831 4702 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bda57dc3-3be8-4feb-a987-62c1412de0ad-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.888946 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bda57dc3-3be8-4feb-a987-62c1412de0ad-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.889039 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bda57dc3-3be8-4feb-a987-62c1412de0ad-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.889270 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/33f03183-33e1-4aa1-8a4c-11f8b75297cd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.889372 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.889459 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp9m8\" (UniqueName: \"kubernetes.io/projected/bda57dc3-3be8-4feb-a987-62c1412de0ad-kube-api-access-tp9m8\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.889565 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bda57dc3-3be8-4feb-a987-62c1412de0ad-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.889644 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/33f03183-33e1-4aa1-8a4c-11f8b75297cd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.889736 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33f03183-33e1-4aa1-8a4c-11f8b75297cd-config-data\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.889851 4702 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bda57dc3-3be8-4feb-a987-62c1412de0ad-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.889993 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bda57dc3-3be8-4feb-a987-62c1412de0ad-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.890122 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bda57dc3-3be8-4feb-a987-62c1412de0ad-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.890218 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/33f03183-33e1-4aa1-8a4c-11f8b75297cd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.890300 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bda57dc3-3be8-4feb-a987-62c1412de0ad-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.890407 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/33f03183-33e1-4aa1-8a4c-11f8b75297cd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.890547 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bda57dc3-3be8-4feb-a987-62c1412de0ad-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.890665 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/33f03183-33e1-4aa1-8a4c-11f8b75297cd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: E1203 11:35:17.982956 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Dec 03 11:35:17 crc kubenswrapper[4702]: E1203 11:35:17.983081 4702 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Dec 03 11:35:17 crc kubenswrapper[4702]: E1203 11:35:17.983321 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-scwrv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-gq8tr_openstack(8cf7ffe2-0a74-42cd-ab56-1a65d1317f54): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 11:35:17 crc kubenswrapper[4702]: E1203 11:35:17.984652 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-gq8tr" podUID="8cf7ffe2-0a74-42cd-ab56-1a65d1317f54" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.992723 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/33f03183-33e1-4aa1-8a4c-11f8b75297cd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.992787 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.992891 4702 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/33f03183-33e1-4aa1-8a4c-11f8b75297cd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.992913 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdb24\" (UniqueName: \"kubernetes.io/projected/33f03183-33e1-4aa1-8a4c-11f8b75297cd-kube-api-access-pdb24\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.992951 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bda57dc3-3be8-4feb-a987-62c1412de0ad-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.992973 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bda57dc3-3be8-4feb-a987-62c1412de0ad-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.992989 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bda57dc3-3be8-4feb-a987-62c1412de0ad-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.993006 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/33f03183-33e1-4aa1-8a4c-11f8b75297cd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.993042 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.993059 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp9m8\" (UniqueName: \"kubernetes.io/projected/bda57dc3-3be8-4feb-a987-62c1412de0ad-kube-api-access-tp9m8\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.993081 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bda57dc3-3be8-4feb-a987-62c1412de0ad-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.993098 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/33f03183-33e1-4aa1-8a4c-11f8b75297cd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.993131 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33f03183-33e1-4aa1-8a4c-11f8b75297cd-config-data\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.993149 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bda57dc3-3be8-4feb-a987-62c1412de0ad-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.993180 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bda57dc3-3be8-4feb-a987-62c1412de0ad-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.993214 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bda57dc3-3be8-4feb-a987-62c1412de0ad-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.993242 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/33f03183-33e1-4aa1-8a4c-11f8b75297cd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.993260 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bda57dc3-3be8-4feb-a987-62c1412de0ad-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.993309 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/33f03183-33e1-4aa1-8a4c-11f8b75297cd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.993344 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bda57dc3-3be8-4feb-a987-62c1412de0ad-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.993368 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/33f03183-33e1-4aa1-8a4c-11f8b75297cd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 
11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.993452 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/33f03183-33e1-4aa1-8a4c-11f8b75297cd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.994924 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bda57dc3-3be8-4feb-a987-62c1412de0ad-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.995178 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.995436 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/33f03183-33e1-4aa1-8a4c-11f8b75297cd-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.995700 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.996673 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bda57dc3-3be8-4feb-a987-62c1412de0ad-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.998011 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/33f03183-33e1-4aa1-8a4c-11f8b75297cd-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.998621 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bda57dc3-3be8-4feb-a987-62c1412de0ad-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.998957 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/33f03183-33e1-4aa1-8a4c-11f8b75297cd-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.999049 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/33f03183-33e1-4aa1-8a4c-11f8b75297cd-server-conf\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:17 crc kubenswrapper[4702]: I1203 11:35:17.999247 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bda57dc3-3be8-4feb-a987-62c1412de0ad-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.001052 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/33f03183-33e1-4aa1-8a4c-11f8b75297cd-config-data\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.001252 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bda57dc3-3be8-4feb-a987-62c1412de0ad-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.003460 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/33f03183-33e1-4aa1-8a4c-11f8b75297cd-pod-info\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.005708 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/33f03183-33e1-4aa1-8a4c-11f8b75297cd-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.006573 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bda57dc3-3be8-4feb-a987-62c1412de0ad-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.006687 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bda57dc3-3be8-4feb-a987-62c1412de0ad-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.007098 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bda57dc3-3be8-4feb-a987-62c1412de0ad-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.010324 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/33f03183-33e1-4aa1-8a4c-11f8b75297cd-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.014772 4702 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bda57dc3-3be8-4feb-a987-62c1412de0ad-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.019737 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp9m8\" (UniqueName: \"kubernetes.io/projected/bda57dc3-3be8-4feb-a987-62c1412de0ad-kube-api-access-tp9m8\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.031221 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdb24\" (UniqueName: \"kubernetes.io/projected/33f03183-33e1-4aa1-8a4c-11f8b75297cd-kube-api-access-pdb24\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.038346 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/33f03183-33e1-4aa1-8a4c-11f8b75297cd-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.072669 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"33f03183-33e1-4aa1-8a4c-11f8b75297cd\") " pod="openstack/rabbitmq-server-0" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.083365 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bda57dc3-3be8-4feb-a987-62c1412de0ad\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.115322 4702 scope.go:117] "RemoveContainer" containerID="e60b69678a5459e7554a2e3a9ea25afa31fe05bafcdef2b429ff0761434b8dec" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.154026 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.169049 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.268066 4702 scope.go:117] "RemoveContainer" containerID="6b161e8cfcbcec9fcf2d70af85019391d37b62c61afcd6edb67b11f5cf9b795a" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.324936 4702 scope.go:117] "RemoveContainer" containerID="8f3a164c405e3bcb963ecdafe18b9c7d08e2a67a14287fd0dd6a702abcc9f3d8" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.387617 4702 scope.go:117] "RemoveContainer" containerID="eebb24be9f7e8f7283478723e52c16d5a5d11bc79614ec5ce8495c886d468846" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.429606 4702 scope.go:117] "RemoveContainer" containerID="ede35561174b293853d63b2d4a6b0b2d7cbdbdf59506eb2c648bf64fa62ebb67" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.452290 4702 scope.go:117] "RemoveContainer" containerID="d163d9269091867cae17f00eb5509c72849aa1f770ab1a3c24d29e4611f08a54" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.517301 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.664368 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dd6e551-e7d9-4f55-a878-bd36db9707e8","Type":"ContainerStarted","Data":"ca04d485f2e2186c6531a3d7e6ae9986c38aa52af476ee089a6013a9a803ddd3"} Dec 03 11:35:18 crc kubenswrapper[4702]: E1203 11:35:18.672427 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-gq8tr" podUID="8cf7ffe2-0a74-42cd-ab56-1a65d1317f54" Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.706320 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-7vwqc"] Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.849349 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 11:35:18 crc kubenswrapper[4702]: W1203 11:35:18.863392 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33f03183_33e1_4aa1_8a4c_11f8b75297cd.slice/crio-68dcf59b6728afdbf98ec1e4e1400d02084a0bf7eb8cf06f5c0e88359ac82631 WatchSource:0}: Error finding container 68dcf59b6728afdbf98ec1e4e1400d02084a0bf7eb8cf06f5c0e88359ac82631: Status 404 returned error can't find the container with id 68dcf59b6728afdbf98ec1e4e1400d02084a0bf7eb8cf06f5c0e88359ac82631 Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.866349 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 11:35:18 crc kubenswrapper[4702]: W1203 11:35:18.866743 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbda57dc3_3be8_4feb_a987_62c1412de0ad.slice/crio-dce5c6da070409f09ae04b58b1a6920a0ec755be499ec8807d7e71231d489efa WatchSource:0}: Error finding container dce5c6da070409f09ae04b58b1a6920a0ec755be499ec8807d7e71231d489efa: Status 404 returned error can't find the container with id dce5c6da070409f09ae04b58b1a6920a0ec755be499ec8807d7e71231d489efa Dec 03 11:35:18 crc kubenswrapper[4702]: I1203 11:35:18.946561 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c" 
path="/var/lib/kubelet/pods/1b019259-47fe-46c2-b4dc-4d6dc2ee6e2c/volumes" Dec 03 11:35:19 crc kubenswrapper[4702]: I1203 11:35:19.003542 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85f53e1b-50d1-4249-ba44-5b2e5982ae36" path="/var/lib/kubelet/pods/85f53e1b-50d1-4249-ba44-5b2e5982ae36/volumes" Dec 03 11:35:19 crc kubenswrapper[4702]: I1203 11:35:19.681372 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"33f03183-33e1-4aa1-8a4c-11f8b75297cd","Type":"ContainerStarted","Data":"68dcf59b6728afdbf98ec1e4e1400d02084a0bf7eb8cf06f5c0e88359ac82631"} Dec 03 11:35:19 crc kubenswrapper[4702]: I1203 11:35:19.683787 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bda57dc3-3be8-4feb-a987-62c1412de0ad","Type":"ContainerStarted","Data":"dce5c6da070409f09ae04b58b1a6920a0ec755be499ec8807d7e71231d489efa"} Dec 03 11:35:19 crc kubenswrapper[4702]: I1203 11:35:19.686025 4702 generic.go:334] "Generic (PLEG): container finished" podID="a47f12ae-2287-4da5-b66d-4f8cd64abd69" containerID="cf317845aef7bdddfdad898e0e28e821f980fafda897aadd776ea79a7d22bd08" exitCode=0 Dec 03 11:35:19 crc kubenswrapper[4702]: I1203 11:35:19.686119 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" event={"ID":"a47f12ae-2287-4da5-b66d-4f8cd64abd69","Type":"ContainerDied","Data":"cf317845aef7bdddfdad898e0e28e821f980fafda897aadd776ea79a7d22bd08"} Dec 03 11:35:19 crc kubenswrapper[4702]: I1203 11:35:19.686337 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" event={"ID":"a47f12ae-2287-4da5-b66d-4f8cd64abd69","Type":"ContainerStarted","Data":"36649c6f66d2721048a416571b702723defb08131116393c0b82761ae7a83fb5"} Dec 03 11:35:20 crc kubenswrapper[4702]: I1203 11:35:20.702806 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" event={"ID":"a47f12ae-2287-4da5-b66d-4f8cd64abd69","Type":"ContainerStarted","Data":"ea6a03a94f9947c543157b010f47a5f6e7789eb47594bcaa7226a5883ac794a6"} Dec 03 11:35:20 crc kubenswrapper[4702]: I1203 11:35:20.703146 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:20 crc kubenswrapper[4702]: I1203 11:35:20.736233 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" podStartSLOduration=10.736172156 podStartE2EDuration="10.736172156s" podCreationTimestamp="2025-12-03 11:35:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:35:20.728981451 +0000 UTC m=+1904.564909915" watchObservedRunningTime="2025-12-03 11:35:20.736172156 +0000 UTC m=+1904.572100620" Dec 03 11:35:21 crc kubenswrapper[4702]: I1203 11:35:21.719699 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"33f03183-33e1-4aa1-8a4c-11f8b75297cd","Type":"ContainerStarted","Data":"853595bb66bbbec5fa36cbadb2da5fa4a3f4dce62e4fec8885db08e4cd9fb448"} Dec 03 11:35:21 crc kubenswrapper[4702]: I1203 11:35:21.722085 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bda57dc3-3be8-4feb-a987-62c1412de0ad","Type":"ContainerStarted","Data":"3091f2de3c084e29da9425c9f1400e5aa8b5bddba43a347aea9dba27a33be1f6"} Dec 03 11:35:22 crc kubenswrapper[4702]: I1203 
11:35:22.735630 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dd6e551-e7d9-4f55-a878-bd36db9707e8","Type":"ContainerStarted","Data":"159ee3d241a321aed89bcfa644d4fed939819784a2a494203acf7b31c0a66347"} Dec 03 11:35:22 crc kubenswrapper[4702]: I1203 11:35:22.929918 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:35:22 crc kubenswrapper[4702]: E1203 11:35:22.930174 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:35:23 crc kubenswrapper[4702]: I1203 11:35:23.751362 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dd6e551-e7d9-4f55-a878-bd36db9707e8","Type":"ContainerStarted","Data":"4a4d1d81e28532047aaccc56a98fee9fd389b4a28bda4c3010652cb18c7f17ce"} Dec 03 11:35:24 crc kubenswrapper[4702]: I1203 11:35:24.771192 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dd6e551-e7d9-4f55-a878-bd36db9707e8","Type":"ContainerStarted","Data":"853324dd0987f96213f5bdb78bc213d546ea6d301d937cb9f87f51a91e5a1f2b"} Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.264702 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.479147 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv"] Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.479421 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" podUID="da604ff7-8464-439e-aa94-29102f336add" containerName="dnsmasq-dns" containerID="cri-o://1046b22c92fb7df7df83779c6c50340fcca3f5911225e5e30da75662ab6b3b4a" gracePeriod=10 Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.687926 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-jxwlk"] Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.691034 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.707319 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-jxwlk"] Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.778820 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bae49358-6209-44ef-b28e-bd0b48fc617a-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-jxwlk\" (UID: \"bae49358-6209-44ef-b28e-bd0b48fc617a\") " pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.778892 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bae49358-6209-44ef-b28e-bd0b48fc617a-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-jxwlk\" (UID: \"bae49358-6209-44ef-b28e-bd0b48fc617a\") " pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.778946 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bae49358-6209-44ef-b28e-bd0b48fc617a-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-jxwlk\" (UID: \"bae49358-6209-44ef-b28e-bd0b48fc617a\") " pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.778997 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bae49358-6209-44ef-b28e-bd0b48fc617a-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-jxwlk\" (UID: \"bae49358-6209-44ef-b28e-bd0b48fc617a\") " pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.779159 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9w5r\" (UniqueName: \"kubernetes.io/projected/bae49358-6209-44ef-b28e-bd0b48fc617a-kube-api-access-x9w5r\") pod \"dnsmasq-dns-6f6df4f56c-jxwlk\" (UID: \"bae49358-6209-44ef-b28e-bd0b48fc617a\") " pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.779255 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bae49358-6209-44ef-b28e-bd0b48fc617a-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-jxwlk\" (UID: \"bae49358-6209-44ef-b28e-bd0b48fc617a\") " pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.779310 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bae49358-6209-44ef-b28e-bd0b48fc617a-config\") pod \"dnsmasq-dns-6f6df4f56c-jxwlk\" (UID: \"bae49358-6209-44ef-b28e-bd0b48fc617a\") " pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.881571 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bae49358-6209-44ef-b28e-bd0b48fc617a-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-jxwlk\" (UID: \"bae49358-6209-44ef-b28e-bd0b48fc617a\") " pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 
11:35:26.881831 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bae49358-6209-44ef-b28e-bd0b48fc617a-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-jxwlk\" (UID: \"bae49358-6209-44ef-b28e-bd0b48fc617a\") " pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.882955 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bae49358-6209-44ef-b28e-bd0b48fc617a-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-jxwlk\" (UID: \"bae49358-6209-44ef-b28e-bd0b48fc617a\") " pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.883000 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bae49358-6209-44ef-b28e-bd0b48fc617a-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-jxwlk\" (UID: \"bae49358-6209-44ef-b28e-bd0b48fc617a\") " pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.883985 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9w5r\" (UniqueName: \"kubernetes.io/projected/bae49358-6209-44ef-b28e-bd0b48fc617a-kube-api-access-x9w5r\") pod \"dnsmasq-dns-6f6df4f56c-jxwlk\" (UID: \"bae49358-6209-44ef-b28e-bd0b48fc617a\") " pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.884105 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bae49358-6209-44ef-b28e-bd0b48fc617a-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-jxwlk\" (UID: \"bae49358-6209-44ef-b28e-bd0b48fc617a\") " pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.884168 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bae49358-6209-44ef-b28e-bd0b48fc617a-config\") pod \"dnsmasq-dns-6f6df4f56c-jxwlk\" (UID: \"bae49358-6209-44ef-b28e-bd0b48fc617a\") " pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.884263 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bae49358-6209-44ef-b28e-bd0b48fc617a-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-jxwlk\" (UID: \"bae49358-6209-44ef-b28e-bd0b48fc617a\") " pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.884300 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bae49358-6209-44ef-b28e-bd0b48fc617a-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-jxwlk\" (UID: \"bae49358-6209-44ef-b28e-bd0b48fc617a\") " pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.885061 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bae49358-6209-44ef-b28e-bd0b48fc617a-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-jxwlk\" (UID: \"bae49358-6209-44ef-b28e-bd0b48fc617a\") " pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.886345 4702 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bae49358-6209-44ef-b28e-bd0b48fc617a-config\") pod \"dnsmasq-dns-6f6df4f56c-jxwlk\" (UID: \"bae49358-6209-44ef-b28e-bd0b48fc617a\") " pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.886498 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bae49358-6209-44ef-b28e-bd0b48fc617a-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-jxwlk\" (UID: \"bae49358-6209-44ef-b28e-bd0b48fc617a\") " pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.887245 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bae49358-6209-44ef-b28e-bd0b48fc617a-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-jxwlk\" (UID: \"bae49358-6209-44ef-b28e-bd0b48fc617a\") " pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:26 crc kubenswrapper[4702]: I1203 11:35:26.912923 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9w5r\" (UniqueName: \"kubernetes.io/projected/bae49358-6209-44ef-b28e-bd0b48fc617a-kube-api-access-x9w5r\") pod \"dnsmasq-dns-6f6df4f56c-jxwlk\" (UID: \"bae49358-6209-44ef-b28e-bd0b48fc617a\") " pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:27 crc kubenswrapper[4702]: E1203 11:35:27.007656 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda604ff7_8464_439e_aa94_29102f336add.slice/crio-1046b22c92fb7df7df83779c6c50340fcca3f5911225e5e30da75662ab6b3b4a.scope\": RecentStats: unable to find data in memory cache]" Dec 03 11:35:27 crc kubenswrapper[4702]: I1203 11:35:27.014840 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" podUID="da604ff7-8464-439e-aa94-29102f336add" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.255:5353: connect: connection refused" Dec 03 11:35:27 crc kubenswrapper[4702]: I1203 11:35:27.057950 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:27 crc kubenswrapper[4702]: I1203 11:35:27.612654 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-jxwlk"] Dec 03 11:35:27 crc kubenswrapper[4702]: W1203 11:35:27.614463 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbae49358_6209_44ef_b28e_bd0b48fc617a.slice/crio-e6283d5210394abc0cbe90a789a2492692029bad8b30264e62af7a86006e058e WatchSource:0}: Error finding container e6283d5210394abc0cbe90a789a2492692029bad8b30264e62af7a86006e058e: Status 404 returned error can't find the container with id e6283d5210394abc0cbe90a789a2492692029bad8b30264e62af7a86006e058e Dec 03 11:35:27 crc kubenswrapper[4702]: I1203 11:35:27.874267 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" event={"ID":"bae49358-6209-44ef-b28e-bd0b48fc617a","Type":"ContainerStarted","Data":"e6283d5210394abc0cbe90a789a2492692029bad8b30264e62af7a86006e058e"} Dec 03 11:35:28 crc kubenswrapper[4702]: I1203 11:35:28.888531 4702 generic.go:334] "Generic (PLEG): container finished" podID="bae49358-6209-44ef-b28e-bd0b48fc617a" containerID="981e64af10050acafa8c766d31a158b6b3cbc786d4173d3f5f350a1bde6a8f91" exitCode=0 Dec 03 11:35:28 crc kubenswrapper[4702]: I1203 11:35:28.888584 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" event={"ID":"bae49358-6209-44ef-b28e-bd0b48fc617a","Type":"ContainerDied","Data":"981e64af10050acafa8c766d31a158b6b3cbc786d4173d3f5f350a1bde6a8f91"} Dec 03 11:35:28 crc kubenswrapper[4702]: I1203 11:35:28.891861 4702 generic.go:334] "Generic (PLEG): container finished" podID="da604ff7-8464-439e-aa94-29102f336add" containerID="1046b22c92fb7df7df83779c6c50340fcca3f5911225e5e30da75662ab6b3b4a" exitCode=0 Dec 03 11:35:28 crc kubenswrapper[4702]: I1203 11:35:28.891902 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" event={"ID":"da604ff7-8464-439e-aa94-29102f336add","Type":"ContainerDied","Data":"1046b22c92fb7df7df83779c6c50340fcca3f5911225e5e30da75662ab6b3b4a"} Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.316029 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.574516 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-dns-svc\") pod \"da604ff7-8464-439e-aa94-29102f336add\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.574611 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-ovsdbserver-sb\") pod \"da604ff7-8464-439e-aa94-29102f336add\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.575280 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-ovsdbserver-nb\") pod \"da604ff7-8464-439e-aa94-29102f336add\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.575361 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw9bj\" (UniqueName: \"kubernetes.io/projected/da604ff7-8464-439e-aa94-29102f336add-kube-api-access-hw9bj\") pod \"da604ff7-8464-439e-aa94-29102f336add\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.575696 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-dns-swift-storage-0\") pod \"da604ff7-8464-439e-aa94-29102f336add\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.576199 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-config\") pod \"da604ff7-8464-439e-aa94-29102f336add\" (UID: \"da604ff7-8464-439e-aa94-29102f336add\") " Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.583371 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da604ff7-8464-439e-aa94-29102f336add-kube-api-access-hw9bj" (OuterVolumeSpecName: "kube-api-access-hw9bj") pod "da604ff7-8464-439e-aa94-29102f336add" (UID: "da604ff7-8464-439e-aa94-29102f336add"). InnerVolumeSpecName "kube-api-access-hw9bj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.654099 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-config" (OuterVolumeSpecName: "config") pod "da604ff7-8464-439e-aa94-29102f336add" (UID: "da604ff7-8464-439e-aa94-29102f336add"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.663496 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "da604ff7-8464-439e-aa94-29102f336add" (UID: "da604ff7-8464-439e-aa94-29102f336add"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.664605 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "da604ff7-8464-439e-aa94-29102f336add" (UID: "da604ff7-8464-439e-aa94-29102f336add"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.665401 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "da604ff7-8464-439e-aa94-29102f336add" (UID: "da604ff7-8464-439e-aa94-29102f336add"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.678928 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "da604ff7-8464-439e-aa94-29102f336add" (UID: "da604ff7-8464-439e-aa94-29102f336add"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.679891 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-config\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.680069 4702 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.680153 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.680254 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.680412 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw9bj\" (UniqueName: \"kubernetes.io/projected/da604ff7-8464-439e-aa94-29102f336add-kube-api-access-hw9bj\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.680500 4702 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/da604ff7-8464-439e-aa94-29102f336add-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.935780 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" event={"ID":"bae49358-6209-44ef-b28e-bd0b48fc617a","Type":"ContainerStarted","Data":"409bbf53efe3862c7289a9575ae79dd64558a51aeff32f589f030b9d63de25c8"} Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.936042 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.940683 
Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.940683 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dd6e551-e7d9-4f55-a878-bd36db9707e8","Type":"ContainerStarted","Data":"cd72b6653ef26d803cddcb816f7eaefb37cb764a19e5714195ad2c44eba5f98e"}
Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.940954 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.949585 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv" event={"ID":"da604ff7-8464-439e-aa94-29102f336add","Type":"ContainerDied","Data":"4fffbcaa959c66ae4a050c3b7ef0d782b7d2e299a7943a9328de6fb6c4156ad6"}
Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.949648 4702 scope.go:117] "RemoveContainer" containerID="1046b22c92fb7df7df83779c6c50340fcca3f5911225e5e30da75662ab6b3b4a"
Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.949645 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv"
Dec 03 11:35:29 crc kubenswrapper[4702]: I1203 11:35:29.972768 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk" podStartSLOduration=3.972727118 podStartE2EDuration="3.972727118s" podCreationTimestamp="2025-12-03 11:35:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:35:29.959551473 +0000 UTC m=+1913.795479957" watchObservedRunningTime="2025-12-03 11:35:29.972727118 +0000 UTC m=+1913.808655582"
Dec 03 11:35:30 crc kubenswrapper[4702]: I1203 11:35:29.999905 4702 scope.go:117] "RemoveContainer" containerID="99fd457c22499203cf340d5df7ae15f01cea3cefd4b6e64000df4d996eb41087"
Dec 03 11:35:30 crc kubenswrapper[4702]: I1203 11:35:30.009630 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=13.454539002 podStartE2EDuration="24.009607189s" podCreationTimestamp="2025-12-03 11:35:06 +0000 UTC" firstStartedPulling="2025-12-03 11:35:18.512669295 +0000 UTC m=+1902.348597759" lastFinishedPulling="2025-12-03 11:35:29.067737482 +0000 UTC m=+1912.903665946" observedRunningTime="2025-12-03 11:35:30.002747913 +0000 UTC m=+1913.838676377" watchObservedRunningTime="2025-12-03 11:35:30.009607189 +0000 UTC m=+1913.845535643"
Dec 03 11:35:30 crc kubenswrapper[4702]: I1203 11:35:30.040685 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv"]
Dec 03 11:35:30 crc kubenswrapper[4702]: I1203 11:35:30.052731 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-5lrzv"]
Dec 03 11:35:31 crc kubenswrapper[4702]: I1203 11:35:31.058457 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da604ff7-8464-439e-aa94-29102f336add" path="/var/lib/kubelet/pods/da604ff7-8464-439e-aa94-29102f336add/volumes"
Dec 03 11:35:32 crc kubenswrapper[4702]: I1203 11:35:32.068382 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-gq8tr" event={"ID":"8cf7ffe2-0a74-42cd-ab56-1a65d1317f54","Type":"ContainerStarted","Data":"f504e6a99f3fd97afe3603b3b2ca8f616014af94e2da3e3beb060139b03aaf1c"}
Dec 03 11:35:32 crc kubenswrapper[4702]: I1203 11:35:32.094706 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-gq8tr" podStartSLOduration=2.424633498 podStartE2EDuration="42.094683297s" podCreationTimestamp="2025-12-03 11:34:50 +0000 UTC" firstStartedPulling="2025-12-03 11:34:51.561153653 +0000 UTC m=+1875.397082117" lastFinishedPulling="2025-12-03 11:35:31.231203452 +0000 UTC m=+1915.067131916" observedRunningTime="2025-12-03 11:35:32.08355237 +0000 UTC m=+1915.919480834" watchObservedRunningTime="2025-12-03 11:35:32.094683297 +0000 UTC m=+1915.930611761"
Dec 03 11:35:35 crc kubenswrapper[4702]: I1203 11:35:35.107205 4702 generic.go:334] "Generic (PLEG): container finished" podID="8cf7ffe2-0a74-42cd-ab56-1a65d1317f54" containerID="f504e6a99f3fd97afe3603b3b2ca8f616014af94e2da3e3beb060139b03aaf1c" exitCode=0
Dec 03 11:35:35 crc kubenswrapper[4702]: I1203 11:35:35.107307 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-gq8tr" event={"ID":"8cf7ffe2-0a74-42cd-ab56-1a65d1317f54","Type":"ContainerDied","Data":"f504e6a99f3fd97afe3603b3b2ca8f616014af94e2da3e3beb060139b03aaf1c"}
Dec 03 11:35:36 crc kubenswrapper[4702]: I1203 11:35:36.990162 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-gq8tr"
Dec 03 11:35:37 crc kubenswrapper[4702]: I1203 11:35:37.062937 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f6df4f56c-jxwlk"
Dec 03 11:35:37 crc kubenswrapper[4702]: I1203 11:35:37.146079 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-7vwqc"]
Dec 03 11:35:37 crc kubenswrapper[4702]: I1203 11:35:37.146396 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" podUID="a47f12ae-2287-4da5-b66d-4f8cd64abd69" containerName="dnsmasq-dns" containerID="cri-o://ea6a03a94f9947c543157b010f47a5f6e7789eb47594bcaa7226a5883ac794a6" gracePeriod=10
Dec 03 11:35:37 crc kubenswrapper[4702]: I1203 11:35:37.148796 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf7ffe2-0a74-42cd-ab56-1a65d1317f54-combined-ca-bundle\") pod \"8cf7ffe2-0a74-42cd-ab56-1a65d1317f54\" (UID: \"8cf7ffe2-0a74-42cd-ab56-1a65d1317f54\") "
Dec 03 11:35:37 crc kubenswrapper[4702]: I1203 11:35:37.149031 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf7ffe2-0a74-42cd-ab56-1a65d1317f54-config-data\") pod \"8cf7ffe2-0a74-42cd-ab56-1a65d1317f54\" (UID: \"8cf7ffe2-0a74-42cd-ab56-1a65d1317f54\") "
Dec 03 11:35:37 crc kubenswrapper[4702]: I1203 11:35:37.149205 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scwrv\" (UniqueName: \"kubernetes.io/projected/8cf7ffe2-0a74-42cd-ab56-1a65d1317f54-kube-api-access-scwrv\") pod \"8cf7ffe2-0a74-42cd-ab56-1a65d1317f54\" (UID: \"8cf7ffe2-0a74-42cd-ab56-1a65d1317f54\") "
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:35:37 crc kubenswrapper[4702]: I1203 11:35:37.184369 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-gq8tr" event={"ID":"8cf7ffe2-0a74-42cd-ab56-1a65d1317f54","Type":"ContainerDied","Data":"cf7859403e9b77fbc6b05044e5b53b2b658b958a3f5d6a634ad3dbee3cb03e89"} Dec 03 11:35:37 crc kubenswrapper[4702]: I1203 11:35:37.184432 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf7859403e9b77fbc6b05044e5b53b2b658b958a3f5d6a634ad3dbee3cb03e89" Dec 03 11:35:37 crc kubenswrapper[4702]: I1203 11:35:37.184509 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-gq8tr" Dec 03 11:35:37 crc kubenswrapper[4702]: I1203 11:35:37.275990 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scwrv\" (UniqueName: \"kubernetes.io/projected/8cf7ffe2-0a74-42cd-ab56-1a65d1317f54-kube-api-access-scwrv\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:37 crc kubenswrapper[4702]: I1203 11:35:37.283081 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cf7ffe2-0a74-42cd-ab56-1a65d1317f54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8cf7ffe2-0a74-42cd-ab56-1a65d1317f54" (UID: "8cf7ffe2-0a74-42cd-ab56-1a65d1317f54"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:35:37 crc kubenswrapper[4702]: I1203 11:35:37.572704 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cf7ffe2-0a74-42cd-ab56-1a65d1317f54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:37 crc kubenswrapper[4702]: I1203 11:35:37.668527 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cf7ffe2-0a74-42cd-ab56-1a65d1317f54-config-data" (OuterVolumeSpecName: "config-data") pod "8cf7ffe2-0a74-42cd-ab56-1a65d1317f54" (UID: "8cf7ffe2-0a74-42cd-ab56-1a65d1317f54"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:35:37 crc kubenswrapper[4702]: I1203 11:35:37.677265 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cf7ffe2-0a74-42cd-ab56-1a65d1317f54-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.039998 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:35:38 crc kubenswrapper[4702]: E1203 11:35:38.040615 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.203011 4702 generic.go:334] "Generic (PLEG): container finished" podID="a47f12ae-2287-4da5-b66d-4f8cd64abd69" containerID="ea6a03a94f9947c543157b010f47a5f6e7789eb47594bcaa7226a5883ac794a6" exitCode=0 Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.203060 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" event={"ID":"a47f12ae-2287-4da5-b66d-4f8cd64abd69","Type":"ContainerDied","Data":"ea6a03a94f9947c543157b010f47a5f6e7789eb47594bcaa7226a5883ac794a6"} Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.683171 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.843325 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmsf2\" (UniqueName: \"kubernetes.io/projected/a47f12ae-2287-4da5-b66d-4f8cd64abd69-kube-api-access-vmsf2\") pod \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.843786 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-openstack-edpm-ipam\") pod \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.843881 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-dns-svc\") pod \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.843956 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-ovsdbserver-nb\") pod \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.843995 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-ovsdbserver-sb\") pod \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") " Dec 03 11:35:38 crc 
Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.844129 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-dns-swift-storage-0\") pod \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") "
Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.844191 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-config\") pod \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") "
Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.856172 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a47f12ae-2287-4da5-b66d-4f8cd64abd69-kube-api-access-vmsf2" (OuterVolumeSpecName: "kube-api-access-vmsf2") pod "a47f12ae-2287-4da5-b66d-4f8cd64abd69" (UID: "a47f12ae-2287-4da5-b66d-4f8cd64abd69"). InnerVolumeSpecName "kube-api-access-vmsf2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.930600 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "a47f12ae-2287-4da5-b66d-4f8cd64abd69" (UID: "a47f12ae-2287-4da5-b66d-4f8cd64abd69"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.934151 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a47f12ae-2287-4da5-b66d-4f8cd64abd69" (UID: "a47f12ae-2287-4da5-b66d-4f8cd64abd69"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.943420 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a47f12ae-2287-4da5-b66d-4f8cd64abd69" (UID: "a47f12ae-2287-4da5-b66d-4f8cd64abd69"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.945807 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a47f12ae-2287-4da5-b66d-4f8cd64abd69" (UID: "a47f12ae-2287-4da5-b66d-4f8cd64abd69"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.947159 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-ovsdbserver-nb\") pod \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\" (UID: \"a47f12ae-2287-4da5-b66d-4f8cd64abd69\") "
Dec 03 11:35:38 crc kubenswrapper[4702]: W1203 11:35:38.948261 4702 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/a47f12ae-2287-4da5-b66d-4f8cd64abd69/volumes/kubernetes.io~configmap/ovsdbserver-nb
Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.948292 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a47f12ae-2287-4da5-b66d-4f8cd64abd69" (UID: "a47f12ae-2287-4da5-b66d-4f8cd64abd69"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.950291 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.950553 4702 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.950575 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmsf2\" (UniqueName: \"kubernetes.io/projected/a47f12ae-2287-4da5-b66d-4f8cd64abd69-kube-api-access-vmsf2\") on node \"crc\" DevicePath \"\""
Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.950586 4702 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.950595 4702 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.954126 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a47f12ae-2287-4da5-b66d-4f8cd64abd69" (UID: "a47f12ae-2287-4da5-b66d-4f8cd64abd69"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:35:38 crc kubenswrapper[4702]: I1203 11:35:38.961108 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-config" (OuterVolumeSpecName: "config") pod "a47f12ae-2287-4da5-b66d-4f8cd64abd69" (UID: "a47f12ae-2287-4da5-b66d-4f8cd64abd69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.059399 4702 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-config\") on node \"crc\" DevicePath \"\""
Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.059442 4702 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a47f12ae-2287-4da5-b66d-4f8cd64abd69-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.303067 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc" event={"ID":"a47f12ae-2287-4da5-b66d-4f8cd64abd69","Type":"ContainerDied","Data":"36649c6f66d2721048a416571b702723defb08131116393c0b82761ae7a83fb5"}
Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.303150 4702 scope.go:117] "RemoveContainer" containerID="ea6a03a94f9947c543157b010f47a5f6e7789eb47594bcaa7226a5883ac794a6"
Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.303389 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-7vwqc"
Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.351415 4702 scope.go:117] "RemoveContainer" containerID="cf317845aef7bdddfdad898e0e28e821f980fafda897aadd776ea79a7d22bd08"
Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.356561 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-7vwqc"]
Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.373952 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-7vwqc"]
Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.520399 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-79b68c69ff-kvztw"]
Dec 03 11:35:39 crc kubenswrapper[4702]: E1203 11:35:39.521169 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da604ff7-8464-439e-aa94-29102f336add" containerName="init"
Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.521194 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="da604ff7-8464-439e-aa94-29102f336add" containerName="init"
Dec 03 11:35:39 crc kubenswrapper[4702]: E1203 11:35:39.521217 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a47f12ae-2287-4da5-b66d-4f8cd64abd69" containerName="dnsmasq-dns"
Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.521225 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="a47f12ae-2287-4da5-b66d-4f8cd64abd69" containerName="dnsmasq-dns"
Dec 03 11:35:39 crc kubenswrapper[4702]: E1203 11:35:39.521277 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cf7ffe2-0a74-42cd-ab56-1a65d1317f54" containerName="heat-db-sync"
Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.521286 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cf7ffe2-0a74-42cd-ab56-1a65d1317f54" containerName="heat-db-sync"
Dec 03 11:35:39 crc kubenswrapper[4702]: E1203 11:35:39.521302 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da604ff7-8464-439e-aa94-29102f336add" containerName="dnsmasq-dns"
Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.521308 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="da604ff7-8464-439e-aa94-29102f336add" containerName="dnsmasq-dns"
"RemoveStaleState: removing container" podUID="a47f12ae-2287-4da5-b66d-4f8cd64abd69" containerName="init" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.521334 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="a47f12ae-2287-4da5-b66d-4f8cd64abd69" containerName="init" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.521693 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="a47f12ae-2287-4da5-b66d-4f8cd64abd69" containerName="dnsmasq-dns" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.521742 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="da604ff7-8464-439e-aa94-29102f336add" containerName="dnsmasq-dns" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.521790 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cf7ffe2-0a74-42cd-ab56-1a65d1317f54" containerName="heat-db-sync" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.523073 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-79b68c69ff-kvztw" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.545202 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-79b68c69ff-kvztw"] Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.604115 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6b7dd55484-jsksg"] Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.608907 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6b7dd55484-jsksg" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.632949 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-56cbbf589-6mqq4"] Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.635164 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-56cbbf589-6mqq4" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.649850 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6b7dd55484-jsksg"] Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.666035 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-56cbbf589-6mqq4"] Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.696346 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a30c0f33-d7bc-456c-be27-26e860ca8f28-config-data\") pod \"heat-engine-79b68c69ff-kvztw\" (UID: \"a30c0f33-d7bc-456c-be27-26e860ca8f28\") " pod="openstack/heat-engine-79b68c69ff-kvztw" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.696475 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zln4p\" (UniqueName: \"kubernetes.io/projected/a30c0f33-d7bc-456c-be27-26e860ca8f28-kube-api-access-zln4p\") pod \"heat-engine-79b68c69ff-kvztw\" (UID: \"a30c0f33-d7bc-456c-be27-26e860ca8f28\") " pod="openstack/heat-engine-79b68c69ff-kvztw" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.696538 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a30c0f33-d7bc-456c-be27-26e860ca8f28-combined-ca-bundle\") pod \"heat-engine-79b68c69ff-kvztw\" (UID: \"a30c0f33-d7bc-456c-be27-26e860ca8f28\") " pod="openstack/heat-engine-79b68c69ff-kvztw" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.696584 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a30c0f33-d7bc-456c-be27-26e860ca8f28-config-data-custom\") pod \"heat-engine-79b68c69ff-kvztw\" (UID: \"a30c0f33-d7bc-456c-be27-26e860ca8f28\") " pod="openstack/heat-engine-79b68c69ff-kvztw" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.799211 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt22q\" (UniqueName: \"kubernetes.io/projected/92e38875-6121-4c92-b21b-62a280aa8948-kube-api-access-dt22q\") pod \"heat-cfnapi-56cbbf589-6mqq4\" (UID: \"92e38875-6121-4c92-b21b-62a280aa8948\") " pod="openstack/heat-cfnapi-56cbbf589-6mqq4" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.799404 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31eea135-16c2-46e7-860b-418c61ef127e-config-data-custom\") pod \"heat-api-6b7dd55484-jsksg\" (UID: \"31eea135-16c2-46e7-860b-418c61ef127e\") " pod="openstack/heat-api-6b7dd55484-jsksg" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.799466 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a30c0f33-d7bc-456c-be27-26e860ca8f28-config-data\") pod \"heat-engine-79b68c69ff-kvztw\" (UID: \"a30c0f33-d7bc-456c-be27-26e860ca8f28\") " pod="openstack/heat-engine-79b68c69ff-kvztw" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.799779 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31eea135-16c2-46e7-860b-418c61ef127e-public-tls-certs\") pod 
\"heat-api-6b7dd55484-jsksg\" (UID: \"31eea135-16c2-46e7-860b-418c61ef127e\") " pod="openstack/heat-api-6b7dd55484-jsksg" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.799842 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e38875-6121-4c92-b21b-62a280aa8948-combined-ca-bundle\") pod \"heat-cfnapi-56cbbf589-6mqq4\" (UID: \"92e38875-6121-4c92-b21b-62a280aa8948\") " pod="openstack/heat-cfnapi-56cbbf589-6mqq4" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.799894 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/31eea135-16c2-46e7-860b-418c61ef127e-internal-tls-certs\") pod \"heat-api-6b7dd55484-jsksg\" (UID: \"31eea135-16c2-46e7-860b-418c61ef127e\") " pod="openstack/heat-api-6b7dd55484-jsksg" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.799934 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zln4p\" (UniqueName: \"kubernetes.io/projected/a30c0f33-d7bc-456c-be27-26e860ca8f28-kube-api-access-zln4p\") pod \"heat-engine-79b68c69ff-kvztw\" (UID: \"a30c0f33-d7bc-456c-be27-26e860ca8f28\") " pod="openstack/heat-engine-79b68c69ff-kvztw" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.800130 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a30c0f33-d7bc-456c-be27-26e860ca8f28-combined-ca-bundle\") pod \"heat-engine-79b68c69ff-kvztw\" (UID: \"a30c0f33-d7bc-456c-be27-26e860ca8f28\") " pod="openstack/heat-engine-79b68c69ff-kvztw" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.800162 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92e38875-6121-4c92-b21b-62a280aa8948-public-tls-certs\") pod \"heat-cfnapi-56cbbf589-6mqq4\" (UID: \"92e38875-6121-4c92-b21b-62a280aa8948\") " pod="openstack/heat-cfnapi-56cbbf589-6mqq4" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.800280 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a30c0f33-d7bc-456c-be27-26e860ca8f28-config-data-custom\") pod \"heat-engine-79b68c69ff-kvztw\" (UID: \"a30c0f33-d7bc-456c-be27-26e860ca8f28\") " pod="openstack/heat-engine-79b68c69ff-kvztw" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.800300 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92e38875-6121-4c92-b21b-62a280aa8948-config-data\") pod \"heat-cfnapi-56cbbf589-6mqq4\" (UID: \"92e38875-6121-4c92-b21b-62a280aa8948\") " pod="openstack/heat-cfnapi-56cbbf589-6mqq4" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.800341 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92e38875-6121-4c92-b21b-62a280aa8948-config-data-custom\") pod \"heat-cfnapi-56cbbf589-6mqq4\" (UID: \"92e38875-6121-4c92-b21b-62a280aa8948\") " pod="openstack/heat-cfnapi-56cbbf589-6mqq4" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.800406 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/31eea135-16c2-46e7-860b-418c61ef127e-combined-ca-bundle\") pod \"heat-api-6b7dd55484-jsksg\" (UID: \"31eea135-16c2-46e7-860b-418c61ef127e\") " pod="openstack/heat-api-6b7dd55484-jsksg" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.800466 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/92e38875-6121-4c92-b21b-62a280aa8948-internal-tls-certs\") pod \"heat-cfnapi-56cbbf589-6mqq4\" (UID: \"92e38875-6121-4c92-b21b-62a280aa8948\") " pod="openstack/heat-cfnapi-56cbbf589-6mqq4" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.800495 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lgh5\" (UniqueName: \"kubernetes.io/projected/31eea135-16c2-46e7-860b-418c61ef127e-kube-api-access-2lgh5\") pod \"heat-api-6b7dd55484-jsksg\" (UID: \"31eea135-16c2-46e7-860b-418c61ef127e\") " pod="openstack/heat-api-6b7dd55484-jsksg" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.800519 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31eea135-16c2-46e7-860b-418c61ef127e-config-data\") pod \"heat-api-6b7dd55484-jsksg\" (UID: \"31eea135-16c2-46e7-860b-418c61ef127e\") " pod="openstack/heat-api-6b7dd55484-jsksg" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.805770 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a30c0f33-d7bc-456c-be27-26e860ca8f28-combined-ca-bundle\") pod \"heat-engine-79b68c69ff-kvztw\" (UID: \"a30c0f33-d7bc-456c-be27-26e860ca8f28\") " pod="openstack/heat-engine-79b68c69ff-kvztw" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.807596 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a30c0f33-d7bc-456c-be27-26e860ca8f28-config-data-custom\") pod \"heat-engine-79b68c69ff-kvztw\" (UID: \"a30c0f33-d7bc-456c-be27-26e860ca8f28\") " pod="openstack/heat-engine-79b68c69ff-kvztw" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.819472 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a30c0f33-d7bc-456c-be27-26e860ca8f28-config-data\") pod \"heat-engine-79b68c69ff-kvztw\" (UID: \"a30c0f33-d7bc-456c-be27-26e860ca8f28\") " pod="openstack/heat-engine-79b68c69ff-kvztw" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.820027 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zln4p\" (UniqueName: \"kubernetes.io/projected/a30c0f33-d7bc-456c-be27-26e860ca8f28-kube-api-access-zln4p\") pod \"heat-engine-79b68c69ff-kvztw\" (UID: \"a30c0f33-d7bc-456c-be27-26e860ca8f28\") " pod="openstack/heat-engine-79b68c69ff-kvztw" Dec 03 11:35:39 crc kubenswrapper[4702]: I1203 11:35:39.846133 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-79b68c69ff-kvztw" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.147037 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt22q\" (UniqueName: \"kubernetes.io/projected/92e38875-6121-4c92-b21b-62a280aa8948-kube-api-access-dt22q\") pod \"heat-cfnapi-56cbbf589-6mqq4\" (UID: \"92e38875-6121-4c92-b21b-62a280aa8948\") " pod="openstack/heat-cfnapi-56cbbf589-6mqq4" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.150971 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31eea135-16c2-46e7-860b-418c61ef127e-config-data-custom\") pod \"heat-api-6b7dd55484-jsksg\" (UID: \"31eea135-16c2-46e7-860b-418c61ef127e\") " pod="openstack/heat-api-6b7dd55484-jsksg" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.151522 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31eea135-16c2-46e7-860b-418c61ef127e-public-tls-certs\") pod \"heat-api-6b7dd55484-jsksg\" (UID: \"31eea135-16c2-46e7-860b-418c61ef127e\") " pod="openstack/heat-api-6b7dd55484-jsksg" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.152069 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e38875-6121-4c92-b21b-62a280aa8948-combined-ca-bundle\") pod \"heat-cfnapi-56cbbf589-6mqq4\" (UID: \"92e38875-6121-4c92-b21b-62a280aa8948\") " pod="openstack/heat-cfnapi-56cbbf589-6mqq4" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.152188 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/31eea135-16c2-46e7-860b-418c61ef127e-internal-tls-certs\") pod \"heat-api-6b7dd55484-jsksg\" (UID: \"31eea135-16c2-46e7-860b-418c61ef127e\") " pod="openstack/heat-api-6b7dd55484-jsksg" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.152549 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92e38875-6121-4c92-b21b-62a280aa8948-public-tls-certs\") pod \"heat-cfnapi-56cbbf589-6mqq4\" (UID: \"92e38875-6121-4c92-b21b-62a280aa8948\") " pod="openstack/heat-cfnapi-56cbbf589-6mqq4" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.152751 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92e38875-6121-4c92-b21b-62a280aa8948-config-data\") pod \"heat-cfnapi-56cbbf589-6mqq4\" (UID: \"92e38875-6121-4c92-b21b-62a280aa8948\") " pod="openstack/heat-cfnapi-56cbbf589-6mqq4" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.153075 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92e38875-6121-4c92-b21b-62a280aa8948-config-data-custom\") pod \"heat-cfnapi-56cbbf589-6mqq4\" (UID: \"92e38875-6121-4c92-b21b-62a280aa8948\") " pod="openstack/heat-cfnapi-56cbbf589-6mqq4" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.153472 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31eea135-16c2-46e7-860b-418c61ef127e-combined-ca-bundle\") pod \"heat-api-6b7dd55484-jsksg\" (UID: \"31eea135-16c2-46e7-860b-418c61ef127e\") " 
pod="openstack/heat-api-6b7dd55484-jsksg" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.153559 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/92e38875-6121-4c92-b21b-62a280aa8948-internal-tls-certs\") pod \"heat-cfnapi-56cbbf589-6mqq4\" (UID: \"92e38875-6121-4c92-b21b-62a280aa8948\") " pod="openstack/heat-cfnapi-56cbbf589-6mqq4" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.153587 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lgh5\" (UniqueName: \"kubernetes.io/projected/31eea135-16c2-46e7-860b-418c61ef127e-kube-api-access-2lgh5\") pod \"heat-api-6b7dd55484-jsksg\" (UID: \"31eea135-16c2-46e7-860b-418c61ef127e\") " pod="openstack/heat-api-6b7dd55484-jsksg" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.153617 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31eea135-16c2-46e7-860b-418c61ef127e-config-data\") pod \"heat-api-6b7dd55484-jsksg\" (UID: \"31eea135-16c2-46e7-860b-418c61ef127e\") " pod="openstack/heat-api-6b7dd55484-jsksg" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.157584 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/31eea135-16c2-46e7-860b-418c61ef127e-internal-tls-certs\") pod \"heat-api-6b7dd55484-jsksg\" (UID: \"31eea135-16c2-46e7-860b-418c61ef127e\") " pod="openstack/heat-api-6b7dd55484-jsksg" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.158139 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/31eea135-16c2-46e7-860b-418c61ef127e-public-tls-certs\") pod \"heat-api-6b7dd55484-jsksg\" (UID: \"31eea135-16c2-46e7-860b-418c61ef127e\") " pod="openstack/heat-api-6b7dd55484-jsksg" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.158816 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31eea135-16c2-46e7-860b-418c61ef127e-config-data-custom\") pod \"heat-api-6b7dd55484-jsksg\" (UID: \"31eea135-16c2-46e7-860b-418c61ef127e\") " pod="openstack/heat-api-6b7dd55484-jsksg" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.158933 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/92e38875-6121-4c92-b21b-62a280aa8948-public-tls-certs\") pod \"heat-cfnapi-56cbbf589-6mqq4\" (UID: \"92e38875-6121-4c92-b21b-62a280aa8948\") " pod="openstack/heat-cfnapi-56cbbf589-6mqq4" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.164625 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/92e38875-6121-4c92-b21b-62a280aa8948-internal-tls-certs\") pod \"heat-cfnapi-56cbbf589-6mqq4\" (UID: \"92e38875-6121-4c92-b21b-62a280aa8948\") " pod="openstack/heat-cfnapi-56cbbf589-6mqq4" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.165629 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92e38875-6121-4c92-b21b-62a280aa8948-config-data\") pod \"heat-cfnapi-56cbbf589-6mqq4\" (UID: \"92e38875-6121-4c92-b21b-62a280aa8948\") " pod="openstack/heat-cfnapi-56cbbf589-6mqq4" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.165825 4702 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31eea135-16c2-46e7-860b-418c61ef127e-config-data\") pod \"heat-api-6b7dd55484-jsksg\" (UID: \"31eea135-16c2-46e7-860b-418c61ef127e\") " pod="openstack/heat-api-6b7dd55484-jsksg" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.167040 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31eea135-16c2-46e7-860b-418c61ef127e-combined-ca-bundle\") pod \"heat-api-6b7dd55484-jsksg\" (UID: \"31eea135-16c2-46e7-860b-418c61ef127e\") " pod="openstack/heat-api-6b7dd55484-jsksg" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.167056 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e38875-6121-4c92-b21b-62a280aa8948-combined-ca-bundle\") pod \"heat-cfnapi-56cbbf589-6mqq4\" (UID: \"92e38875-6121-4c92-b21b-62a280aa8948\") " pod="openstack/heat-cfnapi-56cbbf589-6mqq4" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.168177 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92e38875-6121-4c92-b21b-62a280aa8948-config-data-custom\") pod \"heat-cfnapi-56cbbf589-6mqq4\" (UID: \"92e38875-6121-4c92-b21b-62a280aa8948\") " pod="openstack/heat-cfnapi-56cbbf589-6mqq4" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.195127 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lgh5\" (UniqueName: \"kubernetes.io/projected/31eea135-16c2-46e7-860b-418c61ef127e-kube-api-access-2lgh5\") pod \"heat-api-6b7dd55484-jsksg\" (UID: \"31eea135-16c2-46e7-860b-418c61ef127e\") " pod="openstack/heat-api-6b7dd55484-jsksg" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.196409 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt22q\" (UniqueName: \"kubernetes.io/projected/92e38875-6121-4c92-b21b-62a280aa8948-kube-api-access-dt22q\") pod \"heat-cfnapi-56cbbf589-6mqq4\" (UID: \"92e38875-6121-4c92-b21b-62a280aa8948\") " pod="openstack/heat-cfnapi-56cbbf589-6mqq4" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.233541 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6b7dd55484-jsksg" Dec 03 11:35:40 crc kubenswrapper[4702]: I1203 11:35:40.264970 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-56cbbf589-6mqq4" Dec 03 11:35:41 crc kubenswrapper[4702]: I1203 11:35:41.281368 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a47f12ae-2287-4da5-b66d-4f8cd64abd69" path="/var/lib/kubelet/pods/a47f12ae-2287-4da5-b66d-4f8cd64abd69/volumes" Dec 03 11:35:41 crc kubenswrapper[4702]: I1203 11:35:41.296038 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-79b68c69ff-kvztw"] Dec 03 11:35:41 crc kubenswrapper[4702]: I1203 11:35:41.620819 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6b7dd55484-jsksg"] Dec 03 11:35:41 crc kubenswrapper[4702]: W1203 11:35:41.645338 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31eea135_16c2_46e7_860b_418c61ef127e.slice/crio-1fd766419b7380f8925542636c1a17c3d3e065ac36e6e833e2a8f022e4cdb2ce WatchSource:0}: Error finding container 1fd766419b7380f8925542636c1a17c3d3e065ac36e6e833e2a8f022e4cdb2ce: Status 404 returned error can't find the container with id 1fd766419b7380f8925542636c1a17c3d3e065ac36e6e833e2a8f022e4cdb2ce Dec 03 11:35:41 crc kubenswrapper[4702]: I1203 11:35:41.767301 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-56cbbf589-6mqq4"] Dec 03 11:35:41 crc kubenswrapper[4702]: W1203 11:35:41.772420 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92e38875_6121_4c92_b21b_62a280aa8948.slice/crio-2b0740683577e58f08faab09516f38043d96fdd434fe19262d0a7eba728274a2 WatchSource:0}: Error finding container 2b0740683577e58f08faab09516f38043d96fdd434fe19262d0a7eba728274a2: Status 404 returned error can't find the container with id 2b0740683577e58f08faab09516f38043d96fdd434fe19262d0a7eba728274a2 Dec 03 11:35:42 crc kubenswrapper[4702]: I1203 11:35:42.642087 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-56cbbf589-6mqq4" event={"ID":"92e38875-6121-4c92-b21b-62a280aa8948","Type":"ContainerStarted","Data":"2b0740683577e58f08faab09516f38043d96fdd434fe19262d0a7eba728274a2"} Dec 03 11:35:42 crc kubenswrapper[4702]: I1203 11:35:42.657672 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-79b68c69ff-kvztw" event={"ID":"a30c0f33-d7bc-456c-be27-26e860ca8f28","Type":"ContainerStarted","Data":"b0d3955eff71f1aab33306f80af14941136741e61d89104987585c4458ad0de7"} Dec 03 11:35:42 crc kubenswrapper[4702]: I1203 11:35:42.660901 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6b7dd55484-jsksg" event={"ID":"31eea135-16c2-46e7-860b-418c61ef127e","Type":"ContainerStarted","Data":"1fd766419b7380f8925542636c1a17c3d3e065ac36e6e833e2a8f022e4cdb2ce"} Dec 03 11:35:43 crc kubenswrapper[4702]: I1203 11:35:43.691920 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-79b68c69ff-kvztw" event={"ID":"a30c0f33-d7bc-456c-be27-26e860ca8f28","Type":"ContainerStarted","Data":"d07bd093a3674db691cb220dc42e9850b9d9f1f0f8bec13a9397f2adfcc1250c"} Dec 03 11:35:43 crc kubenswrapper[4702]: I1203 11:35:43.692344 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-79b68c69ff-kvztw" Dec 03 11:35:43 crc kubenswrapper[4702]: I1203 11:35:43.732507 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-79b68c69ff-kvztw" podStartSLOduration=4.732486893 
podStartE2EDuration="4.732486893s" podCreationTimestamp="2025-12-03 11:35:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:35:43.721396987 +0000 UTC m=+1927.557325461" watchObservedRunningTime="2025-12-03 11:35:43.732486893 +0000 UTC m=+1927.568415357" Dec 03 11:35:46 crc kubenswrapper[4702]: I1203 11:35:46.627417 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j"] Dec 03 11:35:46 crc kubenswrapper[4702]: I1203 11:35:46.629549 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j" Dec 03 11:35:46 crc kubenswrapper[4702]: I1203 11:35:46.633353 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 11:35:46 crc kubenswrapper[4702]: I1203 11:35:46.634186 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 11:35:46 crc kubenswrapper[4702]: I1203 11:35:46.634505 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 11:35:46 crc kubenswrapper[4702]: I1203 11:35:46.642823 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j"] Dec 03 11:35:46 crc kubenswrapper[4702]: I1203 11:35:46.646549 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zmjkp" Dec 03 11:35:46 crc kubenswrapper[4702]: I1203 11:35:46.721577 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j\" (UID: \"f52f9dbe-66ac-479c-b673-7fa2fbaccf71\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j" Dec 03 11:35:46 crc kubenswrapper[4702]: I1203 11:35:46.721898 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j\" (UID: \"f52f9dbe-66ac-479c-b673-7fa2fbaccf71\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j" Dec 03 11:35:46 crc kubenswrapper[4702]: I1203 11:35:46.722044 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dgcx\" (UniqueName: \"kubernetes.io/projected/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-kube-api-access-8dgcx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j\" (UID: \"f52f9dbe-66ac-479c-b673-7fa2fbaccf71\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j" Dec 03 11:35:46 crc kubenswrapper[4702]: I1203 11:35:46.722201 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j\" (UID: \"f52f9dbe-66ac-479c-b673-7fa2fbaccf71\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j" Dec 03 11:35:46 crc kubenswrapper[4702]: I1203 11:35:46.824960 
Dec 03 11:35:46 crc kubenswrapper[4702]: I1203 11:35:46.824960 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j\" (UID: \"f52f9dbe-66ac-479c-b673-7fa2fbaccf71\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j"
Dec 03 11:35:46 crc kubenswrapper[4702]: I1203 11:35:46.825134 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j\" (UID: \"f52f9dbe-66ac-479c-b673-7fa2fbaccf71\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j"
Dec 03 11:35:46 crc kubenswrapper[4702]: I1203 11:35:46.825185 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dgcx\" (UniqueName: \"kubernetes.io/projected/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-kube-api-access-8dgcx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j\" (UID: \"f52f9dbe-66ac-479c-b673-7fa2fbaccf71\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j"
Dec 03 11:35:46 crc kubenswrapper[4702]: I1203 11:35:46.825960 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j\" (UID: \"f52f9dbe-66ac-479c-b673-7fa2fbaccf71\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j"
Dec 03 11:35:46 crc kubenswrapper[4702]: I1203 11:35:46.831509 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j\" (UID: \"f52f9dbe-66ac-479c-b673-7fa2fbaccf71\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j"
Dec 03 11:35:46 crc kubenswrapper[4702]: I1203 11:35:46.840327 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j\" (UID: \"f52f9dbe-66ac-479c-b673-7fa2fbaccf71\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j"
Dec 03 11:35:46 crc kubenswrapper[4702]: I1203 11:35:46.843531 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j\" (UID: \"f52f9dbe-66ac-479c-b673-7fa2fbaccf71\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j"
Dec 03 11:35:46 crc kubenswrapper[4702]: I1203 11:35:46.845390 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dgcx\" (UniqueName: \"kubernetes.io/projected/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-kube-api-access-8dgcx\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j\" (UID: \"f52f9dbe-66ac-479c-b673-7fa2fbaccf71\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j"
Dec 03 11:35:46 crc kubenswrapper[4702]: I1203 11:35:46.968436 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j"
Dec 03 11:35:48 crc kubenswrapper[4702]: I1203 11:35:48.514800 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j"]
Dec 03 11:35:48 crc kubenswrapper[4702]: I1203 11:35:48.815768 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-56cbbf589-6mqq4" event={"ID":"92e38875-6121-4c92-b21b-62a280aa8948","Type":"ContainerStarted","Data":"6adc22f5cd6e8bf6d56dbde86a9a0c15038a4867923ff7b50a69e6a17ce2d9b7"}
Dec 03 11:35:48 crc kubenswrapper[4702]: I1203 11:35:48.816989 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-56cbbf589-6mqq4"
Dec 03 11:35:48 crc kubenswrapper[4702]: I1203 11:35:48.823365 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j" event={"ID":"f52f9dbe-66ac-479c-b673-7fa2fbaccf71","Type":"ContainerStarted","Data":"58ce149bc62d7328c122ae3c82ae25f769d0fc2bb1f500f32f4657604484d982"}
Dec 03 11:35:48 crc kubenswrapper[4702]: I1203 11:35:48.830743 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6b7dd55484-jsksg" event={"ID":"31eea135-16c2-46e7-860b-418c61ef127e","Type":"ContainerStarted","Data":"06e5bd5dd2eeaa330ad35a5e4cfadaaa1eefb71aa5c680bae145a01a8f71c1d7"}
Dec 03 11:35:48 crc kubenswrapper[4702]: I1203 11:35:48.831413 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6b7dd55484-jsksg"
Dec 03 11:35:48 crc kubenswrapper[4702]: I1203 11:35:48.852713 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-56cbbf589-6mqq4" podStartSLOduration=3.905680404 podStartE2EDuration="9.85269015s" podCreationTimestamp="2025-12-03 11:35:39 +0000 UTC" firstStartedPulling="2025-12-03 11:35:41.77538513 +0000 UTC m=+1925.611313594" lastFinishedPulling="2025-12-03 11:35:47.722394876 +0000 UTC m=+1931.558323340" observedRunningTime="2025-12-03 11:35:48.837005523 +0000 UTC m=+1932.672933997" watchObservedRunningTime="2025-12-03 11:35:48.85269015 +0000 UTC m=+1932.688618614"
Dec 03 11:35:48 crc kubenswrapper[4702]: I1203 11:35:48.877056 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6b7dd55484-jsksg" podStartSLOduration=3.800283883 podStartE2EDuration="9.877033714s" podCreationTimestamp="2025-12-03 11:35:39 +0000 UTC" firstStartedPulling="2025-12-03 11:35:41.648150886 +0000 UTC m=+1925.484079350" lastFinishedPulling="2025-12-03 11:35:47.724900727 +0000 UTC m=+1931.560829181" observedRunningTime="2025-12-03 11:35:48.865027552 +0000 UTC m=+1932.700956016" watchObservedRunningTime="2025-12-03 11:35:48.877033714 +0000 UTC m=+1932.712962178"
Dec 03 11:35:51 crc kubenswrapper[4702]: I1203 11:35:51.929182 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03"
Dec 03 11:35:51 crc kubenswrapper[4702]: E1203 11:35:51.929912 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
I1203 11:35:54.935616 4702 generic.go:334] "Generic (PLEG): container finished" podID="33f03183-33e1-4aa1-8a4c-11f8b75297cd" containerID="853595bb66bbbec5fa36cbadb2da5fa4a3f4dce62e4fec8885db08e4cd9fb448" exitCode=0 Dec 03 11:35:54 crc kubenswrapper[4702]: I1203 11:35:54.940600 4702 generic.go:334] "Generic (PLEG): container finished" podID="bda57dc3-3be8-4feb-a987-62c1412de0ad" containerID="3091f2de3c084e29da9425c9f1400e5aa8b5bddba43a347aea9dba27a33be1f6" exitCode=0 Dec 03 11:35:54 crc kubenswrapper[4702]: I1203 11:35:54.944066 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"33f03183-33e1-4aa1-8a4c-11f8b75297cd","Type":"ContainerDied","Data":"853595bb66bbbec5fa36cbadb2da5fa4a3f4dce62e4fec8885db08e4cd9fb448"} Dec 03 11:35:54 crc kubenswrapper[4702]: I1203 11:35:54.944121 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bda57dc3-3be8-4feb-a987-62c1412de0ad","Type":"ContainerDied","Data":"3091f2de3c084e29da9425c9f1400e5aa8b5bddba43a347aea9dba27a33be1f6"} Dec 03 11:35:56 crc kubenswrapper[4702]: I1203 11:35:56.181214 4702 scope.go:117] "RemoveContainer" containerID="6d0ecece3d77561615396eefaec02567322e8c7aff8df432d1ee6b49f4aaa42b" Dec 03 11:35:58 crc kubenswrapper[4702]: I1203 11:35:58.093723 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-6b7dd55484-jsksg" Dec 03 11:35:58 crc kubenswrapper[4702]: I1203 11:35:58.437170 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5df49db6bf-cnm46"] Dec 03 11:35:58 crc kubenswrapper[4702]: I1203 11:35:58.444725 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-5df49db6bf-cnm46" podUID="6fad49cd-d636-43cc-84f7-7c8e0774a93a" containerName="heat-api" containerID="cri-o://beb14e6e68c81110a37a5577e4ccedc4cc03bc33e7b8ba69d46f559f82443abe" gracePeriod=60 Dec 03 11:35:58 crc kubenswrapper[4702]: I1203 11:35:58.457586 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-56cbbf589-6mqq4" Dec 03 11:35:58 crc kubenswrapper[4702]: I1203 11:35:58.574926 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-76f7956d4d-b967j"] Dec 03 11:35:58 crc kubenswrapper[4702]: I1203 11:35:58.575172 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-76f7956d4d-b967j" podUID="d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa" containerName="heat-cfnapi" containerID="cri-o://78575d1b4cdcb62d54fbf3e36fbe1e08f77e1f1c00c756579b087203c7eeb1c3" gracePeriod=60 Dec 03 11:35:59 crc kubenswrapper[4702]: I1203 11:35:59.844423 4702 scope.go:117] "RemoveContainer" containerID="38f6eadf1b9144c46cc2393ac1c4364175bd42eb25b7113aedbb600488c1ea50" Dec 03 11:35:59 crc kubenswrapper[4702]: I1203 11:35:59.916302 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-79b68c69ff-kvztw" Dec 03 11:35:59 crc kubenswrapper[4702]: I1203 11:35:59.995778 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-58868c9476-5hnsv"] Dec 03 11:35:59 crc kubenswrapper[4702]: I1203 11:35:59.995995 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-58868c9476-5hnsv" podUID="1398d8ec-f7ad-4fc9-8fa5-b2c86d507852" containerName="heat-engine" containerID="cri-o://8d835acefa04010698685dc1464e59c5c6c1f4fe38644940ad96df853d62db55" gracePeriod=60 Dec 03 11:36:01 
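
[Annotation: the sequence above is a rolling replacement of the heat-api, heat-cfnapi, and heat-engine pods. Once each new replica's readiness probe reports "ready", the API server issues a DELETE for the superseded pod ("SyncLoop DELETE"), and the kubelet kills its container with the pod's 60-second termination grace period: SIGTERM immediately, escalating to SIGKILL if the process outlives the window (the default is 30s unless terminationGracePeriodSeconds overrides it). A sketch of the equivalent API-side delete, reusing a clientset built as in the previous sketch; pod coordinates are from the log:]

```go
// Sketch only: the API-side delete that produces the
// "Killing container with a grace period ... gracePeriod=60" entries above.
package main

import (
	"context"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
)

func deleteWithGrace(cs kubernetes.Interface) error {
	grace := int64(60) // matches gracePeriod=60 in the log
	// The kubelet sends SIGTERM on delete and escalates to SIGKILL once the
	// grace period expires; the ContainerDied events with exitCode=0 later in
	// the log show these containers shut down cleanly within the window.
	return cs.CoreV1().Pods("openstack").Delete(
		context.TODO(),
		"heat-api-5df49db6bf-cnm46",
		metav1.DeleteOptions{GracePeriodSeconds: &grace},
	)
}
```

[The readiness-probe failures and "ExecSync cmd from runtime service failed ... container is stopping" errors that follow are expected during this window: probes keep firing against containers that are already terminating.]
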
crc kubenswrapper[4702]: I1203 11:36:01.152269 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j" event={"ID":"f52f9dbe-66ac-479c-b673-7fa2fbaccf71","Type":"ContainerStarted","Data":"572fbac7a385d7c4db161d03ad8f138653d9d20a171b5d7c9157885ee59c3691"} Dec 03 11:36:01 crc kubenswrapper[4702]: I1203 11:36:01.156415 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"33f03183-33e1-4aa1-8a4c-11f8b75297cd","Type":"ContainerStarted","Data":"7e9620c3342773ff49bc43a8cd30a8bc315457915efae7cbe4d39fc286730419"} Dec 03 11:36:01 crc kubenswrapper[4702]: I1203 11:36:01.156664 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 03 11:36:01 crc kubenswrapper[4702]: I1203 11:36:01.159579 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bda57dc3-3be8-4feb-a987-62c1412de0ad","Type":"ContainerStarted","Data":"69db51230bb41914dc60951676cd7aa3521e2e23d6a705c4075e2b7df14dade1"} Dec 03 11:36:01 crc kubenswrapper[4702]: I1203 11:36:01.159845 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:36:01 crc kubenswrapper[4702]: I1203 11:36:01.177530 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j" podStartSLOduration=3.664724539 podStartE2EDuration="15.177493073s" podCreationTimestamp="2025-12-03 11:35:46 +0000 UTC" firstStartedPulling="2025-12-03 11:35:48.519029647 +0000 UTC m=+1932.354958111" lastFinishedPulling="2025-12-03 11:36:00.031798191 +0000 UTC m=+1943.867726645" observedRunningTime="2025-12-03 11:36:01.171078831 +0000 UTC m=+1945.007007315" watchObservedRunningTime="2025-12-03 11:36:01.177493073 +0000 UTC m=+1945.013421547" Dec 03 11:36:01 crc kubenswrapper[4702]: I1203 11:36:01.210944 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=44.210919105 podStartE2EDuration="44.210919105s" podCreationTimestamp="2025-12-03 11:35:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:36:01.202044373 +0000 UTC m=+1945.037972837" watchObservedRunningTime="2025-12-03 11:36:01.210919105 +0000 UTC m=+1945.046847569" Dec 03 11:36:01 crc kubenswrapper[4702]: I1203 11:36:01.248879 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=44.248857946 podStartE2EDuration="44.248857946s" podCreationTimestamp="2025-12-03 11:35:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 11:36:01.230075241 +0000 UTC m=+1945.066003725" watchObservedRunningTime="2025-12-03 11:36:01.248857946 +0000 UTC m=+1945.084786410" Dec 03 11:36:02 crc kubenswrapper[4702]: I1203 11:36:02.037369 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-5df49db6bf-cnm46" podUID="6fad49cd-d636-43cc-84f7-7c8e0774a93a" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.0.218:8004/healthcheck\": read tcp 10.217.0.2:43284->10.217.0.218:8004: read: connection reset by peer" Dec 03 11:36:02 crc kubenswrapper[4702]: I1203 11:36:02.047304 4702 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/heat-cfnapi-76f7956d4d-b967j" podUID="d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa" containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.0.219:8000/healthcheck\": read tcp 10.217.0.2:43346->10.217.0.219:8000: read: connection reset by peer" Dec 03 11:36:02 crc kubenswrapper[4702]: I1203 11:36:02.201487 4702 generic.go:334] "Generic (PLEG): container finished" podID="6fad49cd-d636-43cc-84f7-7c8e0774a93a" containerID="beb14e6e68c81110a37a5577e4ccedc4cc03bc33e7b8ba69d46f559f82443abe" exitCode=0 Dec 03 11:36:02 crc kubenswrapper[4702]: I1203 11:36:02.201925 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5df49db6bf-cnm46" event={"ID":"6fad49cd-d636-43cc-84f7-7c8e0774a93a","Type":"ContainerDied","Data":"beb14e6e68c81110a37a5577e4ccedc4cc03bc33e7b8ba69d46f559f82443abe"} Dec 03 11:36:02 crc kubenswrapper[4702]: I1203 11:36:02.208638 4702 generic.go:334] "Generic (PLEG): container finished" podID="d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa" containerID="78575d1b4cdcb62d54fbf3e36fbe1e08f77e1f1c00c756579b087203c7eeb1c3" exitCode=0 Dec 03 11:36:02 crc kubenswrapper[4702]: I1203 11:36:02.210122 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76f7956d4d-b967j" event={"ID":"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa","Type":"ContainerDied","Data":"78575d1b4cdcb62d54fbf3e36fbe1e08f77e1f1c00c756579b087203c7eeb1c3"} Dec 03 11:36:02 crc kubenswrapper[4702]: E1203 11:36:02.473315 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d835acefa04010698685dc1464e59c5c6c1f4fe38644940ad96df853d62db55" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 03 11:36:02 crc kubenswrapper[4702]: E1203 11:36:02.474471 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d835acefa04010698685dc1464e59c5c6c1f4fe38644940ad96df853d62db55" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 03 11:36:02 crc kubenswrapper[4702]: E1203 11:36:02.480101 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d835acefa04010698685dc1464e59c5c6c1f4fe38644940ad96df853d62db55" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 03 11:36:02 crc kubenswrapper[4702]: E1203 11:36:02.480211 4702 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-58868c9476-5hnsv" podUID="1398d8ec-f7ad-4fc9-8fa5-b2c86d507852" containerName="heat-engine" Dec 03 11:36:02 crc kubenswrapper[4702]: I1203 11:36:02.879350 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:36:02 crc kubenswrapper[4702]: I1203 11:36:02.978889 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-public-tls-certs\") pod \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " Dec 03 11:36:02 crc kubenswrapper[4702]: I1203 11:36:02.982429 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-combined-ca-bundle\") pod \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " Dec 03 11:36:02 crc kubenswrapper[4702]: I1203 11:36:02.982474 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-config-data\") pod \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " Dec 03 11:36:02 crc kubenswrapper[4702]: I1203 11:36:02.982524 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s57z7\" (UniqueName: \"kubernetes.io/projected/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-kube-api-access-s57z7\") pod \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " Dec 03 11:36:02 crc kubenswrapper[4702]: I1203 11:36:02.982772 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-config-data-custom\") pod \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " Dec 03 11:36:02 crc kubenswrapper[4702]: I1203 11:36:02.982823 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-internal-tls-certs\") pod \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\" (UID: \"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa\") " Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.008082 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-kube-api-access-s57z7" (OuterVolumeSpecName: "kube-api-access-s57z7") pod "d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa" (UID: "d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa"). InnerVolumeSpecName "kube-api-access-s57z7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.046912 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa" (UID: "d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.102379 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s57z7\" (UniqueName: \"kubernetes.io/projected/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-kube-api-access-s57z7\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.102432 4702 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.231431 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa" (UID: "d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.299923 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-76f7956d4d-b967j" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.315520 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.355134 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa" (UID: "d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.398896 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-config-data" (OuterVolumeSpecName: "config-data") pod "d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa" (UID: "d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.418521 4702 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.418561 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.428892 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa" (UID: "d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.494964 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5df49db6bf-cnm46" event={"ID":"6fad49cd-d636-43cc-84f7-7c8e0774a93a","Type":"ContainerDied","Data":"0f0e478ec360aa48bfda13635430903340e00a91028f0bddc950839af3f38432"} Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.495035 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f0e478ec360aa48bfda13635430903340e00a91028f0bddc950839af3f38432" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.495052 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76f7956d4d-b967j" event={"ID":"d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa","Type":"ContainerDied","Data":"8cfdac583e30b25446aebedf4e8958b8a59d51bfa8357b88f1f55772ce4fe64a"} Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.495103 4702 scope.go:117] "RemoveContainer" containerID="78575d1b4cdcb62d54fbf3e36fbe1e08f77e1f1c00c756579b087203c7eeb1c3" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.506804 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.530299 4702 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.632560 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-internal-tls-certs\") pod \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.632719 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-config-data\") pod \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.632744 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-public-tls-certs\") pod \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.632918 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r866x\" (UniqueName: \"kubernetes.io/projected/6fad49cd-d636-43cc-84f7-7c8e0774a93a-kube-api-access-r866x\") pod \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.632987 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-combined-ca-bundle\") pod \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.633046 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-config-data-custom\") pod \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\" (UID: \"6fad49cd-d636-43cc-84f7-7c8e0774a93a\") " Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.647666 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fad49cd-d636-43cc-84f7-7c8e0774a93a-kube-api-access-r866x" (OuterVolumeSpecName: "kube-api-access-r866x") pod "6fad49cd-d636-43cc-84f7-7c8e0774a93a" (UID: "6fad49cd-d636-43cc-84f7-7c8e0774a93a"). InnerVolumeSpecName "kube-api-access-r866x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.671390 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6fad49cd-d636-43cc-84f7-7c8e0774a93a" (UID: "6fad49cd-d636-43cc-84f7-7c8e0774a93a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.678189 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-76f7956d4d-b967j"] Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.695204 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-76f7956d4d-b967j"] Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.726201 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6fad49cd-d636-43cc-84f7-7c8e0774a93a" (UID: "6fad49cd-d636-43cc-84f7-7c8e0774a93a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.738913 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6fad49cd-d636-43cc-84f7-7c8e0774a93a" (UID: "6fad49cd-d636-43cc-84f7-7c8e0774a93a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.743870 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r866x\" (UniqueName: \"kubernetes.io/projected/6fad49cd-d636-43cc-84f7-7c8e0774a93a-kube-api-access-r866x\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.743905 4702 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.743917 4702 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.743932 4702 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.747943 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6fad49cd-d636-43cc-84f7-7c8e0774a93a" (UID: "6fad49cd-d636-43cc-84f7-7c8e0774a93a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.770010 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-config-data" (OuterVolumeSpecName: "config-data") pod "6fad49cd-d636-43cc-84f7-7c8e0774a93a" (UID: "6fad49cd-d636-43cc-84f7-7c8e0774a93a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.845674 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:03 crc kubenswrapper[4702]: I1203 11:36:03.845718 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fad49cd-d636-43cc-84f7-7c8e0774a93a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:04 crc kubenswrapper[4702]: I1203 11:36:04.347285 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5df49db6bf-cnm46" Dec 03 11:36:04 crc kubenswrapper[4702]: I1203 11:36:04.403458 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5df49db6bf-cnm46"] Dec 03 11:36:04 crc kubenswrapper[4702]: I1203 11:36:04.427479 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5df49db6bf-cnm46"] Dec 03 11:36:04 crc kubenswrapper[4702]: I1203 11:36:04.934777 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:36:04 crc kubenswrapper[4702]: E1203 11:36:04.935158 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:36:04 crc kubenswrapper[4702]: I1203 11:36:04.946107 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fad49cd-d636-43cc-84f7-7c8e0774a93a" path="/var/lib/kubelet/pods/6fad49cd-d636-43cc-84f7-7c8e0774a93a/volumes" Dec 03 11:36:04 crc kubenswrapper[4702]: I1203 11:36:04.949180 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa" path="/var/lib/kubelet/pods/d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa/volumes" Dec 03 11:36:06 crc kubenswrapper[4702]: I1203 11:36:06.830617 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 11:36:10 crc kubenswrapper[4702]: I1203 11:36:10.864910 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-7k9st"] Dec 03 11:36:10 crc kubenswrapper[4702]: I1203 11:36:10.886731 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-7k9st"] Dec 03 11:36:10 crc kubenswrapper[4702]: I1203 11:36:10.950600 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd" path="/var/lib/kubelet/pods/4f6d6fb9-a85a-45cf-b203-1f0e7f9caacd/volumes" Dec 03 11:36:10 crc kubenswrapper[4702]: I1203 11:36:10.974104 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-6tgbz"] Dec 03 11:36:10 crc kubenswrapper[4702]: E1203 11:36:10.974912 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa" containerName="heat-cfnapi" Dec 03 11:36:10 crc kubenswrapper[4702]: I1203 11:36:10.974952 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa" containerName="heat-cfnapi" Dec 03 11:36:10 crc kubenswrapper[4702]: E1203 11:36:10.974997 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fad49cd-d636-43cc-84f7-7c8e0774a93a" containerName="heat-api" Dec 03 11:36:10 crc kubenswrapper[4702]: I1203 11:36:10.975007 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fad49cd-d636-43cc-84f7-7c8e0774a93a" containerName="heat-api" Dec 03 11:36:10 crc kubenswrapper[4702]: I1203 11:36:10.975314 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fad49cd-d636-43cc-84f7-7c8e0774a93a" containerName="heat-api" Dec 03 11:36:10 crc kubenswrapper[4702]: I1203 11:36:10.975344 4702 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d93d4d6e-a772-4bf6-94ae-dbeb50fcd5aa" containerName="heat-cfnapi" Dec 03 11:36:10 crc kubenswrapper[4702]: I1203 11:36:10.976452 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-6tgbz" Dec 03 11:36:10 crc kubenswrapper[4702]: I1203 11:36:10.984695 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 11:36:10 crc kubenswrapper[4702]: I1203 11:36:10.989440 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-6tgbz"] Dec 03 11:36:11 crc kubenswrapper[4702]: I1203 11:36:11.073818 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a7b79c-25b3-4bd5-ab40-d9654c454997-config-data\") pod \"aodh-db-sync-6tgbz\" (UID: \"82a7b79c-25b3-4bd5-ab40-d9654c454997\") " pod="openstack/aodh-db-sync-6tgbz" Dec 03 11:36:11 crc kubenswrapper[4702]: I1203 11:36:11.073953 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9cvz\" (UniqueName: \"kubernetes.io/projected/82a7b79c-25b3-4bd5-ab40-d9654c454997-kube-api-access-b9cvz\") pod \"aodh-db-sync-6tgbz\" (UID: \"82a7b79c-25b3-4bd5-ab40-d9654c454997\") " pod="openstack/aodh-db-sync-6tgbz" Dec 03 11:36:11 crc kubenswrapper[4702]: I1203 11:36:11.074019 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a7b79c-25b3-4bd5-ab40-d9654c454997-combined-ca-bundle\") pod \"aodh-db-sync-6tgbz\" (UID: \"82a7b79c-25b3-4bd5-ab40-d9654c454997\") " pod="openstack/aodh-db-sync-6tgbz" Dec 03 11:36:11 crc kubenswrapper[4702]: I1203 11:36:11.074064 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82a7b79c-25b3-4bd5-ab40-d9654c454997-scripts\") pod \"aodh-db-sync-6tgbz\" (UID: \"82a7b79c-25b3-4bd5-ab40-d9654c454997\") " pod="openstack/aodh-db-sync-6tgbz" Dec 03 11:36:11 crc kubenswrapper[4702]: I1203 11:36:11.175865 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82a7b79c-25b3-4bd5-ab40-d9654c454997-scripts\") pod \"aodh-db-sync-6tgbz\" (UID: \"82a7b79c-25b3-4bd5-ab40-d9654c454997\") " pod="openstack/aodh-db-sync-6tgbz" Dec 03 11:36:11 crc kubenswrapper[4702]: I1203 11:36:11.176117 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a7b79c-25b3-4bd5-ab40-d9654c454997-config-data\") pod \"aodh-db-sync-6tgbz\" (UID: \"82a7b79c-25b3-4bd5-ab40-d9654c454997\") " pod="openstack/aodh-db-sync-6tgbz" Dec 03 11:36:11 crc kubenswrapper[4702]: I1203 11:36:11.176220 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9cvz\" (UniqueName: \"kubernetes.io/projected/82a7b79c-25b3-4bd5-ab40-d9654c454997-kube-api-access-b9cvz\") pod \"aodh-db-sync-6tgbz\" (UID: \"82a7b79c-25b3-4bd5-ab40-d9654c454997\") " pod="openstack/aodh-db-sync-6tgbz" Dec 03 11:36:11 crc kubenswrapper[4702]: I1203 11:36:11.176331 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a7b79c-25b3-4bd5-ab40-d9654c454997-combined-ca-bundle\") pod \"aodh-db-sync-6tgbz\" (UID: \"82a7b79c-25b3-4bd5-ab40-d9654c454997\") " 
pod="openstack/aodh-db-sync-6tgbz" Dec 03 11:36:11 crc kubenswrapper[4702]: I1203 11:36:11.183590 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a7b79c-25b3-4bd5-ab40-d9654c454997-combined-ca-bundle\") pod \"aodh-db-sync-6tgbz\" (UID: \"82a7b79c-25b3-4bd5-ab40-d9654c454997\") " pod="openstack/aodh-db-sync-6tgbz" Dec 03 11:36:11 crc kubenswrapper[4702]: I1203 11:36:11.184036 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a7b79c-25b3-4bd5-ab40-d9654c454997-config-data\") pod \"aodh-db-sync-6tgbz\" (UID: \"82a7b79c-25b3-4bd5-ab40-d9654c454997\") " pod="openstack/aodh-db-sync-6tgbz" Dec 03 11:36:11 crc kubenswrapper[4702]: I1203 11:36:11.184246 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82a7b79c-25b3-4bd5-ab40-d9654c454997-scripts\") pod \"aodh-db-sync-6tgbz\" (UID: \"82a7b79c-25b3-4bd5-ab40-d9654c454997\") " pod="openstack/aodh-db-sync-6tgbz" Dec 03 11:36:11 crc kubenswrapper[4702]: I1203 11:36:11.198484 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9cvz\" (UniqueName: \"kubernetes.io/projected/82a7b79c-25b3-4bd5-ab40-d9654c454997-kube-api-access-b9cvz\") pod \"aodh-db-sync-6tgbz\" (UID: \"82a7b79c-25b3-4bd5-ab40-d9654c454997\") " pod="openstack/aodh-db-sync-6tgbz" Dec 03 11:36:11 crc kubenswrapper[4702]: I1203 11:36:11.304381 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-6tgbz" Dec 03 11:36:11 crc kubenswrapper[4702]: I1203 11:36:11.918349 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-6tgbz"] Dec 03 11:36:12 crc kubenswrapper[4702]: E1203 11:36:12.470177 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d835acefa04010698685dc1464e59c5c6c1f4fe38644940ad96df853d62db55" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 03 11:36:12 crc kubenswrapper[4702]: E1203 11:36:12.472195 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d835acefa04010698685dc1464e59c5c6c1f4fe38644940ad96df853d62db55" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 03 11:36:12 crc kubenswrapper[4702]: E1203 11:36:12.474005 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8d835acefa04010698685dc1464e59c5c6c1f4fe38644940ad96df853d62db55" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 03 11:36:12 crc kubenswrapper[4702]: E1203 11:36:12.474112 4702 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-58868c9476-5hnsv" podUID="1398d8ec-f7ad-4fc9-8fa5-b2c86d507852" containerName="heat-engine" Dec 03 11:36:12 crc kubenswrapper[4702]: I1203 11:36:12.508404 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-6tgbz" 
event={"ID":"82a7b79c-25b3-4bd5-ab40-d9654c454997","Type":"ContainerStarted","Data":"5dc4196c72637bc5c3182de45517accc26ce45dc7683e775abfff461662fb1a7"} Dec 03 11:36:14 crc kubenswrapper[4702]: I1203 11:36:14.539495 4702 generic.go:334] "Generic (PLEG): container finished" podID="f52f9dbe-66ac-479c-b673-7fa2fbaccf71" containerID="572fbac7a385d7c4db161d03ad8f138653d9d20a171b5d7c9157885ee59c3691" exitCode=0 Dec 03 11:36:14 crc kubenswrapper[4702]: I1203 11:36:14.539558 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j" event={"ID":"f52f9dbe-66ac-479c-b673-7fa2fbaccf71","Type":"ContainerDied","Data":"572fbac7a385d7c4db161d03ad8f138653d9d20a171b5d7c9157885ee59c3691"} Dec 03 11:36:16 crc kubenswrapper[4702]: I1203 11:36:16.576343 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j" event={"ID":"f52f9dbe-66ac-479c-b673-7fa2fbaccf71","Type":"ContainerDied","Data":"58ce149bc62d7328c122ae3c82ae25f769d0fc2bb1f500f32f4657604484d982"} Dec 03 11:36:16 crc kubenswrapper[4702]: I1203 11:36:16.576962 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58ce149bc62d7328c122ae3c82ae25f769d0fc2bb1f500f32f4657604484d982" Dec 03 11:36:16 crc kubenswrapper[4702]: I1203 11:36:16.579710 4702 generic.go:334] "Generic (PLEG): container finished" podID="1398d8ec-f7ad-4fc9-8fa5-b2c86d507852" containerID="8d835acefa04010698685dc1464e59c5c6c1f4fe38644940ad96df853d62db55" exitCode=0 Dec 03 11:36:16 crc kubenswrapper[4702]: I1203 11:36:16.579782 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-58868c9476-5hnsv" event={"ID":"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852","Type":"ContainerDied","Data":"8d835acefa04010698685dc1464e59c5c6c1f4fe38644940ad96df853d62db55"} Dec 03 11:36:16 crc kubenswrapper[4702]: I1203 11:36:16.767519 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j" Dec 03 11:36:16 crc kubenswrapper[4702]: I1203 11:36:16.939293 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-58868c9476-5hnsv" Dec 03 11:36:16 crc kubenswrapper[4702]: I1203 11:36:16.964515 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dgcx\" (UniqueName: \"kubernetes.io/projected/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-kube-api-access-8dgcx\") pod \"f52f9dbe-66ac-479c-b673-7fa2fbaccf71\" (UID: \"f52f9dbe-66ac-479c-b673-7fa2fbaccf71\") " Dec 03 11:36:16 crc kubenswrapper[4702]: I1203 11:36:16.964675 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-ssh-key\") pod \"f52f9dbe-66ac-479c-b673-7fa2fbaccf71\" (UID: \"f52f9dbe-66ac-479c-b673-7fa2fbaccf71\") " Dec 03 11:36:16 crc kubenswrapper[4702]: I1203 11:36:16.964748 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-repo-setup-combined-ca-bundle\") pod \"f52f9dbe-66ac-479c-b673-7fa2fbaccf71\" (UID: \"f52f9dbe-66ac-479c-b673-7fa2fbaccf71\") " Dec 03 11:36:16 crc kubenswrapper[4702]: I1203 11:36:16.964893 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-inventory\") pod \"f52f9dbe-66ac-479c-b673-7fa2fbaccf71\" (UID: \"f52f9dbe-66ac-479c-b673-7fa2fbaccf71\") " Dec 03 11:36:16 crc kubenswrapper[4702]: I1203 11:36:16.974562 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "f52f9dbe-66ac-479c-b673-7fa2fbaccf71" (UID: "f52f9dbe-66ac-479c-b673-7fa2fbaccf71"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:16 crc kubenswrapper[4702]: I1203 11:36:16.976192 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-kube-api-access-8dgcx" (OuterVolumeSpecName: "kube-api-access-8dgcx") pod "f52f9dbe-66ac-479c-b673-7fa2fbaccf71" (UID: "f52f9dbe-66ac-479c-b673-7fa2fbaccf71"). InnerVolumeSpecName "kube-api-access-8dgcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.020290 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f52f9dbe-66ac-479c-b673-7fa2fbaccf71" (UID: "f52f9dbe-66ac-479c-b673-7fa2fbaccf71"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.042048 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-inventory" (OuterVolumeSpecName: "inventory") pod "f52f9dbe-66ac-479c-b673-7fa2fbaccf71" (UID: "f52f9dbe-66ac-479c-b673-7fa2fbaccf71"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.067497 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tzwl\" (UniqueName: \"kubernetes.io/projected/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-kube-api-access-4tzwl\") pod \"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852\" (UID: \"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852\") " Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.067683 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-combined-ca-bundle\") pod \"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852\" (UID: \"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852\") " Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.067705 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-config-data-custom\") pod \"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852\" (UID: \"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852\") " Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.067736 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-config-data\") pod \"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852\" (UID: \"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852\") " Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.068555 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dgcx\" (UniqueName: \"kubernetes.io/projected/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-kube-api-access-8dgcx\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.068573 4702 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.068608 4702 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.068647 4702 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f52f9dbe-66ac-479c-b673-7fa2fbaccf71-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.070917 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-kube-api-access-4tzwl" (OuterVolumeSpecName: "kube-api-access-4tzwl") pod "1398d8ec-f7ad-4fc9-8fa5-b2c86d507852" (UID: "1398d8ec-f7ad-4fc9-8fa5-b2c86d507852"). InnerVolumeSpecName "kube-api-access-4tzwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.075523 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1398d8ec-f7ad-4fc9-8fa5-b2c86d507852" (UID: "1398d8ec-f7ad-4fc9-8fa5-b2c86d507852"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.105961 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1398d8ec-f7ad-4fc9-8fa5-b2c86d507852" (UID: "1398d8ec-f7ad-4fc9-8fa5-b2c86d507852"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.129679 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-config-data" (OuterVolumeSpecName: "config-data") pod "1398d8ec-f7ad-4fc9-8fa5-b2c86d507852" (UID: "1398d8ec-f7ad-4fc9-8fa5-b2c86d507852"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.171669 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tzwl\" (UniqueName: \"kubernetes.io/projected/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-kube-api-access-4tzwl\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.171709 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.171722 4702 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.171761 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.593367 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-6tgbz" event={"ID":"82a7b79c-25b3-4bd5-ab40-d9654c454997","Type":"ContainerStarted","Data":"5a478d692d47e14d886868adc7feb0d40d44c76b680c276ad3b4f414b268636a"} Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.595345 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-58868c9476-5hnsv" event={"ID":"1398d8ec-f7ad-4fc9-8fa5-b2c86d507852","Type":"ContainerDied","Data":"4d657cd1f5984aa69a02a0e558e8018930811ea1a137298708a4a5da20893108"} Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.595376 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.595398 4702 scope.go:117] "RemoveContainer" containerID="8d835acefa04010698685dc1464e59c5c6c1f4fe38644940ad96df853d62db55" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.595418 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-58868c9476-5hnsv" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.630106 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-6tgbz" podStartSLOduration=2.9898595500000003 podStartE2EDuration="7.630084397s" podCreationTimestamp="2025-12-03 11:36:10 +0000 UTC" firstStartedPulling="2025-12-03 11:36:11.928044188 +0000 UTC m=+1955.763972652" lastFinishedPulling="2025-12-03 11:36:16.568269035 +0000 UTC m=+1960.404197499" observedRunningTime="2025-12-03 11:36:17.618323022 +0000 UTC m=+1961.454251486" watchObservedRunningTime="2025-12-03 11:36:17.630084397 +0000 UTC m=+1961.466012861" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.661008 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-58868c9476-5hnsv"] Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.672891 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-58868c9476-5hnsv"] Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.860295 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-wjztq"] Dec 03 11:36:17 crc kubenswrapper[4702]: E1203 11:36:17.861132 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1398d8ec-f7ad-4fc9-8fa5-b2c86d507852" containerName="heat-engine" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.861162 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="1398d8ec-f7ad-4fc9-8fa5-b2c86d507852" containerName="heat-engine" Dec 03 11:36:17 crc kubenswrapper[4702]: E1203 11:36:17.861211 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f52f9dbe-66ac-479c-b673-7fa2fbaccf71" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.861221 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52f9dbe-66ac-479c-b673-7fa2fbaccf71" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.861585 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="1398d8ec-f7ad-4fc9-8fa5-b2c86d507852" containerName="heat-engine" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.861616 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="f52f9dbe-66ac-479c-b673-7fa2fbaccf71" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.862840 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wjztq" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.864798 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zmjkp" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.865431 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.865619 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.865765 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.889824 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-wjztq"] Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.928393 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:36:17 crc kubenswrapper[4702]: E1203 11:36:17.928741 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.991946 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90df2c2b-def7-4eda-896e-a551bfecb98c-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wjztq\" (UID: \"90df2c2b-def7-4eda-896e-a551bfecb98c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wjztq" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.992044 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90df2c2b-def7-4eda-896e-a551bfecb98c-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wjztq\" (UID: \"90df2c2b-def7-4eda-896e-a551bfecb98c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wjztq" Dec 03 11:36:17 crc kubenswrapper[4702]: I1203 11:36:17.992187 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dfwv\" (UniqueName: \"kubernetes.io/projected/90df2c2b-def7-4eda-896e-a551bfecb98c-kube-api-access-4dfwv\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wjztq\" (UID: \"90df2c2b-def7-4eda-896e-a551bfecb98c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wjztq" Dec 03 11:36:18 crc kubenswrapper[4702]: I1203 11:36:18.093961 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dfwv\" (UniqueName: \"kubernetes.io/projected/90df2c2b-def7-4eda-896e-a551bfecb98c-kube-api-access-4dfwv\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wjztq\" (UID: \"90df2c2b-def7-4eda-896e-a551bfecb98c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wjztq" Dec 03 11:36:18 crc kubenswrapper[4702]: I1203 11:36:18.094339 4702 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90df2c2b-def7-4eda-896e-a551bfecb98c-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wjztq\" (UID: \"90df2c2b-def7-4eda-896e-a551bfecb98c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wjztq" Dec 03 11:36:18 crc kubenswrapper[4702]: I1203 11:36:18.094881 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90df2c2b-def7-4eda-896e-a551bfecb98c-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wjztq\" (UID: \"90df2c2b-def7-4eda-896e-a551bfecb98c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wjztq" Dec 03 11:36:18 crc kubenswrapper[4702]: I1203 11:36:18.099238 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90df2c2b-def7-4eda-896e-a551bfecb98c-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wjztq\" (UID: \"90df2c2b-def7-4eda-896e-a551bfecb98c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wjztq" Dec 03 11:36:18 crc kubenswrapper[4702]: I1203 11:36:18.100031 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90df2c2b-def7-4eda-896e-a551bfecb98c-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wjztq\" (UID: \"90df2c2b-def7-4eda-896e-a551bfecb98c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wjztq" Dec 03 11:36:18 crc kubenswrapper[4702]: I1203 11:36:18.116455 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dfwv\" (UniqueName: \"kubernetes.io/projected/90df2c2b-def7-4eda-896e-a551bfecb98c-kube-api-access-4dfwv\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-wjztq\" (UID: \"90df2c2b-def7-4eda-896e-a551bfecb98c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wjztq" Dec 03 11:36:18 crc kubenswrapper[4702]: I1203 11:36:18.157004 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 03 11:36:18 crc kubenswrapper[4702]: I1203 11:36:18.171956 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 03 11:36:18 crc kubenswrapper[4702]: I1203 11:36:18.182304 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wjztq" Dec 03 11:36:18 crc kubenswrapper[4702]: I1203 11:36:18.969292 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1398d8ec-f7ad-4fc9-8fa5-b2c86d507852" path="/var/lib/kubelet/pods/1398d8ec-f7ad-4fc9-8fa5-b2c86d507852/volumes" Dec 03 11:36:18 crc kubenswrapper[4702]: I1203 11:36:18.988346 4702 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 11:36:19 crc kubenswrapper[4702]: I1203 11:36:19.007592 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-wjztq"] Dec 03 11:36:19 crc kubenswrapper[4702]: I1203 11:36:19.627193 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wjztq" event={"ID":"90df2c2b-def7-4eda-896e-a551bfecb98c","Type":"ContainerStarted","Data":"6da00662f393aff901fbaba996f120846bb81e40aaba0e8a1b663b1aad23ff16"} Dec 03 11:36:20 crc kubenswrapper[4702]: I1203 11:36:20.641450 4702 generic.go:334] "Generic (PLEG): container finished" podID="82a7b79c-25b3-4bd5-ab40-d9654c454997" containerID="5a478d692d47e14d886868adc7feb0d40d44c76b680c276ad3b4f414b268636a" exitCode=0 Dec 03 11:36:20 crc kubenswrapper[4702]: I1203 11:36:20.641543 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-6tgbz" event={"ID":"82a7b79c-25b3-4bd5-ab40-d9654c454997","Type":"ContainerDied","Data":"5a478d692d47e14d886868adc7feb0d40d44c76b680c276ad3b4f414b268636a"} Dec 03 11:36:20 crc kubenswrapper[4702]: I1203 11:36:20.644207 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wjztq" event={"ID":"90df2c2b-def7-4eda-896e-a551bfecb98c","Type":"ContainerStarted","Data":"3e3ac038ed09d37fa4d0b3db7fa93836d41613dafbec088b7fb8df5d15754c86"} Dec 03 11:36:20 crc kubenswrapper[4702]: I1203 11:36:20.697902 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wjztq" podStartSLOduration=3.246678895 podStartE2EDuration="3.697878266s" podCreationTimestamp="2025-12-03 11:36:17 +0000 UTC" firstStartedPulling="2025-12-03 11:36:18.988018755 +0000 UTC m=+1962.823947219" lastFinishedPulling="2025-12-03 11:36:19.439218126 +0000 UTC m=+1963.275146590" observedRunningTime="2025-12-03 11:36:20.687176512 +0000 UTC m=+1964.523104986" watchObservedRunningTime="2025-12-03 11:36:20.697878266 +0000 UTC m=+1964.533806730" Dec 03 11:36:22 crc kubenswrapper[4702]: I1203 11:36:22.134724 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-6tgbz" Dec 03 11:36:22 crc kubenswrapper[4702]: I1203 11:36:22.210597 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a7b79c-25b3-4bd5-ab40-d9654c454997-combined-ca-bundle\") pod \"82a7b79c-25b3-4bd5-ab40-d9654c454997\" (UID: \"82a7b79c-25b3-4bd5-ab40-d9654c454997\") " Dec 03 11:36:22 crc kubenswrapper[4702]: I1203 11:36:22.211040 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9cvz\" (UniqueName: \"kubernetes.io/projected/82a7b79c-25b3-4bd5-ab40-d9654c454997-kube-api-access-b9cvz\") pod \"82a7b79c-25b3-4bd5-ab40-d9654c454997\" (UID: \"82a7b79c-25b3-4bd5-ab40-d9654c454997\") " Dec 03 11:36:22 crc kubenswrapper[4702]: I1203 11:36:22.211211 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82a7b79c-25b3-4bd5-ab40-d9654c454997-scripts\") pod \"82a7b79c-25b3-4bd5-ab40-d9654c454997\" (UID: \"82a7b79c-25b3-4bd5-ab40-d9654c454997\") " Dec 03 11:36:22 crc kubenswrapper[4702]: I1203 11:36:22.211492 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a7b79c-25b3-4bd5-ab40-d9654c454997-config-data\") pod \"82a7b79c-25b3-4bd5-ab40-d9654c454997\" (UID: \"82a7b79c-25b3-4bd5-ab40-d9654c454997\") " Dec 03 11:36:22 crc kubenswrapper[4702]: I1203 11:36:22.219065 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82a7b79c-25b3-4bd5-ab40-d9654c454997-kube-api-access-b9cvz" (OuterVolumeSpecName: "kube-api-access-b9cvz") pod "82a7b79c-25b3-4bd5-ab40-d9654c454997" (UID: "82a7b79c-25b3-4bd5-ab40-d9654c454997"). InnerVolumeSpecName "kube-api-access-b9cvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:36:22 crc kubenswrapper[4702]: I1203 11:36:22.220414 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82a7b79c-25b3-4bd5-ab40-d9654c454997-scripts" (OuterVolumeSpecName: "scripts") pod "82a7b79c-25b3-4bd5-ab40-d9654c454997" (UID: "82a7b79c-25b3-4bd5-ab40-d9654c454997"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:22 crc kubenswrapper[4702]: I1203 11:36:22.248367 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82a7b79c-25b3-4bd5-ab40-d9654c454997-config-data" (OuterVolumeSpecName: "config-data") pod "82a7b79c-25b3-4bd5-ab40-d9654c454997" (UID: "82a7b79c-25b3-4bd5-ab40-d9654c454997"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:22 crc kubenswrapper[4702]: I1203 11:36:22.248850 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82a7b79c-25b3-4bd5-ab40-d9654c454997-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82a7b79c-25b3-4bd5-ab40-d9654c454997" (UID: "82a7b79c-25b3-4bd5-ab40-d9654c454997"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:22 crc kubenswrapper[4702]: I1203 11:36:22.314500 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a7b79c-25b3-4bd5-ab40-d9654c454997-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:22 crc kubenswrapper[4702]: I1203 11:36:22.314537 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a7b79c-25b3-4bd5-ab40-d9654c454997-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:22 crc kubenswrapper[4702]: I1203 11:36:22.314552 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9cvz\" (UniqueName: \"kubernetes.io/projected/82a7b79c-25b3-4bd5-ab40-d9654c454997-kube-api-access-b9cvz\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:22 crc kubenswrapper[4702]: I1203 11:36:22.314562 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/82a7b79c-25b3-4bd5-ab40-d9654c454997-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:22 crc kubenswrapper[4702]: I1203 11:36:22.677030 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-6tgbz" event={"ID":"82a7b79c-25b3-4bd5-ab40-d9654c454997","Type":"ContainerDied","Data":"5dc4196c72637bc5c3182de45517accc26ce45dc7683e775abfff461662fb1a7"} Dec 03 11:36:22 crc kubenswrapper[4702]: I1203 11:36:22.677088 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dc4196c72637bc5c3182de45517accc26ce45dc7683e775abfff461662fb1a7" Dec 03 11:36:22 crc kubenswrapper[4702]: I1203 11:36:22.677092 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-6tgbz" Dec 03 11:36:22 crc kubenswrapper[4702]: I1203 11:36:22.680855 4702 generic.go:334] "Generic (PLEG): container finished" podID="90df2c2b-def7-4eda-896e-a551bfecb98c" containerID="3e3ac038ed09d37fa4d0b3db7fa93836d41613dafbec088b7fb8df5d15754c86" exitCode=0 Dec 03 11:36:22 crc kubenswrapper[4702]: I1203 11:36:22.680914 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wjztq" event={"ID":"90df2c2b-def7-4eda-896e-a551bfecb98c","Type":"ContainerDied","Data":"3e3ac038ed09d37fa4d0b3db7fa93836d41613dafbec088b7fb8df5d15754c86"} Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.454274 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wjztq" Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.615323 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90df2c2b-def7-4eda-896e-a551bfecb98c-inventory\") pod \"90df2c2b-def7-4eda-896e-a551bfecb98c\" (UID: \"90df2c2b-def7-4eda-896e-a551bfecb98c\") " Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.615435 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dfwv\" (UniqueName: \"kubernetes.io/projected/90df2c2b-def7-4eda-896e-a551bfecb98c-kube-api-access-4dfwv\") pod \"90df2c2b-def7-4eda-896e-a551bfecb98c\" (UID: \"90df2c2b-def7-4eda-896e-a551bfecb98c\") " Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.615508 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90df2c2b-def7-4eda-896e-a551bfecb98c-ssh-key\") pod \"90df2c2b-def7-4eda-896e-a551bfecb98c\" (UID: \"90df2c2b-def7-4eda-896e-a551bfecb98c\") " Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.622061 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90df2c2b-def7-4eda-896e-a551bfecb98c-kube-api-access-4dfwv" (OuterVolumeSpecName: "kube-api-access-4dfwv") pod "90df2c2b-def7-4eda-896e-a551bfecb98c" (UID: "90df2c2b-def7-4eda-896e-a551bfecb98c"). InnerVolumeSpecName "kube-api-access-4dfwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.654253 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90df2c2b-def7-4eda-896e-a551bfecb98c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "90df2c2b-def7-4eda-896e-a551bfecb98c" (UID: "90df2c2b-def7-4eda-896e-a551bfecb98c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.657740 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90df2c2b-def7-4eda-896e-a551bfecb98c-inventory" (OuterVolumeSpecName: "inventory") pod "90df2c2b-def7-4eda-896e-a551bfecb98c" (UID: "90df2c2b-def7-4eda-896e-a551bfecb98c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.707124 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wjztq" event={"ID":"90df2c2b-def7-4eda-896e-a551bfecb98c","Type":"ContainerDied","Data":"6da00662f393aff901fbaba996f120846bb81e40aaba0e8a1b663b1aad23ff16"} Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.707155 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-wjztq" Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.707166 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6da00662f393aff901fbaba996f120846bb81e40aaba0e8a1b663b1aad23ff16" Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.718455 4702 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90df2c2b-def7-4eda-896e-a551bfecb98c-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.718505 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dfwv\" (UniqueName: \"kubernetes.io/projected/90df2c2b-def7-4eda-896e-a551bfecb98c-kube-api-access-4dfwv\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.718520 4702 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90df2c2b-def7-4eda-896e-a551bfecb98c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.776866 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l"] Dec 03 11:36:24 crc kubenswrapper[4702]: E1203 11:36:24.777550 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82a7b79c-25b3-4bd5-ab40-d9654c454997" containerName="aodh-db-sync" Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.777569 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="82a7b79c-25b3-4bd5-ab40-d9654c454997" containerName="aodh-db-sync" Dec 03 11:36:24 crc kubenswrapper[4702]: E1203 11:36:24.777603 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90df2c2b-def7-4eda-896e-a551bfecb98c" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.777612 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="90df2c2b-def7-4eda-896e-a551bfecb98c" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.777875 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="82a7b79c-25b3-4bd5-ab40-d9654c454997" containerName="aodh-db-sync" Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.777900 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="90df2c2b-def7-4eda-896e-a551bfecb98c" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.778883 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l" Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.782044 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.782092 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.782649 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.782939 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zmjkp" Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.793008 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l"] Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.922711 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab0384ac-759e-45a9-99c3-39206af6a0b8-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6584l\" (UID: \"ab0384ac-759e-45a9-99c3-39206af6a0b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l" Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.922957 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0384ac-759e-45a9-99c3-39206af6a0b8-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6584l\" (UID: \"ab0384ac-759e-45a9-99c3-39206af6a0b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l" Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.923185 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab0384ac-759e-45a9-99c3-39206af6a0b8-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6584l\" (UID: \"ab0384ac-759e-45a9-99c3-39206af6a0b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l" Dec 03 11:36:24 crc kubenswrapper[4702]: I1203 11:36:24.923353 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnrkz\" (UniqueName: \"kubernetes.io/projected/ab0384ac-759e-45a9-99c3-39206af6a0b8-kube-api-access-vnrkz\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6584l\" (UID: \"ab0384ac-759e-45a9-99c3-39206af6a0b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l" Dec 03 11:36:25 crc kubenswrapper[4702]: I1203 11:36:25.026170 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0384ac-759e-45a9-99c3-39206af6a0b8-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6584l\" (UID: \"ab0384ac-759e-45a9-99c3-39206af6a0b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l" Dec 03 11:36:25 crc kubenswrapper[4702]: I1203 11:36:25.026437 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab0384ac-759e-45a9-99c3-39206af6a0b8-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-6584l\" (UID: \"ab0384ac-759e-45a9-99c3-39206af6a0b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l" Dec 03 11:36:25 crc kubenswrapper[4702]: I1203 11:36:25.026552 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnrkz\" (UniqueName: \"kubernetes.io/projected/ab0384ac-759e-45a9-99c3-39206af6a0b8-kube-api-access-vnrkz\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6584l\" (UID: \"ab0384ac-759e-45a9-99c3-39206af6a0b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l" Dec 03 11:36:25 crc kubenswrapper[4702]: I1203 11:36:25.027029 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab0384ac-759e-45a9-99c3-39206af6a0b8-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6584l\" (UID: \"ab0384ac-759e-45a9-99c3-39206af6a0b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l" Dec 03 11:36:25 crc kubenswrapper[4702]: I1203 11:36:25.031960 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0384ac-759e-45a9-99c3-39206af6a0b8-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6584l\" (UID: \"ab0384ac-759e-45a9-99c3-39206af6a0b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l" Dec 03 11:36:25 crc kubenswrapper[4702]: I1203 11:36:25.038409 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab0384ac-759e-45a9-99c3-39206af6a0b8-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6584l\" (UID: \"ab0384ac-759e-45a9-99c3-39206af6a0b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l" Dec 03 11:36:25 crc kubenswrapper[4702]: I1203 11:36:25.039334 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab0384ac-759e-45a9-99c3-39206af6a0b8-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6584l\" (UID: \"ab0384ac-759e-45a9-99c3-39206af6a0b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l" Dec 03 11:36:25 crc kubenswrapper[4702]: I1203 11:36:25.044402 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnrkz\" (UniqueName: \"kubernetes.io/projected/ab0384ac-759e-45a9-99c3-39206af6a0b8-kube-api-access-vnrkz\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6584l\" (UID: \"ab0384ac-759e-45a9-99c3-39206af6a0b8\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l" Dec 03 11:36:25 crc kubenswrapper[4702]: I1203 11:36:25.102917 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l" Dec 03 11:36:25 crc kubenswrapper[4702]: I1203 11:36:25.622895 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l"] Dec 03 11:36:25 crc kubenswrapper[4702]: I1203 11:36:25.723244 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l" event={"ID":"ab0384ac-759e-45a9-99c3-39206af6a0b8","Type":"ContainerStarted","Data":"93cf03652465e02425197432f1dfa2aa6dbdeca81b4d61a79a9fab49511fc1bd"} Dec 03 11:36:25 crc kubenswrapper[4702]: I1203 11:36:25.960552 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 03 11:36:25 crc kubenswrapper[4702]: I1203 11:36:25.961539 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="a7698d4e-71ca-4e83-995e-48f9a42c0490" containerName="aodh-api" containerID="cri-o://bd1bcaf26d2ead545e130cc9d4c95a9e7103ff967e024524dad0ef257a2c9943" gracePeriod=30 Dec 03 11:36:25 crc kubenswrapper[4702]: I1203 11:36:25.961587 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="a7698d4e-71ca-4e83-995e-48f9a42c0490" containerName="aodh-listener" containerID="cri-o://54e4e28ef74e6dca85884bc5a31be0fd325934bec4268d6feece7f5d0484c56b" gracePeriod=30 Dec 03 11:36:25 crc kubenswrapper[4702]: I1203 11:36:25.961851 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="a7698d4e-71ca-4e83-995e-48f9a42c0490" containerName="aodh-notifier" containerID="cri-o://e1e147ca5c6fac1e409b4d80454cce890cac6b5141a7f89c254b9ffabc663129" gracePeriod=30 Dec 03 11:36:25 crc kubenswrapper[4702]: I1203 11:36:25.961841 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="a7698d4e-71ca-4e83-995e-48f9a42c0490" containerName="aodh-evaluator" containerID="cri-o://c3aa67bc5e3f890c3cbddafa0097a3f9392a9adf35e512a96fe688d023d3d4fb" gracePeriod=30 Dec 03 11:36:26 crc kubenswrapper[4702]: I1203 11:36:26.798747 4702 generic.go:334] "Generic (PLEG): container finished" podID="a7698d4e-71ca-4e83-995e-48f9a42c0490" containerID="e1e147ca5c6fac1e409b4d80454cce890cac6b5141a7f89c254b9ffabc663129" exitCode=0 Dec 03 11:36:26 crc kubenswrapper[4702]: I1203 11:36:26.799057 4702 generic.go:334] "Generic (PLEG): container finished" podID="a7698d4e-71ca-4e83-995e-48f9a42c0490" containerID="c3aa67bc5e3f890c3cbddafa0097a3f9392a9adf35e512a96fe688d023d3d4fb" exitCode=0 Dec 03 11:36:26 crc kubenswrapper[4702]: I1203 11:36:26.799068 4702 generic.go:334] "Generic (PLEG): container finished" podID="a7698d4e-71ca-4e83-995e-48f9a42c0490" containerID="bd1bcaf26d2ead545e130cc9d4c95a9e7103ff967e024524dad0ef257a2c9943" exitCode=0 Dec 03 11:36:26 crc kubenswrapper[4702]: I1203 11:36:26.798815 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a7698d4e-71ca-4e83-995e-48f9a42c0490","Type":"ContainerDied","Data":"e1e147ca5c6fac1e409b4d80454cce890cac6b5141a7f89c254b9ffabc663129"} Dec 03 11:36:26 crc kubenswrapper[4702]: I1203 11:36:26.799158 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a7698d4e-71ca-4e83-995e-48f9a42c0490","Type":"ContainerDied","Data":"c3aa67bc5e3f890c3cbddafa0097a3f9392a9adf35e512a96fe688d023d3d4fb"} Dec 03 11:36:26 crc kubenswrapper[4702]: I1203 11:36:26.799176 4702 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a7698d4e-71ca-4e83-995e-48f9a42c0490","Type":"ContainerDied","Data":"bd1bcaf26d2ead545e130cc9d4c95a9e7103ff967e024524dad0ef257a2c9943"} Dec 03 11:36:26 crc kubenswrapper[4702]: I1203 11:36:26.804899 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l" event={"ID":"ab0384ac-759e-45a9-99c3-39206af6a0b8","Type":"ContainerStarted","Data":"d7a577bdbf34002e2606d4576d4cf5ffcf2835b414a8008dd1760af00d9e15dd"} Dec 03 11:36:26 crc kubenswrapper[4702]: I1203 11:36:26.837967 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l" podStartSLOduration=2.370903849 podStartE2EDuration="2.837944712s" podCreationTimestamp="2025-12-03 11:36:24 +0000 UTC" firstStartedPulling="2025-12-03 11:36:25.634404831 +0000 UTC m=+1969.470333295" lastFinishedPulling="2025-12-03 11:36:26.101445694 +0000 UTC m=+1969.937374158" observedRunningTime="2025-12-03 11:36:26.82349152 +0000 UTC m=+1970.659419984" watchObservedRunningTime="2025-12-03 11:36:26.837944712 +0000 UTC m=+1970.673873176" Dec 03 11:36:27 crc kubenswrapper[4702]: I1203 11:36:27.798531 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 03 11:36:27 crc kubenswrapper[4702]: I1203 11:36:27.847405 4702 generic.go:334] "Generic (PLEG): container finished" podID="a7698d4e-71ca-4e83-995e-48f9a42c0490" containerID="54e4e28ef74e6dca85884bc5a31be0fd325934bec4268d6feece7f5d0484c56b" exitCode=0 Dec 03 11:36:27 crc kubenswrapper[4702]: I1203 11:36:27.847492 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 03 11:36:27 crc kubenswrapper[4702]: I1203 11:36:27.847526 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a7698d4e-71ca-4e83-995e-48f9a42c0490","Type":"ContainerDied","Data":"54e4e28ef74e6dca85884bc5a31be0fd325934bec4268d6feece7f5d0484c56b"} Dec 03 11:36:27 crc kubenswrapper[4702]: I1203 11:36:27.847628 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"a7698d4e-71ca-4e83-995e-48f9a42c0490","Type":"ContainerDied","Data":"ced38c645dd8c7362f0e33c9d2777a25b8a0633f5c20ec9000fa0263fd957340"} Dec 03 11:36:27 crc kubenswrapper[4702]: I1203 11:36:27.847660 4702 scope.go:117] "RemoveContainer" containerID="54e4e28ef74e6dca85884bc5a31be0fd325934bec4268d6feece7f5d0484c56b" Dec 03 11:36:27 crc kubenswrapper[4702]: I1203 11:36:27.902993 4702 scope.go:117] "RemoveContainer" containerID="e1e147ca5c6fac1e409b4d80454cce890cac6b5141a7f89c254b9ffabc663129" Dec 03 11:36:27 crc kubenswrapper[4702]: I1203 11:36:27.910423 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-combined-ca-bundle\") pod \"a7698d4e-71ca-4e83-995e-48f9a42c0490\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " Dec 03 11:36:27 crc kubenswrapper[4702]: I1203 11:36:27.910498 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-public-tls-certs\") pod \"a7698d4e-71ca-4e83-995e-48f9a42c0490\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " Dec 03 11:36:27 crc kubenswrapper[4702]: I1203 11:36:27.910627 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-scripts\") pod \"a7698d4e-71ca-4e83-995e-48f9a42c0490\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " Dec 03 11:36:27 crc kubenswrapper[4702]: I1203 11:36:27.910695 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-config-data\") pod \"a7698d4e-71ca-4e83-995e-48f9a42c0490\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " Dec 03 11:36:27 crc kubenswrapper[4702]: I1203 11:36:27.910817 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-internal-tls-certs\") pod \"a7698d4e-71ca-4e83-995e-48f9a42c0490\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " Dec 03 11:36:27 crc kubenswrapper[4702]: I1203 11:36:27.910973 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvtzc\" (UniqueName: \"kubernetes.io/projected/a7698d4e-71ca-4e83-995e-48f9a42c0490-kube-api-access-gvtzc\") pod \"a7698d4e-71ca-4e83-995e-48f9a42c0490\" (UID: \"a7698d4e-71ca-4e83-995e-48f9a42c0490\") " Dec 03 11:36:27 crc kubenswrapper[4702]: I1203 11:36:27.929240 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-scripts" (OuterVolumeSpecName: "scripts") pod "a7698d4e-71ca-4e83-995e-48f9a42c0490" (UID: "a7698d4e-71ca-4e83-995e-48f9a42c0490"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:27 crc kubenswrapper[4702]: I1203 11:36:27.944138 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7698d4e-71ca-4e83-995e-48f9a42c0490-kube-api-access-gvtzc" (OuterVolumeSpecName: "kube-api-access-gvtzc") pod "a7698d4e-71ca-4e83-995e-48f9a42c0490" (UID: "a7698d4e-71ca-4e83-995e-48f9a42c0490"). InnerVolumeSpecName "kube-api-access-gvtzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.063161 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.063202 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvtzc\" (UniqueName: \"kubernetes.io/projected/a7698d4e-71ca-4e83-995e-48f9a42c0490-kube-api-access-gvtzc\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.083948 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a7698d4e-71ca-4e83-995e-48f9a42c0490" (UID: "a7698d4e-71ca-4e83-995e-48f9a42c0490"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.103216 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a7698d4e-71ca-4e83-995e-48f9a42c0490" (UID: "a7698d4e-71ca-4e83-995e-48f9a42c0490"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.165825 4702 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.165859 4702 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.211188 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7698d4e-71ca-4e83-995e-48f9a42c0490" (UID: "a7698d4e-71ca-4e83-995e-48f9a42c0490"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.244281 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-config-data" (OuterVolumeSpecName: "config-data") pod "a7698d4e-71ca-4e83-995e-48f9a42c0490" (UID: "a7698d4e-71ca-4e83-995e-48f9a42c0490"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.249997 4702 scope.go:117] "RemoveContainer" containerID="c3aa67bc5e3f890c3cbddafa0097a3f9392a9adf35e512a96fe688d023d3d4fb" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.269087 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.269142 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7698d4e-71ca-4e83-995e-48f9a42c0490-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.280798 4702 scope.go:117] "RemoveContainer" containerID="bd1bcaf26d2ead545e130cc9d4c95a9e7103ff967e024524dad0ef257a2c9943" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.317248 4702 scope.go:117] "RemoveContainer" containerID="54e4e28ef74e6dca85884bc5a31be0fd325934bec4268d6feece7f5d0484c56b" Dec 03 11:36:28 crc kubenswrapper[4702]: E1203 11:36:28.317744 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54e4e28ef74e6dca85884bc5a31be0fd325934bec4268d6feece7f5d0484c56b\": container with ID starting with 54e4e28ef74e6dca85884bc5a31be0fd325934bec4268d6feece7f5d0484c56b not found: ID does not exist" containerID="54e4e28ef74e6dca85884bc5a31be0fd325934bec4268d6feece7f5d0484c56b" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.317831 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54e4e28ef74e6dca85884bc5a31be0fd325934bec4268d6feece7f5d0484c56b"} err="failed to get container status \"54e4e28ef74e6dca85884bc5a31be0fd325934bec4268d6feece7f5d0484c56b\": rpc error: code = NotFound desc = could not find container \"54e4e28ef74e6dca85884bc5a31be0fd325934bec4268d6feece7f5d0484c56b\": container with ID starting with 
Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.317867 4702 scope.go:117] "RemoveContainer" containerID="e1e147ca5c6fac1e409b4d80454cce890cac6b5141a7f89c254b9ffabc663129"
Dec 03 11:36:28 crc kubenswrapper[4702]: E1203 11:36:28.320335 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1e147ca5c6fac1e409b4d80454cce890cac6b5141a7f89c254b9ffabc663129\": container with ID starting with e1e147ca5c6fac1e409b4d80454cce890cac6b5141a7f89c254b9ffabc663129 not found: ID does not exist" containerID="e1e147ca5c6fac1e409b4d80454cce890cac6b5141a7f89c254b9ffabc663129"
Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.320377 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1e147ca5c6fac1e409b4d80454cce890cac6b5141a7f89c254b9ffabc663129"} err="failed to get container status \"e1e147ca5c6fac1e409b4d80454cce890cac6b5141a7f89c254b9ffabc663129\": rpc error: code = NotFound desc = could not find container \"e1e147ca5c6fac1e409b4d80454cce890cac6b5141a7f89c254b9ffabc663129\": container with ID starting with e1e147ca5c6fac1e409b4d80454cce890cac6b5141a7f89c254b9ffabc663129 not found: ID does not exist"
Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.320408 4702 scope.go:117] "RemoveContainer" containerID="c3aa67bc5e3f890c3cbddafa0097a3f9392a9adf35e512a96fe688d023d3d4fb"
Dec 03 11:36:28 crc kubenswrapper[4702]: E1203 11:36:28.321036 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3aa67bc5e3f890c3cbddafa0097a3f9392a9adf35e512a96fe688d023d3d4fb\": container with ID starting with c3aa67bc5e3f890c3cbddafa0097a3f9392a9adf35e512a96fe688d023d3d4fb not found: ID does not exist" containerID="c3aa67bc5e3f890c3cbddafa0097a3f9392a9adf35e512a96fe688d023d3d4fb"
Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.321077 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3aa67bc5e3f890c3cbddafa0097a3f9392a9adf35e512a96fe688d023d3d4fb"} err="failed to get container status \"c3aa67bc5e3f890c3cbddafa0097a3f9392a9adf35e512a96fe688d023d3d4fb\": rpc error: code = NotFound desc = could not find container \"c3aa67bc5e3f890c3cbddafa0097a3f9392a9adf35e512a96fe688d023d3d4fb\": container with ID starting with c3aa67bc5e3f890c3cbddafa0097a3f9392a9adf35e512a96fe688d023d3d4fb not found: ID does not exist"
Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.321114 4702 scope.go:117] "RemoveContainer" containerID="bd1bcaf26d2ead545e130cc9d4c95a9e7103ff967e024524dad0ef257a2c9943"
Dec 03 11:36:28 crc kubenswrapper[4702]: E1203 11:36:28.321538 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd1bcaf26d2ead545e130cc9d4c95a9e7103ff967e024524dad0ef257a2c9943\": container with ID starting with bd1bcaf26d2ead545e130cc9d4c95a9e7103ff967e024524dad0ef257a2c9943 not found: ID does not exist" containerID="bd1bcaf26d2ead545e130cc9d4c95a9e7103ff967e024524dad0ef257a2c9943"
Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.321666 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd1bcaf26d2ead545e130cc9d4c95a9e7103ff967e024524dad0ef257a2c9943"} err="failed to get container status \"bd1bcaf26d2ead545e130cc9d4c95a9e7103ff967e024524dad0ef257a2c9943\": rpc error: code = NotFound desc = could not find container \"bd1bcaf26d2ead545e130cc9d4c95a9e7103ff967e024524dad0ef257a2c9943\": container with ID starting with bd1bcaf26d2ead545e130cc9d4c95a9e7103ff967e024524dad0ef257a2c9943 not found: ID does not exist"
Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.789598 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"]
Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.816280 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"]
Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.835844 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"]
Dec 03 11:36:28 crc kubenswrapper[4702]: E1203 11:36:28.836646 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7698d4e-71ca-4e83-995e-48f9a42c0490" containerName="aodh-notifier"
Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.836673 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7698d4e-71ca-4e83-995e-48f9a42c0490" containerName="aodh-notifier"
Dec 03 11:36:28 crc kubenswrapper[4702]: E1203 11:36:28.836688 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7698d4e-71ca-4e83-995e-48f9a42c0490" containerName="aodh-evaluator"
Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.836694 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7698d4e-71ca-4e83-995e-48f9a42c0490" containerName="aodh-evaluator"
Dec 03 11:36:28 crc kubenswrapper[4702]: E1203 11:36:28.836824 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7698d4e-71ca-4e83-995e-48f9a42c0490" containerName="aodh-api"
Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.836834 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7698d4e-71ca-4e83-995e-48f9a42c0490" containerName="aodh-api"
Dec 03 11:36:28 crc kubenswrapper[4702]: E1203 11:36:28.836867 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7698d4e-71ca-4e83-995e-48f9a42c0490" containerName="aodh-listener"
Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.836875 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7698d4e-71ca-4e83-995e-48f9a42c0490" containerName="aodh-listener"
Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.837138 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7698d4e-71ca-4e83-995e-48f9a42c0490" containerName="aodh-notifier"
Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.837162 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7698d4e-71ca-4e83-995e-48f9a42c0490" containerName="aodh-api"
Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.837176 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7698d4e-71ca-4e83-995e-48f9a42c0490" containerName="aodh-evaluator"
Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.837203 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7698d4e-71ca-4e83-995e-48f9a42c0490" containerName="aodh-listener"
Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.839838 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Need to start a new one" pod="openstack/aodh-0" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.844208 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-swrfb" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.844406 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.844793 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.844945 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.851373 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.867328 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.949791 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7698d4e-71ca-4e83-995e-48f9a42c0490" path="/var/lib/kubelet/pods/a7698d4e-71ca-4e83-995e-48f9a42c0490/volumes" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.954293 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da05b0ac-b62d-4496-bbf4-0aa969a4def4-scripts\") pod \"aodh-0\" (UID: \"da05b0ac-b62d-4496-bbf4-0aa969a4def4\") " pod="openstack/aodh-0" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.954386 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da05b0ac-b62d-4496-bbf4-0aa969a4def4-combined-ca-bundle\") pod \"aodh-0\" (UID: \"da05b0ac-b62d-4496-bbf4-0aa969a4def4\") " pod="openstack/aodh-0" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.954499 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da05b0ac-b62d-4496-bbf4-0aa969a4def4-config-data\") pod \"aodh-0\" (UID: \"da05b0ac-b62d-4496-bbf4-0aa969a4def4\") " pod="openstack/aodh-0" Dec 03 11:36:28 crc kubenswrapper[4702]: I1203 11:36:28.954552 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da05b0ac-b62d-4496-bbf4-0aa969a4def4-internal-tls-certs\") pod \"aodh-0\" (UID: \"da05b0ac-b62d-4496-bbf4-0aa969a4def4\") " pod="openstack/aodh-0" Dec 03 11:36:29 crc kubenswrapper[4702]: I1203 11:36:29.061133 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da05b0ac-b62d-4496-bbf4-0aa969a4def4-public-tls-certs\") pod \"aodh-0\" (UID: \"da05b0ac-b62d-4496-bbf4-0aa969a4def4\") " pod="openstack/aodh-0" Dec 03 11:36:29 crc kubenswrapper[4702]: I1203 11:36:29.061424 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da05b0ac-b62d-4496-bbf4-0aa969a4def4-scripts\") pod \"aodh-0\" (UID: \"da05b0ac-b62d-4496-bbf4-0aa969a4def4\") " pod="openstack/aodh-0" Dec 03 11:36:29 crc kubenswrapper[4702]: I1203 11:36:29.061509 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da05b0ac-b62d-4496-bbf4-0aa969a4def4-combined-ca-bundle\") pod \"aodh-0\" (UID: \"da05b0ac-b62d-4496-bbf4-0aa969a4def4\") " pod="openstack/aodh-0" Dec 03 11:36:29 crc kubenswrapper[4702]: I1203 11:36:29.061641 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da05b0ac-b62d-4496-bbf4-0aa969a4def4-config-data\") pod \"aodh-0\" (UID: \"da05b0ac-b62d-4496-bbf4-0aa969a4def4\") " pod="openstack/aodh-0" Dec 03 11:36:29 crc kubenswrapper[4702]: I1203 11:36:29.061683 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da05b0ac-b62d-4496-bbf4-0aa969a4def4-internal-tls-certs\") pod \"aodh-0\" (UID: \"da05b0ac-b62d-4496-bbf4-0aa969a4def4\") " pod="openstack/aodh-0" Dec 03 11:36:29 crc kubenswrapper[4702]: I1203 11:36:29.061785 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s878\" (UniqueName: \"kubernetes.io/projected/da05b0ac-b62d-4496-bbf4-0aa969a4def4-kube-api-access-5s878\") pod \"aodh-0\" (UID: \"da05b0ac-b62d-4496-bbf4-0aa969a4def4\") " pod="openstack/aodh-0" Dec 03 11:36:29 crc kubenswrapper[4702]: I1203 11:36:29.108513 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/da05b0ac-b62d-4496-bbf4-0aa969a4def4-internal-tls-certs\") pod \"aodh-0\" (UID: \"da05b0ac-b62d-4496-bbf4-0aa969a4def4\") " pod="openstack/aodh-0" Dec 03 11:36:29 crc kubenswrapper[4702]: I1203 11:36:29.116514 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da05b0ac-b62d-4496-bbf4-0aa969a4def4-config-data\") pod \"aodh-0\" (UID: \"da05b0ac-b62d-4496-bbf4-0aa969a4def4\") " pod="openstack/aodh-0" Dec 03 11:36:29 crc kubenswrapper[4702]: I1203 11:36:29.117965 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da05b0ac-b62d-4496-bbf4-0aa969a4def4-scripts\") pod \"aodh-0\" (UID: \"da05b0ac-b62d-4496-bbf4-0aa969a4def4\") " pod="openstack/aodh-0" Dec 03 11:36:29 crc kubenswrapper[4702]: I1203 11:36:29.144893 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da05b0ac-b62d-4496-bbf4-0aa969a4def4-combined-ca-bundle\") pod \"aodh-0\" (UID: \"da05b0ac-b62d-4496-bbf4-0aa969a4def4\") " pod="openstack/aodh-0" Dec 03 11:36:29 crc kubenswrapper[4702]: I1203 11:36:29.165823 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s878\" (UniqueName: \"kubernetes.io/projected/da05b0ac-b62d-4496-bbf4-0aa969a4def4-kube-api-access-5s878\") pod \"aodh-0\" (UID: \"da05b0ac-b62d-4496-bbf4-0aa969a4def4\") " pod="openstack/aodh-0" Dec 03 11:36:29 crc kubenswrapper[4702]: I1203 11:36:29.165929 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/da05b0ac-b62d-4496-bbf4-0aa969a4def4-public-tls-certs\") pod \"aodh-0\" (UID: \"da05b0ac-b62d-4496-bbf4-0aa969a4def4\") " pod="openstack/aodh-0" Dec 03 11:36:29 crc kubenswrapper[4702]: I1203 11:36:29.195677 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/da05b0ac-b62d-4496-bbf4-0aa969a4def4-public-tls-certs\") pod \"aodh-0\" (UID: \"da05b0ac-b62d-4496-bbf4-0aa969a4def4\") " pod="openstack/aodh-0" Dec 03 11:36:29 crc kubenswrapper[4702]: I1203 11:36:29.235544 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s878\" (UniqueName: \"kubernetes.io/projected/da05b0ac-b62d-4496-bbf4-0aa969a4def4-kube-api-access-5s878\") pod \"aodh-0\" (UID: \"da05b0ac-b62d-4496-bbf4-0aa969a4def4\") " pod="openstack/aodh-0" Dec 03 11:36:29 crc kubenswrapper[4702]: I1203 11:36:29.459610 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 03 11:36:30 crc kubenswrapper[4702]: I1203 11:36:30.353276 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 03 11:36:30 crc kubenswrapper[4702]: I1203 11:36:30.929058 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:36:30 crc kubenswrapper[4702]: E1203 11:36:30.929661 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:36:30 crc kubenswrapper[4702]: I1203 11:36:30.958982 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"da05b0ac-b62d-4496-bbf4-0aa969a4def4","Type":"ContainerStarted","Data":"176bb674294072c7208111d18db9204994c0db044e96f381b84efae6c4af2b5f"} Dec 03 11:36:32 crc kubenswrapper[4702]: I1203 11:36:32.963721 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"da05b0ac-b62d-4496-bbf4-0aa969a4def4","Type":"ContainerStarted","Data":"72a9abd9bffc167e022f4d77f41b869b9c849daa05e45140e74742814df738bd"} Dec 03 11:36:35 crc kubenswrapper[4702]: I1203 11:36:35.005315 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"da05b0ac-b62d-4496-bbf4-0aa969a4def4","Type":"ContainerStarted","Data":"053fdae892669557027bac10f33dc9376ee163966827a2c95f529941bdad9bb6"} Dec 03 11:36:36 crc kubenswrapper[4702]: I1203 11:36:36.028081 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"da05b0ac-b62d-4496-bbf4-0aa969a4def4","Type":"ContainerStarted","Data":"51e1d901a5597e2a8c1f14cd16ff3bae4fe117d13fc7b9285620c0842ea93e5d"} Dec 03 11:36:38 crc kubenswrapper[4702]: I1203 11:36:38.129455 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"da05b0ac-b62d-4496-bbf4-0aa969a4def4","Type":"ContainerStarted","Data":"263c5619e3a152cd679ea7d7c0144b1c621abc34c1d10878686e60927a3d7539"} Dec 03 11:36:38 crc kubenswrapper[4702]: I1203 11:36:38.153786 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=3.618037063 podStartE2EDuration="10.153746557s" podCreationTimestamp="2025-12-03 11:36:28 +0000 UTC" firstStartedPulling="2025-12-03 11:36:30.346945608 +0000 UTC m=+1974.182874082" lastFinishedPulling="2025-12-03 11:36:36.882655122 +0000 UTC m=+1980.718583576" observedRunningTime="2025-12-03 11:36:38.147678274 +0000 UTC m=+1981.983606758" watchObservedRunningTime="2025-12-03 11:36:38.153746557 +0000 UTC 
m=+1981.989675021" Dec 03 11:36:40 crc kubenswrapper[4702]: I1203 11:36:40.705696 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sg4ct"] Dec 03 11:36:40 crc kubenswrapper[4702]: I1203 11:36:40.709157 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sg4ct" Dec 03 11:36:40 crc kubenswrapper[4702]: I1203 11:36:40.717352 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sg4ct"] Dec 03 11:36:40 crc kubenswrapper[4702]: I1203 11:36:40.886072 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62fab233-efc1-4a34-a5ad-f833ff9ceec1-utilities\") pod \"redhat-operators-sg4ct\" (UID: \"62fab233-efc1-4a34-a5ad-f833ff9ceec1\") " pod="openshift-marketplace/redhat-operators-sg4ct" Dec 03 11:36:40 crc kubenswrapper[4702]: I1203 11:36:40.886949 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbzng\" (UniqueName: \"kubernetes.io/projected/62fab233-efc1-4a34-a5ad-f833ff9ceec1-kube-api-access-jbzng\") pod \"redhat-operators-sg4ct\" (UID: \"62fab233-efc1-4a34-a5ad-f833ff9ceec1\") " pod="openshift-marketplace/redhat-operators-sg4ct" Dec 03 11:36:40 crc kubenswrapper[4702]: I1203 11:36:40.887123 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62fab233-efc1-4a34-a5ad-f833ff9ceec1-catalog-content\") pod \"redhat-operators-sg4ct\" (UID: \"62fab233-efc1-4a34-a5ad-f833ff9ceec1\") " pod="openshift-marketplace/redhat-operators-sg4ct" Dec 03 11:36:40 crc kubenswrapper[4702]: I1203 11:36:40.989116 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbzng\" (UniqueName: \"kubernetes.io/projected/62fab233-efc1-4a34-a5ad-f833ff9ceec1-kube-api-access-jbzng\") pod \"redhat-operators-sg4ct\" (UID: \"62fab233-efc1-4a34-a5ad-f833ff9ceec1\") " pod="openshift-marketplace/redhat-operators-sg4ct" Dec 03 11:36:40 crc kubenswrapper[4702]: I1203 11:36:40.989192 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62fab233-efc1-4a34-a5ad-f833ff9ceec1-catalog-content\") pod \"redhat-operators-sg4ct\" (UID: \"62fab233-efc1-4a34-a5ad-f833ff9ceec1\") " pod="openshift-marketplace/redhat-operators-sg4ct" Dec 03 11:36:40 crc kubenswrapper[4702]: I1203 11:36:40.989345 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62fab233-efc1-4a34-a5ad-f833ff9ceec1-utilities\") pod \"redhat-operators-sg4ct\" (UID: \"62fab233-efc1-4a34-a5ad-f833ff9ceec1\") " pod="openshift-marketplace/redhat-operators-sg4ct" Dec 03 11:36:40 crc kubenswrapper[4702]: I1203 11:36:40.990010 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62fab233-efc1-4a34-a5ad-f833ff9ceec1-utilities\") pod \"redhat-operators-sg4ct\" (UID: \"62fab233-efc1-4a34-a5ad-f833ff9ceec1\") " pod="openshift-marketplace/redhat-operators-sg4ct" Dec 03 11:36:40 crc kubenswrapper[4702]: I1203 11:36:40.990348 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/62fab233-efc1-4a34-a5ad-f833ff9ceec1-catalog-content\") pod \"redhat-operators-sg4ct\" (UID: \"62fab233-efc1-4a34-a5ad-f833ff9ceec1\") " pod="openshift-marketplace/redhat-operators-sg4ct" Dec 03 11:36:41 crc kubenswrapper[4702]: I1203 11:36:41.017690 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbzng\" (UniqueName: \"kubernetes.io/projected/62fab233-efc1-4a34-a5ad-f833ff9ceec1-kube-api-access-jbzng\") pod \"redhat-operators-sg4ct\" (UID: \"62fab233-efc1-4a34-a5ad-f833ff9ceec1\") " pod="openshift-marketplace/redhat-operators-sg4ct" Dec 03 11:36:41 crc kubenswrapper[4702]: I1203 11:36:41.045903 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sg4ct" Dec 03 11:36:41 crc kubenswrapper[4702]: I1203 11:36:41.613022 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sg4ct"] Dec 03 11:36:41 crc kubenswrapper[4702]: W1203 11:36:41.615067 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62fab233_efc1_4a34_a5ad_f833ff9ceec1.slice/crio-480de75b92e1218f4d4dc8639ddcfe6b79ac6bb99555996e8cd073cf602110a6 WatchSource:0}: Error finding container 480de75b92e1218f4d4dc8639ddcfe6b79ac6bb99555996e8cd073cf602110a6: Status 404 returned error can't find the container with id 480de75b92e1218f4d4dc8639ddcfe6b79ac6bb99555996e8cd073cf602110a6 Dec 03 11:36:42 crc kubenswrapper[4702]: I1203 11:36:42.291788 4702 generic.go:334] "Generic (PLEG): container finished" podID="62fab233-efc1-4a34-a5ad-f833ff9ceec1" containerID="f21ef6fc21b164b601bb507b8658a11b9c5181a5e129e60d10db684a0002a7ac" exitCode=0 Dec 03 11:36:42 crc kubenswrapper[4702]: I1203 11:36:42.291912 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sg4ct" event={"ID":"62fab233-efc1-4a34-a5ad-f833ff9ceec1","Type":"ContainerDied","Data":"f21ef6fc21b164b601bb507b8658a11b9c5181a5e129e60d10db684a0002a7ac"} Dec 03 11:36:42 crc kubenswrapper[4702]: I1203 11:36:42.292176 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sg4ct" event={"ID":"62fab233-efc1-4a34-a5ad-f833ff9ceec1","Type":"ContainerStarted","Data":"480de75b92e1218f4d4dc8639ddcfe6b79ac6bb99555996e8cd073cf602110a6"} Dec 03 11:36:44 crc kubenswrapper[4702]: I1203 11:36:44.560607 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sg4ct" event={"ID":"62fab233-efc1-4a34-a5ad-f833ff9ceec1","Type":"ContainerStarted","Data":"f0478a9901a3c558df93bdb6dc31f9c5e22dcba77b1388aff94081b450346724"} Dec 03 11:36:45 crc kubenswrapper[4702]: I1203 11:36:45.929357 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:36:45 crc kubenswrapper[4702]: E1203 11:36:45.929800 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:36:52 crc kubenswrapper[4702]: I1203 11:36:52.763064 4702 generic.go:334] "Generic (PLEG): container finished" 
podID="62fab233-efc1-4a34-a5ad-f833ff9ceec1" containerID="f0478a9901a3c558df93bdb6dc31f9c5e22dcba77b1388aff94081b450346724" exitCode=0 Dec 03 11:36:52 crc kubenswrapper[4702]: I1203 11:36:52.763145 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sg4ct" event={"ID":"62fab233-efc1-4a34-a5ad-f833ff9ceec1","Type":"ContainerDied","Data":"f0478a9901a3c558df93bdb6dc31f9c5e22dcba77b1388aff94081b450346724"} Dec 03 11:36:53 crc kubenswrapper[4702]: I1203 11:36:53.780552 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sg4ct" event={"ID":"62fab233-efc1-4a34-a5ad-f833ff9ceec1","Type":"ContainerStarted","Data":"47cb4378363b223e629bcec21549d7cf010a5a3e33f4645fb4d8f3b4807782b8"} Dec 03 11:36:53 crc kubenswrapper[4702]: I1203 11:36:53.809704 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sg4ct" podStartSLOduration=2.910490461 podStartE2EDuration="13.80967749s" podCreationTimestamp="2025-12-03 11:36:40 +0000 UTC" firstStartedPulling="2025-12-03 11:36:42.294525457 +0000 UTC m=+1986.130453931" lastFinishedPulling="2025-12-03 11:36:53.193712496 +0000 UTC m=+1997.029640960" observedRunningTime="2025-12-03 11:36:53.807564149 +0000 UTC m=+1997.643492613" watchObservedRunningTime="2025-12-03 11:36:53.80967749 +0000 UTC m=+1997.645605954" Dec 03 11:36:56 crc kubenswrapper[4702]: I1203 11:36:56.946141 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:36:56 crc kubenswrapper[4702]: E1203 11:36:56.947117 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:37:00 crc kubenswrapper[4702]: I1203 11:37:00.382728 4702 scope.go:117] "RemoveContainer" containerID="e8d57bec2cdcf4deaf9da6d8b84928fdde30e921d3658e1bfe81705e6a8659b4" Dec 03 11:37:01 crc kubenswrapper[4702]: I1203 11:37:01.046180 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sg4ct" Dec 03 11:37:01 crc kubenswrapper[4702]: I1203 11:37:01.046497 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sg4ct" Dec 03 11:37:02 crc kubenswrapper[4702]: I1203 11:37:02.100092 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-sg4ct" podUID="62fab233-efc1-4a34-a5ad-f833ff9ceec1" containerName="registry-server" probeResult="failure" output=< Dec 03 11:37:02 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 11:37:02 crc kubenswrapper[4702]: > Dec 03 11:37:07 crc kubenswrapper[4702]: I1203 11:37:07.939620 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:37:07 crc kubenswrapper[4702]: E1203 11:37:07.942901 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:37:11 crc kubenswrapper[4702]: I1203 11:37:11.105932 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sg4ct" Dec 03 11:37:11 crc kubenswrapper[4702]: I1203 11:37:11.164893 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sg4ct" Dec 03 11:37:11 crc kubenswrapper[4702]: I1203 11:37:11.905712 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sg4ct"] Dec 03 11:37:13 crc kubenswrapper[4702]: I1203 11:37:13.041062 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sg4ct" podUID="62fab233-efc1-4a34-a5ad-f833ff9ceec1" containerName="registry-server" containerID="cri-o://47cb4378363b223e629bcec21549d7cf010a5a3e33f4645fb4d8f3b4807782b8" gracePeriod=2 Dec 03 11:37:13 crc kubenswrapper[4702]: I1203 11:37:13.585596 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sg4ct" Dec 03 11:37:13 crc kubenswrapper[4702]: I1203 11:37:13.754044 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62fab233-efc1-4a34-a5ad-f833ff9ceec1-utilities\") pod \"62fab233-efc1-4a34-a5ad-f833ff9ceec1\" (UID: \"62fab233-efc1-4a34-a5ad-f833ff9ceec1\") " Dec 03 11:37:13 crc kubenswrapper[4702]: I1203 11:37:13.754119 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62fab233-efc1-4a34-a5ad-f833ff9ceec1-catalog-content\") pod \"62fab233-efc1-4a34-a5ad-f833ff9ceec1\" (UID: \"62fab233-efc1-4a34-a5ad-f833ff9ceec1\") " Dec 03 11:37:13 crc kubenswrapper[4702]: I1203 11:37:13.754257 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbzng\" (UniqueName: \"kubernetes.io/projected/62fab233-efc1-4a34-a5ad-f833ff9ceec1-kube-api-access-jbzng\") pod \"62fab233-efc1-4a34-a5ad-f833ff9ceec1\" (UID: \"62fab233-efc1-4a34-a5ad-f833ff9ceec1\") " Dec 03 11:37:13 crc kubenswrapper[4702]: I1203 11:37:13.755362 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62fab233-efc1-4a34-a5ad-f833ff9ceec1-utilities" (OuterVolumeSpecName: "utilities") pod "62fab233-efc1-4a34-a5ad-f833ff9ceec1" (UID: "62fab233-efc1-4a34-a5ad-f833ff9ceec1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:37:13 crc kubenswrapper[4702]: I1203 11:37:13.762158 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62fab233-efc1-4a34-a5ad-f833ff9ceec1-kube-api-access-jbzng" (OuterVolumeSpecName: "kube-api-access-jbzng") pod "62fab233-efc1-4a34-a5ad-f833ff9ceec1" (UID: "62fab233-efc1-4a34-a5ad-f833ff9ceec1"). InnerVolumeSpecName "kube-api-access-jbzng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:37:13 crc kubenswrapper[4702]: I1203 11:37:13.857430 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62fab233-efc1-4a34-a5ad-f833ff9ceec1-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:37:13 crc kubenswrapper[4702]: I1203 11:37:13.857498 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbzng\" (UniqueName: \"kubernetes.io/projected/62fab233-efc1-4a34-a5ad-f833ff9ceec1-kube-api-access-jbzng\") on node \"crc\" DevicePath \"\"" Dec 03 11:37:13 crc kubenswrapper[4702]: I1203 11:37:13.908904 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62fab233-efc1-4a34-a5ad-f833ff9ceec1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "62fab233-efc1-4a34-a5ad-f833ff9ceec1" (UID: "62fab233-efc1-4a34-a5ad-f833ff9ceec1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:37:13 crc kubenswrapper[4702]: I1203 11:37:13.960487 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62fab233-efc1-4a34-a5ad-f833ff9ceec1-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:37:14 crc kubenswrapper[4702]: I1203 11:37:14.054997 4702 generic.go:334] "Generic (PLEG): container finished" podID="62fab233-efc1-4a34-a5ad-f833ff9ceec1" containerID="47cb4378363b223e629bcec21549d7cf010a5a3e33f4645fb4d8f3b4807782b8" exitCode=0 Dec 03 11:37:14 crc kubenswrapper[4702]: I1203 11:37:14.055059 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sg4ct" Dec 03 11:37:14 crc kubenswrapper[4702]: I1203 11:37:14.055056 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sg4ct" event={"ID":"62fab233-efc1-4a34-a5ad-f833ff9ceec1","Type":"ContainerDied","Data":"47cb4378363b223e629bcec21549d7cf010a5a3e33f4645fb4d8f3b4807782b8"} Dec 03 11:37:14 crc kubenswrapper[4702]: I1203 11:37:14.055113 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sg4ct" event={"ID":"62fab233-efc1-4a34-a5ad-f833ff9ceec1","Type":"ContainerDied","Data":"480de75b92e1218f4d4dc8639ddcfe6b79ac6bb99555996e8cd073cf602110a6"} Dec 03 11:37:14 crc kubenswrapper[4702]: I1203 11:37:14.055146 4702 scope.go:117] "RemoveContainer" containerID="47cb4378363b223e629bcec21549d7cf010a5a3e33f4645fb4d8f3b4807782b8" Dec 03 11:37:14 crc kubenswrapper[4702]: I1203 11:37:14.083128 4702 scope.go:117] "RemoveContainer" containerID="f0478a9901a3c558df93bdb6dc31f9c5e22dcba77b1388aff94081b450346724" Dec 03 11:37:14 crc kubenswrapper[4702]: I1203 11:37:14.097249 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sg4ct"] Dec 03 11:37:14 crc kubenswrapper[4702]: I1203 11:37:14.109104 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sg4ct"] Dec 03 11:37:14 crc kubenswrapper[4702]: I1203 11:37:14.143899 4702 scope.go:117] "RemoveContainer" containerID="f21ef6fc21b164b601bb507b8658a11b9c5181a5e129e60d10db684a0002a7ac" Dec 03 11:37:14 crc kubenswrapper[4702]: I1203 11:37:14.180753 4702 scope.go:117] "RemoveContainer" containerID="47cb4378363b223e629bcec21549d7cf010a5a3e33f4645fb4d8f3b4807782b8" Dec 03 11:37:14 crc kubenswrapper[4702]: E1203 11:37:14.181664 4702 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47cb4378363b223e629bcec21549d7cf010a5a3e33f4645fb4d8f3b4807782b8\": container with ID starting with 47cb4378363b223e629bcec21549d7cf010a5a3e33f4645fb4d8f3b4807782b8 not found: ID does not exist" containerID="47cb4378363b223e629bcec21549d7cf010a5a3e33f4645fb4d8f3b4807782b8" Dec 03 11:37:14 crc kubenswrapper[4702]: I1203 11:37:14.181748 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47cb4378363b223e629bcec21549d7cf010a5a3e33f4645fb4d8f3b4807782b8"} err="failed to get container status \"47cb4378363b223e629bcec21549d7cf010a5a3e33f4645fb4d8f3b4807782b8\": rpc error: code = NotFound desc = could not find container \"47cb4378363b223e629bcec21549d7cf010a5a3e33f4645fb4d8f3b4807782b8\": container with ID starting with 47cb4378363b223e629bcec21549d7cf010a5a3e33f4645fb4d8f3b4807782b8 not found: ID does not exist" Dec 03 11:37:14 crc kubenswrapper[4702]: I1203 11:37:14.182024 4702 scope.go:117] "RemoveContainer" containerID="f0478a9901a3c558df93bdb6dc31f9c5e22dcba77b1388aff94081b450346724" Dec 03 11:37:14 crc kubenswrapper[4702]: E1203 11:37:14.182462 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0478a9901a3c558df93bdb6dc31f9c5e22dcba77b1388aff94081b450346724\": container with ID starting with f0478a9901a3c558df93bdb6dc31f9c5e22dcba77b1388aff94081b450346724 not found: ID does not exist" containerID="f0478a9901a3c558df93bdb6dc31f9c5e22dcba77b1388aff94081b450346724" Dec 03 11:37:14 crc kubenswrapper[4702]: I1203 11:37:14.182588 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0478a9901a3c558df93bdb6dc31f9c5e22dcba77b1388aff94081b450346724"} err="failed to get container status \"f0478a9901a3c558df93bdb6dc31f9c5e22dcba77b1388aff94081b450346724\": rpc error: code = NotFound desc = could not find container \"f0478a9901a3c558df93bdb6dc31f9c5e22dcba77b1388aff94081b450346724\": container with ID starting with f0478a9901a3c558df93bdb6dc31f9c5e22dcba77b1388aff94081b450346724 not found: ID does not exist" Dec 03 11:37:14 crc kubenswrapper[4702]: I1203 11:37:14.182689 4702 scope.go:117] "RemoveContainer" containerID="f21ef6fc21b164b601bb507b8658a11b9c5181a5e129e60d10db684a0002a7ac" Dec 03 11:37:14 crc kubenswrapper[4702]: E1203 11:37:14.184130 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f21ef6fc21b164b601bb507b8658a11b9c5181a5e129e60d10db684a0002a7ac\": container with ID starting with f21ef6fc21b164b601bb507b8658a11b9c5181a5e129e60d10db684a0002a7ac not found: ID does not exist" containerID="f21ef6fc21b164b601bb507b8658a11b9c5181a5e129e60d10db684a0002a7ac" Dec 03 11:37:14 crc kubenswrapper[4702]: I1203 11:37:14.184256 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f21ef6fc21b164b601bb507b8658a11b9c5181a5e129e60d10db684a0002a7ac"} err="failed to get container status \"f21ef6fc21b164b601bb507b8658a11b9c5181a5e129e60d10db684a0002a7ac\": rpc error: code = NotFound desc = could not find container \"f21ef6fc21b164b601bb507b8658a11b9c5181a5e129e60d10db684a0002a7ac\": container with ID starting with f21ef6fc21b164b601bb507b8658a11b9c5181a5e129e60d10db684a0002a7ac not found: ID does not exist" Dec 03 11:37:14 crc kubenswrapper[4702]: I1203 11:37:14.943856 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="62fab233-efc1-4a34-a5ad-f833ff9ceec1" path="/var/lib/kubelet/pods/62fab233-efc1-4a34-a5ad-f833ff9ceec1/volumes" Dec 03 11:37:20 crc kubenswrapper[4702]: I1203 11:37:20.929257 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:37:20 crc kubenswrapper[4702]: E1203 11:37:20.930172 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:37:35 crc kubenswrapper[4702]: I1203 11:37:35.929588 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:37:37 crc kubenswrapper[4702]: I1203 11:37:37.365163 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"39905d80302a00685e55c3f28d423b218acb74c324103c6641753cc0d7704b72"} Dec 03 11:38:00 crc kubenswrapper[4702]: I1203 11:38:00.727512 4702 scope.go:117] "RemoveContainer" containerID="5a726c16dfdd16020fe752214fa8a86d2405848c57001073dc67ff2799b8ffb9" Dec 03 11:38:00 crc kubenswrapper[4702]: I1203 11:38:00.772071 4702 scope.go:117] "RemoveContainer" containerID="beb14e6e68c81110a37a5577e4ccedc4cc03bc33e7b8ba69d46f559f82443abe" Dec 03 11:38:00 crc kubenswrapper[4702]: I1203 11:38:00.828398 4702 scope.go:117] "RemoveContainer" containerID="ed8192353a8e24d51a5100f448a7dc9fafc56909816663b856e0e043d99d18bd" Dec 03 11:38:12 crc kubenswrapper[4702]: I1203 11:38:12.640922 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="92266ac3-f0a6-4e68-9e88-9aa2900e1fe3" containerName="galera" probeResult="failure" output="command timed out" Dec 03 11:38:12 crc kubenswrapper[4702]: I1203 11:38:12.644608 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="92266ac3-f0a6-4e68-9e88-9aa2900e1fe3" containerName="galera" probeResult="failure" output="command timed out" Dec 03 11:38:14 crc kubenswrapper[4702]: I1203 11:38:14.091154 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-f62wv"] Dec 03 11:38:14 crc kubenswrapper[4702]: I1203 11:38:14.103371 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-f62wv"] Dec 03 11:38:14 crc kubenswrapper[4702]: I1203 11:38:14.954111 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e688d41-fb86-42a2-ae40-57d585b44357" path="/var/lib/kubelet/pods/9e688d41-fb86-42a2-ae40-57d585b44357/volumes" Dec 03 11:38:15 crc kubenswrapper[4702]: I1203 11:38:15.481493 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-fm24s"] Dec 03 11:38:15 crc kubenswrapper[4702]: I1203 11:38:15.504239 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-e7ea-account-create-update-mb6qb"] Dec 03 11:38:15 crc kubenswrapper[4702]: I1203 11:38:15.519140 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-k57x9"] Dec 03 11:38:15 crc kubenswrapper[4702]: I1203 11:38:15.536997 4702 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-c7c0-account-create-update-qf8t8"] Dec 03 11:38:15 crc kubenswrapper[4702]: I1203 11:38:15.551228 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-e7ea-account-create-update-mb6qb"] Dec 03 11:38:15 crc kubenswrapper[4702]: I1203 11:38:15.562749 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-k57x9"] Dec 03 11:38:15 crc kubenswrapper[4702]: I1203 11:38:15.584378 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-fm24s"] Dec 03 11:38:15 crc kubenswrapper[4702]: I1203 11:38:15.601970 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-c7c0-account-create-update-qf8t8"] Dec 03 11:38:15 crc kubenswrapper[4702]: I1203 11:38:15.615254 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-kg7cx"] Dec 03 11:38:15 crc kubenswrapper[4702]: I1203 11:38:15.628516 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6217-account-create-update-tst4w"] Dec 03 11:38:15 crc kubenswrapper[4702]: I1203 11:38:15.644870 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d28a-account-create-update-25ttw"] Dec 03 11:38:15 crc kubenswrapper[4702]: I1203 11:38:15.657550 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-kg7cx"] Dec 03 11:38:15 crc kubenswrapper[4702]: I1203 11:38:15.671204 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6217-account-create-update-tst4w"] Dec 03 11:38:15 crc kubenswrapper[4702]: I1203 11:38:15.692326 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d28a-account-create-update-25ttw"] Dec 03 11:38:16 crc kubenswrapper[4702]: I1203 11:38:16.946862 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04333214-a420-40c6-bcd4-0d50544955ec" path="/var/lib/kubelet/pods/04333214-a420-40c6-bcd4-0d50544955ec/volumes" Dec 03 11:38:16 crc kubenswrapper[4702]: I1203 11:38:16.949440 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d" path="/var/lib/kubelet/pods/1e7f42b9-f1fb-49b2-a5f9-ffab3fc3ae3d/volumes" Dec 03 11:38:16 crc kubenswrapper[4702]: I1203 11:38:16.951308 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4302e064-fe40-4f25-aeb5-44b7e6449131" path="/var/lib/kubelet/pods/4302e064-fe40-4f25-aeb5-44b7e6449131/volumes" Dec 03 11:38:16 crc kubenswrapper[4702]: I1203 11:38:16.952540 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cdc33be-f901-4659-a687-a547acd212c0" path="/var/lib/kubelet/pods/6cdc33be-f901-4659-a687-a547acd212c0/volumes" Dec 03 11:38:16 crc kubenswrapper[4702]: I1203 11:38:16.954244 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c1760e7-8de5-48fc-af90-9e8dedf53a3c" path="/var/lib/kubelet/pods/8c1760e7-8de5-48fc-af90-9e8dedf53a3c/volumes" Dec 03 11:38:16 crc kubenswrapper[4702]: I1203 11:38:16.955190 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="910db445-4971-456d-8d38-099ce65627ba" path="/var/lib/kubelet/pods/910db445-4971-456d-8d38-099ce65627ba/volumes" Dec 03 11:38:16 crc kubenswrapper[4702]: I1203 11:38:16.956413 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7d72b70-ab4b-44db-a489-5daf18efbf68" 
path="/var/lib/kubelet/pods/b7d72b70-ab4b-44db-a489-5daf18efbf68/volumes" Dec 03 11:38:29 crc kubenswrapper[4702]: I1203 11:38:29.063293 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-gfmp5"] Dec 03 11:38:29 crc kubenswrapper[4702]: I1203 11:38:29.080314 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-gfmp5"] Dec 03 11:38:30 crc kubenswrapper[4702]: I1203 11:38:30.941215 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17e712d5-563a-475e-9811-f8e005cee912" path="/var/lib/kubelet/pods/17e712d5-563a-475e-9811-f8e005cee912/volumes" Dec 03 11:38:42 crc kubenswrapper[4702]: I1203 11:38:42.038531 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-b754-account-create-update-k7lzh"] Dec 03 11:38:42 crc kubenswrapper[4702]: I1203 11:38:42.056727 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-b754-account-create-update-k7lzh"] Dec 03 11:38:42 crc kubenswrapper[4702]: I1203 11:38:42.941916 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b687a4-b074-4490-bb5a-c218ad3f6c9c" path="/var/lib/kubelet/pods/e3b687a4-b074-4490-bb5a-c218ad3f6c9c/volumes" Dec 03 11:38:47 crc kubenswrapper[4702]: I1203 11:38:47.037682 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-nccll"] Dec 03 11:38:47 crc kubenswrapper[4702]: I1203 11:38:47.050517 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-cae4-account-create-update-l5sfj"] Dec 03 11:38:47 crc kubenswrapper[4702]: I1203 11:38:47.062593 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-5cea-account-create-update-mjzq9"] Dec 03 11:38:47 crc kubenswrapper[4702]: I1203 11:38:47.073906 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-nccll"] Dec 03 11:38:47 crc kubenswrapper[4702]: I1203 11:38:47.097092 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-5cea-account-create-update-mjzq9"] Dec 03 11:38:47 crc kubenswrapper[4702]: I1203 11:38:47.124734 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-cae4-account-create-update-l5sfj"] Dec 03 11:38:48 crc kubenswrapper[4702]: I1203 11:38:48.249449 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-fv4ks"] Dec 03 11:38:48 crc kubenswrapper[4702]: I1203 11:38:48.305681 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-fv4ks"] Dec 03 11:38:48 crc kubenswrapper[4702]: I1203 11:38:48.317410 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-nd8sv"] Dec 03 11:38:48 crc kubenswrapper[4702]: I1203 11:38:48.330070 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-pdhc7"] Dec 03 11:38:48 crc kubenswrapper[4702]: I1203 11:38:48.343846 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-nd8sv"] Dec 03 11:38:48 crc kubenswrapper[4702]: I1203 11:38:48.357583 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-pdhc7"] Dec 03 11:38:48 crc kubenswrapper[4702]: I1203 11:38:48.945788 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="400f59d1-8b01-45e9-84b0-173a5fa761d5" path="/var/lib/kubelet/pods/400f59d1-8b01-45e9-84b0-173a5fa761d5/volumes" Dec 03 11:38:48 crc 
kubenswrapper[4702]: I1203 11:38:48.947688 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d7de17e-a277-4ec1-b0ca-d45be9fd054c" path="/var/lib/kubelet/pods/5d7de17e-a277-4ec1-b0ca-d45be9fd054c/volumes" Dec 03 11:38:48 crc kubenswrapper[4702]: I1203 11:38:48.948748 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ccf575d-baf1-476f-bcba-41c45119c970" path="/var/lib/kubelet/pods/7ccf575d-baf1-476f-bcba-41c45119c970/volumes" Dec 03 11:38:48 crc kubenswrapper[4702]: I1203 11:38:48.949746 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae1887ec-19fc-43a3-ab93-481f83e4a190" path="/var/lib/kubelet/pods/ae1887ec-19fc-43a3-ab93-481f83e4a190/volumes" Dec 03 11:38:48 crc kubenswrapper[4702]: I1203 11:38:48.951736 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e67ff48c-39fd-461b-8858-56bd0150cb46" path="/var/lib/kubelet/pods/e67ff48c-39fd-461b-8858-56bd0150cb46/volumes" Dec 03 11:38:48 crc kubenswrapper[4702]: I1203 11:38:48.952788 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb0e1ee0-9045-4d23-85fb-1a79b18242c4" path="/var/lib/kubelet/pods/eb0e1ee0-9045-4d23-85fb-1a79b18242c4/volumes" Dec 03 11:38:49 crc kubenswrapper[4702]: I1203 11:38:49.049904 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-890e-account-create-update-krpq7"] Dec 03 11:38:49 crc kubenswrapper[4702]: I1203 11:38:49.065604 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-890e-account-create-update-krpq7"] Dec 03 11:38:50 crc kubenswrapper[4702]: I1203 11:38:50.059677 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-3858-account-create-update-58hz9"] Dec 03 11:38:50 crc kubenswrapper[4702]: I1203 11:38:50.098445 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-3858-account-create-update-58hz9"] Dec 03 11:38:50 crc kubenswrapper[4702]: I1203 11:38:50.948085 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="654aa8a0-852c-4aae-b72d-0ca4eb991a77" path="/var/lib/kubelet/pods/654aa8a0-852c-4aae-b72d-0ca4eb991a77/volumes" Dec 03 11:38:50 crc kubenswrapper[4702]: I1203 11:38:50.949388 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a867bd18-48a7-462b-b4ed-6d103ccf80bf" path="/var/lib/kubelet/pods/a867bd18-48a7-462b-b4ed-6d103ccf80bf/volumes" Dec 03 11:39:00 crc kubenswrapper[4702]: I1203 11:39:00.992625 4702 scope.go:117] "RemoveContainer" containerID="d98a7e18305ac71f1b26770879f29b0af16f063edbd6f7bd7d197691baaec754" Dec 03 11:39:01 crc kubenswrapper[4702]: I1203 11:39:01.387869 4702 scope.go:117] "RemoveContainer" containerID="9237a9ded8ef52a118f139125f96cd624fb1a2f0f71f756cd1828e5742393505" Dec 03 11:39:01 crc kubenswrapper[4702]: I1203 11:39:01.432840 4702 scope.go:117] "RemoveContainer" containerID="c357dac314c39eed2710ff6a6c4eefee442389e5f1ab6a469d9eb71728c632c4" Dec 03 11:39:01 crc kubenswrapper[4702]: I1203 11:39:01.490241 4702 scope.go:117] "RemoveContainer" containerID="5f305242b6a0f1936c26115468933c1744dc34f1280701658a7afd5a519f77e1" Dec 03 11:39:01 crc kubenswrapper[4702]: I1203 11:39:01.549599 4702 scope.go:117] "RemoveContainer" containerID="1b8a6d044dac9d7e14b452174d79dc85aceba5ff4df34d6f9693c9c16ee6f437" Dec 03 11:39:01 crc kubenswrapper[4702]: I1203 11:39:01.606842 4702 scope.go:117] "RemoveContainer" containerID="2ceca0a7b4c165f5de9d880d7f78c34a7795ffaac84447aaed12b758fbf597de" Dec 03 11:39:01 crc kubenswrapper[4702]: I1203 
11:39:01.662529 4702 scope.go:117] "RemoveContainer" containerID="d200e8f2820fd0a42d99e646911887c1b26aaf9263be632e0d32ee7b56e5a041" Dec 03 11:39:01 crc kubenswrapper[4702]: I1203 11:39:01.697958 4702 scope.go:117] "RemoveContainer" containerID="51de0a68602ee9718d493d4f0bcaae5a4171a08de5ab53b9bd4ce9779ff4c4e3" Dec 03 11:39:01 crc kubenswrapper[4702]: I1203 11:39:01.743733 4702 scope.go:117] "RemoveContainer" containerID="4889216c64cf85d1cc219dc1dd0cd7afeda3c3f919c0c9682402f61aa13bff66" Dec 03 11:39:01 crc kubenswrapper[4702]: I1203 11:39:01.770399 4702 scope.go:117] "RemoveContainer" containerID="a39d34c4bd22f9121f011972d8b465ac5d6f4594708a6881637d54378c8fb867" Dec 03 11:39:01 crc kubenswrapper[4702]: I1203 11:39:01.801816 4702 scope.go:117] "RemoveContainer" containerID="00be2d9c2d5b19e1b0781c7d0b2359e2733ea150a4e65bfad6ea6f85acf5c3d0" Dec 03 11:39:01 crc kubenswrapper[4702]: I1203 11:39:01.845847 4702 scope.go:117] "RemoveContainer" containerID="c6e3a0909696a8b53bdedf99ff8321d41401e621eba80823eac63dd8fffdcd19" Dec 03 11:39:01 crc kubenswrapper[4702]: I1203 11:39:01.873002 4702 scope.go:117] "RemoveContainer" containerID="d524df9e1fc3053d38f3ba2633d6aa33df90732620b04d416fc9589069f31f93" Dec 03 11:39:01 crc kubenswrapper[4702]: I1203 11:39:01.901117 4702 scope.go:117] "RemoveContainer" containerID="092848727cd1175a91b12c135ef3464d9b083cb15ee1605161dace487d55e8d8" Dec 03 11:39:01 crc kubenswrapper[4702]: I1203 11:39:01.930954 4702 scope.go:117] "RemoveContainer" containerID="03736fd66d9315f20e39020018954b4db7f94210143895c30e278d4b86796337" Dec 03 11:39:01 crc kubenswrapper[4702]: I1203 11:39:01.961185 4702 scope.go:117] "RemoveContainer" containerID="11ac08548549e0f609d14ea3b47fc17f9628c005ad96b2cdef45524263932043" Dec 03 11:39:01 crc kubenswrapper[4702]: I1203 11:39:01.990719 4702 scope.go:117] "RemoveContainer" containerID="7aaa90fc8aa5bb535d369a89af9f3c4141012d2247ec20ec2a2986df0de06c93" Dec 03 11:39:02 crc kubenswrapper[4702]: I1203 11:39:02.021011 4702 scope.go:117] "RemoveContainer" containerID="38cd2cfc39731f342fb3349689ca00d5bc30d60097d0344fb6d532517716b8a9" Dec 03 11:39:02 crc kubenswrapper[4702]: I1203 11:39:02.048235 4702 scope.go:117] "RemoveContainer" containerID="1f9b7194d4479966285d927fb3db55ba0cb1f08ea30d8b5744c05eca65449218" Dec 03 11:39:15 crc kubenswrapper[4702]: I1203 11:39:15.048112 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-jm662"] Dec 03 11:39:15 crc kubenswrapper[4702]: I1203 11:39:15.097974 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-jm662"] Dec 03 11:39:16 crc kubenswrapper[4702]: I1203 11:39:16.942672 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e2598e5-2410-4a5b-b8b5-f41239c131d1" path="/var/lib/kubelet/pods/3e2598e5-2410-4a5b-b8b5-f41239c131d1/volumes" Dec 03 11:39:34 crc kubenswrapper[4702]: I1203 11:39:34.045708 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-hq9rm"] Dec 03 11:39:34 crc kubenswrapper[4702]: I1203 11:39:34.057913 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-hq9rm"] Dec 03 11:39:34 crc kubenswrapper[4702]: I1203 11:39:34.942796 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34d869ae-eae9-4bf6-b05e-cf40504ccdb6" path="/var/lib/kubelet/pods/34d869ae-eae9-4bf6-b05e-cf40504ccdb6/volumes" Dec 03 11:39:55 crc kubenswrapper[4702]: I1203 11:39:55.907867 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:39:55 crc kubenswrapper[4702]: I1203 11:39:55.908574 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:39:56 crc kubenswrapper[4702]: I1203 11:39:56.046617 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-w8wgv"] Dec 03 11:39:56 crc kubenswrapper[4702]: I1203 11:39:56.059091 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-w8wgv"] Dec 03 11:39:56 crc kubenswrapper[4702]: I1203 11:39:56.946964 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32433fde-8c4c-43af-94f5-ab732096cd9d" path="/var/lib/kubelet/pods/32433fde-8c4c-43af-94f5-ab732096cd9d/volumes" Dec 03 11:40:02 crc kubenswrapper[4702]: I1203 11:40:02.635986 4702 scope.go:117] "RemoveContainer" containerID="cc7bab6c4bcdb67bd28ac08cc403f78ca5b1748a0cc502b406e0ff94a7278db1" Dec 03 11:40:02 crc kubenswrapper[4702]: I1203 11:40:02.687188 4702 scope.go:117] "RemoveContainer" containerID="87d477f984f9300b594e99220e09af0f886c163b126f4c536f666d04ad1a8659" Dec 03 11:40:02 crc kubenswrapper[4702]: I1203 11:40:02.747383 4702 scope.go:117] "RemoveContainer" containerID="908c6ae29e4601831cec06aa54d973978d1b4b264c09734f0ec062f753d8b827" Dec 03 11:40:02 crc kubenswrapper[4702]: I1203 11:40:02.790706 4702 scope.go:117] "RemoveContainer" containerID="e0cb0c5f34421fd7d055849726b234e30a1295b20d66bf6adc32dd351bc8adbe" Dec 03 11:40:02 crc kubenswrapper[4702]: I1203 11:40:02.828803 4702 scope.go:117] "RemoveContainer" containerID="4e76021073dc4cde55830752f455be7fa7b9d5c261db44cce0d4efddd2d225bf" Dec 03 11:40:03 crc kubenswrapper[4702]: I1203 11:40:03.057539 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-jtqd2"] Dec 03 11:40:03 crc kubenswrapper[4702]: I1203 11:40:03.073841 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-jtqd2"] Dec 03 11:40:04 crc kubenswrapper[4702]: I1203 11:40:04.945376 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ca96186-ab1f-41b3-a9f4-c89220b757da" path="/var/lib/kubelet/pods/0ca96186-ab1f-41b3-a9f4-c89220b757da/volumes" Dec 03 11:40:06 crc kubenswrapper[4702]: I1203 11:40:06.032336 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-ds27t"] Dec 03 11:40:06 crc kubenswrapper[4702]: I1203 11:40:06.043438 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-ds27t"] Dec 03 11:40:06 crc kubenswrapper[4702]: I1203 11:40:06.943497 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2d4cceb-3a17-486b-8718-897e52ea39cc" path="/var/lib/kubelet/pods/c2d4cceb-3a17-486b-8718-897e52ea39cc/volumes" Dec 03 11:40:12 crc kubenswrapper[4702]: I1203 11:40:12.546373 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7ntqc"] Dec 03 11:40:12 crc kubenswrapper[4702]: E1203 11:40:12.547627 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62fab233-efc1-4a34-a5ad-f833ff9ceec1" 
containerName="registry-server" Dec 03 11:40:12 crc kubenswrapper[4702]: I1203 11:40:12.547661 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="62fab233-efc1-4a34-a5ad-f833ff9ceec1" containerName="registry-server" Dec 03 11:40:12 crc kubenswrapper[4702]: E1203 11:40:12.547697 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62fab233-efc1-4a34-a5ad-f833ff9ceec1" containerName="extract-content" Dec 03 11:40:12 crc kubenswrapper[4702]: I1203 11:40:12.547708 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="62fab233-efc1-4a34-a5ad-f833ff9ceec1" containerName="extract-content" Dec 03 11:40:12 crc kubenswrapper[4702]: E1203 11:40:12.547871 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62fab233-efc1-4a34-a5ad-f833ff9ceec1" containerName="extract-utilities" Dec 03 11:40:12 crc kubenswrapper[4702]: I1203 11:40:12.547886 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="62fab233-efc1-4a34-a5ad-f833ff9ceec1" containerName="extract-utilities" Dec 03 11:40:12 crc kubenswrapper[4702]: I1203 11:40:12.548275 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="62fab233-efc1-4a34-a5ad-f833ff9ceec1" containerName="registry-server" Dec 03 11:40:12 crc kubenswrapper[4702]: I1203 11:40:12.550313 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7ntqc" Dec 03 11:40:12 crc kubenswrapper[4702]: I1203 11:40:12.569603 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38c7c63b-db59-4055-aee0-99ea082bd8f7-utilities\") pod \"certified-operators-7ntqc\" (UID: \"38c7c63b-db59-4055-aee0-99ea082bd8f7\") " pod="openshift-marketplace/certified-operators-7ntqc" Dec 03 11:40:12 crc kubenswrapper[4702]: I1203 11:40:12.569742 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38c7c63b-db59-4055-aee0-99ea082bd8f7-catalog-content\") pod \"certified-operators-7ntqc\" (UID: \"38c7c63b-db59-4055-aee0-99ea082bd8f7\") " pod="openshift-marketplace/certified-operators-7ntqc" Dec 03 11:40:12 crc kubenswrapper[4702]: I1203 11:40:12.569885 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4695\" (UniqueName: \"kubernetes.io/projected/38c7c63b-db59-4055-aee0-99ea082bd8f7-kube-api-access-n4695\") pod \"certified-operators-7ntqc\" (UID: \"38c7c63b-db59-4055-aee0-99ea082bd8f7\") " pod="openshift-marketplace/certified-operators-7ntqc" Dec 03 11:40:12 crc kubenswrapper[4702]: I1203 11:40:12.573109 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7ntqc"] Dec 03 11:40:12 crc kubenswrapper[4702]: I1203 11:40:12.673100 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38c7c63b-db59-4055-aee0-99ea082bd8f7-utilities\") pod \"certified-operators-7ntqc\" (UID: \"38c7c63b-db59-4055-aee0-99ea082bd8f7\") " pod="openshift-marketplace/certified-operators-7ntqc" Dec 03 11:40:12 crc kubenswrapper[4702]: I1203 11:40:12.673507 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38c7c63b-db59-4055-aee0-99ea082bd8f7-catalog-content\") pod \"certified-operators-7ntqc\" (UID: 
\"38c7c63b-db59-4055-aee0-99ea082bd8f7\") " pod="openshift-marketplace/certified-operators-7ntqc" Dec 03 11:40:12 crc kubenswrapper[4702]: I1203 11:40:12.673665 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4695\" (UniqueName: \"kubernetes.io/projected/38c7c63b-db59-4055-aee0-99ea082bd8f7-kube-api-access-n4695\") pod \"certified-operators-7ntqc\" (UID: \"38c7c63b-db59-4055-aee0-99ea082bd8f7\") " pod="openshift-marketplace/certified-operators-7ntqc" Dec 03 11:40:12 crc kubenswrapper[4702]: I1203 11:40:12.673839 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38c7c63b-db59-4055-aee0-99ea082bd8f7-utilities\") pod \"certified-operators-7ntqc\" (UID: \"38c7c63b-db59-4055-aee0-99ea082bd8f7\") " pod="openshift-marketplace/certified-operators-7ntqc" Dec 03 11:40:12 crc kubenswrapper[4702]: I1203 11:40:12.674097 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38c7c63b-db59-4055-aee0-99ea082bd8f7-catalog-content\") pod \"certified-operators-7ntqc\" (UID: \"38c7c63b-db59-4055-aee0-99ea082bd8f7\") " pod="openshift-marketplace/certified-operators-7ntqc" Dec 03 11:40:12 crc kubenswrapper[4702]: I1203 11:40:12.701998 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4695\" (UniqueName: \"kubernetes.io/projected/38c7c63b-db59-4055-aee0-99ea082bd8f7-kube-api-access-n4695\") pod \"certified-operators-7ntqc\" (UID: \"38c7c63b-db59-4055-aee0-99ea082bd8f7\") " pod="openshift-marketplace/certified-operators-7ntqc" Dec 03 11:40:12 crc kubenswrapper[4702]: I1203 11:40:12.891983 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7ntqc" Dec 03 11:40:13 crc kubenswrapper[4702]: I1203 11:40:13.045217 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-r7v59"] Dec 03 11:40:13 crc kubenswrapper[4702]: I1203 11:40:13.063094 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-r7v59"] Dec 03 11:40:13 crc kubenswrapper[4702]: I1203 11:40:13.474371 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7ntqc"] Dec 03 11:40:13 crc kubenswrapper[4702]: I1203 11:40:13.743431 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ntqc" event={"ID":"38c7c63b-db59-4055-aee0-99ea082bd8f7","Type":"ContainerStarted","Data":"94488fae89033a22fd9067e6c5bc94de49753f1cc9669643bdb9a967ce1fed5f"} Dec 03 11:40:14 crc kubenswrapper[4702]: I1203 11:40:14.757910 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l" event={"ID":"ab0384ac-759e-45a9-99c3-39206af6a0b8","Type":"ContainerDied","Data":"d7a577bdbf34002e2606d4576d4cf5ffcf2835b414a8008dd1760af00d9e15dd"} Dec 03 11:40:14 crc kubenswrapper[4702]: I1203 11:40:14.757873 4702 generic.go:334] "Generic (PLEG): container finished" podID="ab0384ac-759e-45a9-99c3-39206af6a0b8" containerID="d7a577bdbf34002e2606d4576d4cf5ffcf2835b414a8008dd1760af00d9e15dd" exitCode=0 Dec 03 11:40:14 crc kubenswrapper[4702]: I1203 11:40:14.760981 4702 generic.go:334] "Generic (PLEG): container finished" podID="38c7c63b-db59-4055-aee0-99ea082bd8f7" containerID="d3e7b208b58875cd5c33fa0470fcb850097a954baac3cdecf3dfc22e0b0d7842" exitCode=0 Dec 03 11:40:14 crc kubenswrapper[4702]: I1203 11:40:14.761043 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ntqc" event={"ID":"38c7c63b-db59-4055-aee0-99ea082bd8f7","Type":"ContainerDied","Data":"d3e7b208b58875cd5c33fa0470fcb850097a954baac3cdecf3dfc22e0b0d7842"} Dec 03 11:40:14 crc kubenswrapper[4702]: I1203 11:40:14.943154 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65cdf028-89d6-4f6e-8aae-bdf5e8264310" path="/var/lib/kubelet/pods/65cdf028-89d6-4f6e-8aae-bdf5e8264310/volumes" Dec 03 11:40:16 crc kubenswrapper[4702]: I1203 11:40:16.343822 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l" Dec 03 11:40:16 crc kubenswrapper[4702]: I1203 11:40:16.399016 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnrkz\" (UniqueName: \"kubernetes.io/projected/ab0384ac-759e-45a9-99c3-39206af6a0b8-kube-api-access-vnrkz\") pod \"ab0384ac-759e-45a9-99c3-39206af6a0b8\" (UID: \"ab0384ac-759e-45a9-99c3-39206af6a0b8\") " Dec 03 11:40:16 crc kubenswrapper[4702]: I1203 11:40:16.399206 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0384ac-759e-45a9-99c3-39206af6a0b8-bootstrap-combined-ca-bundle\") pod \"ab0384ac-759e-45a9-99c3-39206af6a0b8\" (UID: \"ab0384ac-759e-45a9-99c3-39206af6a0b8\") " Dec 03 11:40:16 crc kubenswrapper[4702]: I1203 11:40:16.399316 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab0384ac-759e-45a9-99c3-39206af6a0b8-ssh-key\") pod \"ab0384ac-759e-45a9-99c3-39206af6a0b8\" (UID: \"ab0384ac-759e-45a9-99c3-39206af6a0b8\") " Dec 03 11:40:16 crc kubenswrapper[4702]: I1203 11:40:16.399340 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab0384ac-759e-45a9-99c3-39206af6a0b8-inventory\") pod \"ab0384ac-759e-45a9-99c3-39206af6a0b8\" (UID: \"ab0384ac-759e-45a9-99c3-39206af6a0b8\") " Dec 03 11:40:16 crc kubenswrapper[4702]: I1203 11:40:16.409044 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab0384ac-759e-45a9-99c3-39206af6a0b8-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "ab0384ac-759e-45a9-99c3-39206af6a0b8" (UID: "ab0384ac-759e-45a9-99c3-39206af6a0b8"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:40:16 crc kubenswrapper[4702]: I1203 11:40:16.409161 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab0384ac-759e-45a9-99c3-39206af6a0b8-kube-api-access-vnrkz" (OuterVolumeSpecName: "kube-api-access-vnrkz") pod "ab0384ac-759e-45a9-99c3-39206af6a0b8" (UID: "ab0384ac-759e-45a9-99c3-39206af6a0b8"). InnerVolumeSpecName "kube-api-access-vnrkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:40:16 crc kubenswrapper[4702]: I1203 11:40:16.441641 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab0384ac-759e-45a9-99c3-39206af6a0b8-inventory" (OuterVolumeSpecName: "inventory") pod "ab0384ac-759e-45a9-99c3-39206af6a0b8" (UID: "ab0384ac-759e-45a9-99c3-39206af6a0b8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:40:16 crc kubenswrapper[4702]: I1203 11:40:16.446797 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab0384ac-759e-45a9-99c3-39206af6a0b8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ab0384ac-759e-45a9-99c3-39206af6a0b8" (UID: "ab0384ac-759e-45a9-99c3-39206af6a0b8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:40:16 crc kubenswrapper[4702]: I1203 11:40:16.504493 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnrkz\" (UniqueName: \"kubernetes.io/projected/ab0384ac-759e-45a9-99c3-39206af6a0b8-kube-api-access-vnrkz\") on node \"crc\" DevicePath \"\"" Dec 03 11:40:16 crc kubenswrapper[4702]: I1203 11:40:16.504534 4702 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab0384ac-759e-45a9-99c3-39206af6a0b8-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:40:16 crc kubenswrapper[4702]: I1203 11:40:16.504550 4702 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ab0384ac-759e-45a9-99c3-39206af6a0b8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 11:40:16 crc kubenswrapper[4702]: I1203 11:40:16.504563 4702 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab0384ac-759e-45a9-99c3-39206af6a0b8-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 11:40:16 crc kubenswrapper[4702]: I1203 11:40:16.787650 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l" event={"ID":"ab0384ac-759e-45a9-99c3-39206af6a0b8","Type":"ContainerDied","Data":"93cf03652465e02425197432f1dfa2aa6dbdeca81b4d61a79a9fab49511fc1bd"} Dec 03 11:40:16 crc kubenswrapper[4702]: I1203 11:40:16.787730 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93cf03652465e02425197432f1dfa2aa6dbdeca81b4d61a79a9fab49511fc1bd" Dec 03 11:40:16 crc kubenswrapper[4702]: I1203 11:40:16.787688 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6584l" Dec 03 11:40:17 crc kubenswrapper[4702]: I1203 11:40:17.015250 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr"] Dec 03 11:40:17 crc kubenswrapper[4702]: E1203 11:40:17.054698 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab0384ac-759e-45a9-99c3-39206af6a0b8" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 11:40:17 crc kubenswrapper[4702]: I1203 11:40:17.054746 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab0384ac-759e-45a9-99c3-39206af6a0b8" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 11:40:17 crc kubenswrapper[4702]: I1203 11:40:17.055637 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab0384ac-759e-45a9-99c3-39206af6a0b8" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 11:40:17 crc kubenswrapper[4702]: I1203 11:40:17.057176 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr"] Dec 03 11:40:17 crc kubenswrapper[4702]: I1203 11:40:17.057312 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr" Dec 03 11:40:17 crc kubenswrapper[4702]: I1203 11:40:17.064415 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zmjkp" Dec 03 11:40:17 crc kubenswrapper[4702]: I1203 11:40:17.065298 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 11:40:17 crc kubenswrapper[4702]: I1203 11:40:17.074142 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 11:40:17 crc kubenswrapper[4702]: I1203 11:40:17.074707 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 11:40:17 crc kubenswrapper[4702]: I1203 11:40:17.180078 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rbl7\" (UniqueName: \"kubernetes.io/projected/e6a36b67-a7da-4684-8b3a-57735f2e4c8d-kube-api-access-8rbl7\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr\" (UID: \"e6a36b67-a7da-4684-8b3a-57735f2e4c8d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr" Dec 03 11:40:17 crc kubenswrapper[4702]: I1203 11:40:17.181289 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6a36b67-a7da-4684-8b3a-57735f2e4c8d-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr\" (UID: \"e6a36b67-a7da-4684-8b3a-57735f2e4c8d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr" Dec 03 11:40:17 crc kubenswrapper[4702]: I1203 11:40:17.181434 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6a36b67-a7da-4684-8b3a-57735f2e4c8d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr\" (UID: \"e6a36b67-a7da-4684-8b3a-57735f2e4c8d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr" Dec 03 11:40:17 crc kubenswrapper[4702]: I1203 11:40:17.284059 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rbl7\" (UniqueName: \"kubernetes.io/projected/e6a36b67-a7da-4684-8b3a-57735f2e4c8d-kube-api-access-8rbl7\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr\" (UID: \"e6a36b67-a7da-4684-8b3a-57735f2e4c8d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr" Dec 03 11:40:17 crc kubenswrapper[4702]: I1203 11:40:17.284207 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6a36b67-a7da-4684-8b3a-57735f2e4c8d-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr\" (UID: \"e6a36b67-a7da-4684-8b3a-57735f2e4c8d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr" Dec 03 11:40:17 crc kubenswrapper[4702]: I1203 11:40:17.284226 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6a36b67-a7da-4684-8b3a-57735f2e4c8d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr\" (UID: \"e6a36b67-a7da-4684-8b3a-57735f2e4c8d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr" Dec 03 11:40:17 crc kubenswrapper[4702]: 
I1203 11:40:17.297422 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6a36b67-a7da-4684-8b3a-57735f2e4c8d-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr\" (UID: \"e6a36b67-a7da-4684-8b3a-57735f2e4c8d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr" Dec 03 11:40:17 crc kubenswrapper[4702]: I1203 11:40:17.297477 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6a36b67-a7da-4684-8b3a-57735f2e4c8d-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr\" (UID: \"e6a36b67-a7da-4684-8b3a-57735f2e4c8d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr" Dec 03 11:40:17 crc kubenswrapper[4702]: I1203 11:40:17.303175 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rbl7\" (UniqueName: \"kubernetes.io/projected/e6a36b67-a7da-4684-8b3a-57735f2e4c8d-kube-api-access-8rbl7\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr\" (UID: \"e6a36b67-a7da-4684-8b3a-57735f2e4c8d\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr" Dec 03 11:40:17 crc kubenswrapper[4702]: I1203 11:40:17.422889 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr" Dec 03 11:40:18 crc kubenswrapper[4702]: I1203 11:40:18.098711 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr"] Dec 03 11:40:19 crc kubenswrapper[4702]: I1203 11:40:19.826068 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr" event={"ID":"e6a36b67-a7da-4684-8b3a-57735f2e4c8d","Type":"ContainerStarted","Data":"09e77122697bac538a52399b41612e938ce474626f06a32f325028b07582c119"} Dec 03 11:40:20 crc kubenswrapper[4702]: I1203 11:40:20.839833 4702 generic.go:334] "Generic (PLEG): container finished" podID="38c7c63b-db59-4055-aee0-99ea082bd8f7" containerID="796bf31377b27bfce2cd5dc5f4e06a5bf6b776c48a44e2a2986936386ac7073c" exitCode=0 Dec 03 11:40:20 crc kubenswrapper[4702]: I1203 11:40:20.839910 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ntqc" event={"ID":"38c7c63b-db59-4055-aee0-99ea082bd8f7","Type":"ContainerDied","Data":"796bf31377b27bfce2cd5dc5f4e06a5bf6b776c48a44e2a2986936386ac7073c"} Dec 03 11:40:21 crc kubenswrapper[4702]: I1203 11:40:21.853181 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr" event={"ID":"e6a36b67-a7da-4684-8b3a-57735f2e4c8d","Type":"ContainerStarted","Data":"c1d719d2c2af4e33cfbf2df5db8f3d7f6ef9af024124d40d82b94a2af9f5ba12"} Dec 03 11:40:21 crc kubenswrapper[4702]: I1203 11:40:21.876101 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr" podStartSLOduration=5.048837935 podStartE2EDuration="5.87605962s" podCreationTimestamp="2025-12-03 11:40:16 +0000 UTC" firstStartedPulling="2025-12-03 11:40:19.679004354 +0000 UTC m=+2203.514932818" lastFinishedPulling="2025-12-03 11:40:20.506226039 +0000 UTC m=+2204.342154503" observedRunningTime="2025-12-03 11:40:21.87043675 +0000 UTC m=+2205.706365204" watchObservedRunningTime="2025-12-03 11:40:21.87605962 +0000 UTC 
m=+2205.711988094" Dec 03 11:40:22 crc kubenswrapper[4702]: I1203 11:40:22.866400 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ntqc" event={"ID":"38c7c63b-db59-4055-aee0-99ea082bd8f7","Type":"ContainerStarted","Data":"a00791228ce4ce424da843a1d85119c9bddd3c05a4f70b9dc4fd37074d28ff10"} Dec 03 11:40:22 crc kubenswrapper[4702]: I1203 11:40:22.892423 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7ntqc" Dec 03 11:40:22 crc kubenswrapper[4702]: I1203 11:40:22.892494 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7ntqc" Dec 03 11:40:22 crc kubenswrapper[4702]: I1203 11:40:22.895623 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7ntqc" podStartSLOduration=3.539786382 podStartE2EDuration="10.895599675s" podCreationTimestamp="2025-12-03 11:40:12 +0000 UTC" firstStartedPulling="2025-12-03 11:40:14.766007079 +0000 UTC m=+2198.601935543" lastFinishedPulling="2025-12-03 11:40:22.121820382 +0000 UTC m=+2205.957748836" observedRunningTime="2025-12-03 11:40:22.890273273 +0000 UTC m=+2206.726201747" watchObservedRunningTime="2025-12-03 11:40:22.895599675 +0000 UTC m=+2206.731528139" Dec 03 11:40:23 crc kubenswrapper[4702]: I1203 11:40:23.962139 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7ntqc" podUID="38c7c63b-db59-4055-aee0-99ea082bd8f7" containerName="registry-server" probeResult="failure" output=< Dec 03 11:40:23 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 11:40:23 crc kubenswrapper[4702]: > Dec 03 11:40:25 crc kubenswrapper[4702]: I1203 11:40:25.907898 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:40:25 crc kubenswrapper[4702]: I1203 11:40:25.908231 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:40:32 crc kubenswrapper[4702]: I1203 11:40:32.971244 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7ntqc" Dec 03 11:40:33 crc kubenswrapper[4702]: I1203 11:40:33.051666 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7ntqc" Dec 03 11:40:33 crc kubenswrapper[4702]: I1203 11:40:33.192158 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7ntqc"] Dec 03 11:40:33 crc kubenswrapper[4702]: I1203 11:40:33.273268 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g44ws"] Dec 03 11:40:33 crc kubenswrapper[4702]: I1203 11:40:33.274007 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g44ws" podUID="5eb7b34c-d8e9-4188-85c1-7be8ec5afa18" containerName="registry-server" 
containerID="cri-o://8b9d8fb21555ff19eb15d50c1ac9a1109be73c7ce5cf6df4ecf1d98fec30f657" gracePeriod=2 Dec 03 11:40:34 crc kubenswrapper[4702]: I1203 11:40:34.016676 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g44ws" Dec 03 11:40:34 crc kubenswrapper[4702]: I1203 11:40:34.023360 4702 generic.go:334] "Generic (PLEG): container finished" podID="5eb7b34c-d8e9-4188-85c1-7be8ec5afa18" containerID="8b9d8fb21555ff19eb15d50c1ac9a1109be73c7ce5cf6df4ecf1d98fec30f657" exitCode=0 Dec 03 11:40:34 crc kubenswrapper[4702]: I1203 11:40:34.023434 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g44ws" event={"ID":"5eb7b34c-d8e9-4188-85c1-7be8ec5afa18","Type":"ContainerDied","Data":"8b9d8fb21555ff19eb15d50c1ac9a1109be73c7ce5cf6df4ecf1d98fec30f657"} Dec 03 11:40:34 crc kubenswrapper[4702]: I1203 11:40:34.023493 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g44ws" event={"ID":"5eb7b34c-d8e9-4188-85c1-7be8ec5afa18","Type":"ContainerDied","Data":"3650711afae3a4263770a6a7995b2b1750703591953c96bef15781c8fb156939"} Dec 03 11:40:34 crc kubenswrapper[4702]: I1203 11:40:34.023515 4702 scope.go:117] "RemoveContainer" containerID="8b9d8fb21555ff19eb15d50c1ac9a1109be73c7ce5cf6df4ecf1d98fec30f657" Dec 03 11:40:34 crc kubenswrapper[4702]: I1203 11:40:34.088730 4702 scope.go:117] "RemoveContainer" containerID="6e699f09c4c54f6d39b332e1b9ce69deed741709d10f80fb30fcddc88fa57012" Dec 03 11:40:34 crc kubenswrapper[4702]: I1203 11:40:34.144949 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eb7b34c-d8e9-4188-85c1-7be8ec5afa18-utilities\") pod \"5eb7b34c-d8e9-4188-85c1-7be8ec5afa18\" (UID: \"5eb7b34c-d8e9-4188-85c1-7be8ec5afa18\") " Dec 03 11:40:34 crc kubenswrapper[4702]: I1203 11:40:34.145185 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qm6h\" (UniqueName: \"kubernetes.io/projected/5eb7b34c-d8e9-4188-85c1-7be8ec5afa18-kube-api-access-9qm6h\") pod \"5eb7b34c-d8e9-4188-85c1-7be8ec5afa18\" (UID: \"5eb7b34c-d8e9-4188-85c1-7be8ec5afa18\") " Dec 03 11:40:34 crc kubenswrapper[4702]: I1203 11:40:34.145245 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eb7b34c-d8e9-4188-85c1-7be8ec5afa18-catalog-content\") pod \"5eb7b34c-d8e9-4188-85c1-7be8ec5afa18\" (UID: \"5eb7b34c-d8e9-4188-85c1-7be8ec5afa18\") " Dec 03 11:40:34 crc kubenswrapper[4702]: I1203 11:40:34.147054 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5eb7b34c-d8e9-4188-85c1-7be8ec5afa18-utilities" (OuterVolumeSpecName: "utilities") pod "5eb7b34c-d8e9-4188-85c1-7be8ec5afa18" (UID: "5eb7b34c-d8e9-4188-85c1-7be8ec5afa18"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:40:34 crc kubenswrapper[4702]: I1203 11:40:34.154353 4702 scope.go:117] "RemoveContainer" containerID="16e65501f63f5817e71772f665ef84eb6aa007ed546e43c236aecad31555b0be" Dec 03 11:40:34 crc kubenswrapper[4702]: I1203 11:40:34.157628 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5eb7b34c-d8e9-4188-85c1-7be8ec5afa18-kube-api-access-9qm6h" (OuterVolumeSpecName: "kube-api-access-9qm6h") pod "5eb7b34c-d8e9-4188-85c1-7be8ec5afa18" (UID: "5eb7b34c-d8e9-4188-85c1-7be8ec5afa18"). InnerVolumeSpecName "kube-api-access-9qm6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:40:34 crc kubenswrapper[4702]: I1203 11:40:34.224315 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5eb7b34c-d8e9-4188-85c1-7be8ec5afa18-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5eb7b34c-d8e9-4188-85c1-7be8ec5afa18" (UID: "5eb7b34c-d8e9-4188-85c1-7be8ec5afa18"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:40:34 crc kubenswrapper[4702]: I1203 11:40:34.248293 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5eb7b34c-d8e9-4188-85c1-7be8ec5afa18-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:40:34 crc kubenswrapper[4702]: I1203 11:40:34.248328 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qm6h\" (UniqueName: \"kubernetes.io/projected/5eb7b34c-d8e9-4188-85c1-7be8ec5afa18-kube-api-access-9qm6h\") on node \"crc\" DevicePath \"\"" Dec 03 11:40:34 crc kubenswrapper[4702]: I1203 11:40:34.248340 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5eb7b34c-d8e9-4188-85c1-7be8ec5afa18-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:40:34 crc kubenswrapper[4702]: I1203 11:40:34.276853 4702 scope.go:117] "RemoveContainer" containerID="8b9d8fb21555ff19eb15d50c1ac9a1109be73c7ce5cf6df4ecf1d98fec30f657" Dec 03 11:40:34 crc kubenswrapper[4702]: E1203 11:40:34.277620 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b9d8fb21555ff19eb15d50c1ac9a1109be73c7ce5cf6df4ecf1d98fec30f657\": container with ID starting with 8b9d8fb21555ff19eb15d50c1ac9a1109be73c7ce5cf6df4ecf1d98fec30f657 not found: ID does not exist" containerID="8b9d8fb21555ff19eb15d50c1ac9a1109be73c7ce5cf6df4ecf1d98fec30f657" Dec 03 11:40:34 crc kubenswrapper[4702]: I1203 11:40:34.277667 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b9d8fb21555ff19eb15d50c1ac9a1109be73c7ce5cf6df4ecf1d98fec30f657"} err="failed to get container status \"8b9d8fb21555ff19eb15d50c1ac9a1109be73c7ce5cf6df4ecf1d98fec30f657\": rpc error: code = NotFound desc = could not find container \"8b9d8fb21555ff19eb15d50c1ac9a1109be73c7ce5cf6df4ecf1d98fec30f657\": container with ID starting with 8b9d8fb21555ff19eb15d50c1ac9a1109be73c7ce5cf6df4ecf1d98fec30f657 not found: ID does not exist" Dec 03 11:40:34 crc kubenswrapper[4702]: I1203 11:40:34.277696 4702 scope.go:117] "RemoveContainer" containerID="6e699f09c4c54f6d39b332e1b9ce69deed741709d10f80fb30fcddc88fa57012" Dec 03 11:40:34 crc kubenswrapper[4702]: E1203 11:40:34.281454 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"6e699f09c4c54f6d39b332e1b9ce69deed741709d10f80fb30fcddc88fa57012\": container with ID starting with 6e699f09c4c54f6d39b332e1b9ce69deed741709d10f80fb30fcddc88fa57012 not found: ID does not exist" containerID="6e699f09c4c54f6d39b332e1b9ce69deed741709d10f80fb30fcddc88fa57012" Dec 03 11:40:34 crc kubenswrapper[4702]: I1203 11:40:34.281484 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e699f09c4c54f6d39b332e1b9ce69deed741709d10f80fb30fcddc88fa57012"} err="failed to get container status \"6e699f09c4c54f6d39b332e1b9ce69deed741709d10f80fb30fcddc88fa57012\": rpc error: code = NotFound desc = could not find container \"6e699f09c4c54f6d39b332e1b9ce69deed741709d10f80fb30fcddc88fa57012\": container with ID starting with 6e699f09c4c54f6d39b332e1b9ce69deed741709d10f80fb30fcddc88fa57012 not found: ID does not exist" Dec 03 11:40:34 crc kubenswrapper[4702]: I1203 11:40:34.281506 4702 scope.go:117] "RemoveContainer" containerID="16e65501f63f5817e71772f665ef84eb6aa007ed546e43c236aecad31555b0be" Dec 03 11:40:34 crc kubenswrapper[4702]: E1203 11:40:34.286370 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16e65501f63f5817e71772f665ef84eb6aa007ed546e43c236aecad31555b0be\": container with ID starting with 16e65501f63f5817e71772f665ef84eb6aa007ed546e43c236aecad31555b0be not found: ID does not exist" containerID="16e65501f63f5817e71772f665ef84eb6aa007ed546e43c236aecad31555b0be" Dec 03 11:40:34 crc kubenswrapper[4702]: I1203 11:40:34.286440 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16e65501f63f5817e71772f665ef84eb6aa007ed546e43c236aecad31555b0be"} err="failed to get container status \"16e65501f63f5817e71772f665ef84eb6aa007ed546e43c236aecad31555b0be\": rpc error: code = NotFound desc = could not find container \"16e65501f63f5817e71772f665ef84eb6aa007ed546e43c236aecad31555b0be\": container with ID starting with 16e65501f63f5817e71772f665ef84eb6aa007ed546e43c236aecad31555b0be not found: ID does not exist" Dec 03 11:40:35 crc kubenswrapper[4702]: I1203 11:40:35.038156 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g44ws" Dec 03 11:40:35 crc kubenswrapper[4702]: I1203 11:40:35.069172 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g44ws"] Dec 03 11:40:35 crc kubenswrapper[4702]: I1203 11:40:35.085440 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g44ws"] Dec 03 11:40:36 crc kubenswrapper[4702]: I1203 11:40:36.208703 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-tqs2b"] Dec 03 11:40:36 crc kubenswrapper[4702]: I1203 11:40:36.230064 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-tqs2b"] Dec 03 11:40:36 crc kubenswrapper[4702]: I1203 11:40:36.956329 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20bf2147-401b-457b-ad27-3c893be5fa2c" path="/var/lib/kubelet/pods/20bf2147-401b-457b-ad27-3c893be5fa2c/volumes" Dec 03 11:40:36 crc kubenswrapper[4702]: I1203 11:40:36.957364 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5eb7b34c-d8e9-4188-85c1-7be8ec5afa18" path="/var/lib/kubelet/pods/5eb7b34c-d8e9-4188-85c1-7be8ec5afa18/volumes" Dec 03 11:40:37 crc kubenswrapper[4702]: I1203 11:40:37.628910 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7gh8m"] Dec 03 11:40:37 crc kubenswrapper[4702]: E1203 11:40:37.629779 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7b34c-d8e9-4188-85c1-7be8ec5afa18" containerName="extract-utilities" Dec 03 11:40:37 crc kubenswrapper[4702]: I1203 11:40:37.629802 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7b34c-d8e9-4188-85c1-7be8ec5afa18" containerName="extract-utilities" Dec 03 11:40:37 crc kubenswrapper[4702]: E1203 11:40:37.629822 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7b34c-d8e9-4188-85c1-7be8ec5afa18" containerName="extract-content" Dec 03 11:40:37 crc kubenswrapper[4702]: I1203 11:40:37.629831 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7b34c-d8e9-4188-85c1-7be8ec5afa18" containerName="extract-content" Dec 03 11:40:37 crc kubenswrapper[4702]: E1203 11:40:37.629874 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5eb7b34c-d8e9-4188-85c1-7be8ec5afa18" containerName="registry-server" Dec 03 11:40:37 crc kubenswrapper[4702]: I1203 11:40:37.629886 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="5eb7b34c-d8e9-4188-85c1-7be8ec5afa18" containerName="registry-server" Dec 03 11:40:37 crc kubenswrapper[4702]: I1203 11:40:37.630338 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="5eb7b34c-d8e9-4188-85c1-7be8ec5afa18" containerName="registry-server" Dec 03 11:40:37 crc kubenswrapper[4702]: I1203 11:40:37.633295 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7gh8m" Dec 03 11:40:37 crc kubenswrapper[4702]: I1203 11:40:37.648073 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gh8m"] Dec 03 11:40:37 crc kubenswrapper[4702]: I1203 11:40:37.758011 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71382f3d-76c9-42f2-b00b-900954de4257-utilities\") pod \"redhat-marketplace-7gh8m\" (UID: \"71382f3d-76c9-42f2-b00b-900954de4257\") " pod="openshift-marketplace/redhat-marketplace-7gh8m" Dec 03 11:40:37 crc kubenswrapper[4702]: I1203 11:40:37.758141 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71382f3d-76c9-42f2-b00b-900954de4257-catalog-content\") pod \"redhat-marketplace-7gh8m\" (UID: \"71382f3d-76c9-42f2-b00b-900954de4257\") " pod="openshift-marketplace/redhat-marketplace-7gh8m" Dec 03 11:40:37 crc kubenswrapper[4702]: I1203 11:40:37.758372 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9tq5\" (UniqueName: \"kubernetes.io/projected/71382f3d-76c9-42f2-b00b-900954de4257-kube-api-access-m9tq5\") pod \"redhat-marketplace-7gh8m\" (UID: \"71382f3d-76c9-42f2-b00b-900954de4257\") " pod="openshift-marketplace/redhat-marketplace-7gh8m" Dec 03 11:40:37 crc kubenswrapper[4702]: I1203 11:40:37.861002 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9tq5\" (UniqueName: \"kubernetes.io/projected/71382f3d-76c9-42f2-b00b-900954de4257-kube-api-access-m9tq5\") pod \"redhat-marketplace-7gh8m\" (UID: \"71382f3d-76c9-42f2-b00b-900954de4257\") " pod="openshift-marketplace/redhat-marketplace-7gh8m" Dec 03 11:40:37 crc kubenswrapper[4702]: I1203 11:40:37.861251 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71382f3d-76c9-42f2-b00b-900954de4257-utilities\") pod \"redhat-marketplace-7gh8m\" (UID: \"71382f3d-76c9-42f2-b00b-900954de4257\") " pod="openshift-marketplace/redhat-marketplace-7gh8m" Dec 03 11:40:37 crc kubenswrapper[4702]: I1203 11:40:37.861315 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71382f3d-76c9-42f2-b00b-900954de4257-catalog-content\") pod \"redhat-marketplace-7gh8m\" (UID: \"71382f3d-76c9-42f2-b00b-900954de4257\") " pod="openshift-marketplace/redhat-marketplace-7gh8m" Dec 03 11:40:37 crc kubenswrapper[4702]: I1203 11:40:37.862142 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71382f3d-76c9-42f2-b00b-900954de4257-catalog-content\") pod \"redhat-marketplace-7gh8m\" (UID: \"71382f3d-76c9-42f2-b00b-900954de4257\") " pod="openshift-marketplace/redhat-marketplace-7gh8m" Dec 03 11:40:37 crc kubenswrapper[4702]: I1203 11:40:37.862131 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71382f3d-76c9-42f2-b00b-900954de4257-utilities\") pod \"redhat-marketplace-7gh8m\" (UID: \"71382f3d-76c9-42f2-b00b-900954de4257\") " pod="openshift-marketplace/redhat-marketplace-7gh8m" Dec 03 11:40:37 crc kubenswrapper[4702]: I1203 11:40:37.898949 4702 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-m9tq5\" (UniqueName: \"kubernetes.io/projected/71382f3d-76c9-42f2-b00b-900954de4257-kube-api-access-m9tq5\") pod \"redhat-marketplace-7gh8m\" (UID: \"71382f3d-76c9-42f2-b00b-900954de4257\") " pod="openshift-marketplace/redhat-marketplace-7gh8m" Dec 03 11:40:37 crc kubenswrapper[4702]: I1203 11:40:37.960543 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7gh8m" Dec 03 11:40:38 crc kubenswrapper[4702]: I1203 11:40:38.563805 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gh8m"] Dec 03 11:40:38 crc kubenswrapper[4702]: W1203 11:40:38.567409 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71382f3d_76c9_42f2_b00b_900954de4257.slice/crio-66f6fbb009fcdfe1640727efb04a9be49e5cbe32da993bd5a95e1b4f7b9d13ed WatchSource:0}: Error finding container 66f6fbb009fcdfe1640727efb04a9be49e5cbe32da993bd5a95e1b4f7b9d13ed: Status 404 returned error can't find the container with id 66f6fbb009fcdfe1640727efb04a9be49e5cbe32da993bd5a95e1b4f7b9d13ed Dec 03 11:40:38 crc kubenswrapper[4702]: I1203 11:40:38.647084 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mntnr"] Dec 03 11:40:38 crc kubenswrapper[4702]: I1203 11:40:38.654204 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mntnr" Dec 03 11:40:38 crc kubenswrapper[4702]: I1203 11:40:38.662479 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mntnr"] Dec 03 11:40:38 crc kubenswrapper[4702]: I1203 11:40:38.750569 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r6b2\" (UniqueName: \"kubernetes.io/projected/42fd8676-57b9-4350-8ef4-99559467ca2c-kube-api-access-7r6b2\") pod \"community-operators-mntnr\" (UID: \"42fd8676-57b9-4350-8ef4-99559467ca2c\") " pod="openshift-marketplace/community-operators-mntnr" Dec 03 11:40:38 crc kubenswrapper[4702]: I1203 11:40:38.750789 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42fd8676-57b9-4350-8ef4-99559467ca2c-catalog-content\") pod \"community-operators-mntnr\" (UID: \"42fd8676-57b9-4350-8ef4-99559467ca2c\") " pod="openshift-marketplace/community-operators-mntnr" Dec 03 11:40:38 crc kubenswrapper[4702]: I1203 11:40:38.751343 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42fd8676-57b9-4350-8ef4-99559467ca2c-utilities\") pod \"community-operators-mntnr\" (UID: \"42fd8676-57b9-4350-8ef4-99559467ca2c\") " pod="openshift-marketplace/community-operators-mntnr" Dec 03 11:40:38 crc kubenswrapper[4702]: I1203 11:40:38.852453 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42fd8676-57b9-4350-8ef4-99559467ca2c-utilities\") pod \"community-operators-mntnr\" (UID: \"42fd8676-57b9-4350-8ef4-99559467ca2c\") " pod="openshift-marketplace/community-operators-mntnr" Dec 03 11:40:38 crc kubenswrapper[4702]: I1203 11:40:38.852573 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r6b2\" (UniqueName: 
\"kubernetes.io/projected/42fd8676-57b9-4350-8ef4-99559467ca2c-kube-api-access-7r6b2\") pod \"community-operators-mntnr\" (UID: \"42fd8676-57b9-4350-8ef4-99559467ca2c\") " pod="openshift-marketplace/community-operators-mntnr" Dec 03 11:40:38 crc kubenswrapper[4702]: I1203 11:40:38.852643 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42fd8676-57b9-4350-8ef4-99559467ca2c-catalog-content\") pod \"community-operators-mntnr\" (UID: \"42fd8676-57b9-4350-8ef4-99559467ca2c\") " pod="openshift-marketplace/community-operators-mntnr" Dec 03 11:40:38 crc kubenswrapper[4702]: I1203 11:40:38.853346 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42fd8676-57b9-4350-8ef4-99559467ca2c-utilities\") pod \"community-operators-mntnr\" (UID: \"42fd8676-57b9-4350-8ef4-99559467ca2c\") " pod="openshift-marketplace/community-operators-mntnr" Dec 03 11:40:38 crc kubenswrapper[4702]: I1203 11:40:38.853487 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42fd8676-57b9-4350-8ef4-99559467ca2c-catalog-content\") pod \"community-operators-mntnr\" (UID: \"42fd8676-57b9-4350-8ef4-99559467ca2c\") " pod="openshift-marketplace/community-operators-mntnr" Dec 03 11:40:38 crc kubenswrapper[4702]: I1203 11:40:38.882996 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r6b2\" (UniqueName: \"kubernetes.io/projected/42fd8676-57b9-4350-8ef4-99559467ca2c-kube-api-access-7r6b2\") pod \"community-operators-mntnr\" (UID: \"42fd8676-57b9-4350-8ef4-99559467ca2c\") " pod="openshift-marketplace/community-operators-mntnr" Dec 03 11:40:39 crc kubenswrapper[4702]: I1203 11:40:39.125916 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mntnr" Dec 03 11:40:39 crc kubenswrapper[4702]: I1203 11:40:39.302468 4702 generic.go:334] "Generic (PLEG): container finished" podID="71382f3d-76c9-42f2-b00b-900954de4257" containerID="6d1768e9920158c6c9b2b0c04f524ba1da68098aa6f9031fe04f63d0d46d3893" exitCode=0 Dec 03 11:40:39 crc kubenswrapper[4702]: I1203 11:40:39.302553 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gh8m" event={"ID":"71382f3d-76c9-42f2-b00b-900954de4257","Type":"ContainerDied","Data":"6d1768e9920158c6c9b2b0c04f524ba1da68098aa6f9031fe04f63d0d46d3893"} Dec 03 11:40:39 crc kubenswrapper[4702]: I1203 11:40:39.302597 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gh8m" event={"ID":"71382f3d-76c9-42f2-b00b-900954de4257","Type":"ContainerStarted","Data":"66f6fbb009fcdfe1640727efb04a9be49e5cbe32da993bd5a95e1b4f7b9d13ed"} Dec 03 11:40:39 crc kubenswrapper[4702]: I1203 11:40:39.985546 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mntnr"] Dec 03 11:40:40 crc kubenswrapper[4702]: I1203 11:40:40.319062 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mntnr" event={"ID":"42fd8676-57b9-4350-8ef4-99559467ca2c","Type":"ContainerStarted","Data":"1742b5f8ea62331c4d4bf7ef6080835dedc8e7053235cf3e5d9f1af241b1e56d"} Dec 03 11:40:40 crc kubenswrapper[4702]: I1203 11:40:40.319140 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mntnr" event={"ID":"42fd8676-57b9-4350-8ef4-99559467ca2c","Type":"ContainerStarted","Data":"587218b1eafc815b3689c8fe5b3489d4d420f6d8576b25831145f64baa2baee3"} Dec 03 11:40:41 crc kubenswrapper[4702]: I1203 11:40:41.335455 4702 generic.go:334] "Generic (PLEG): container finished" podID="42fd8676-57b9-4350-8ef4-99559467ca2c" containerID="1742b5f8ea62331c4d4bf7ef6080835dedc8e7053235cf3e5d9f1af241b1e56d" exitCode=0 Dec 03 11:40:41 crc kubenswrapper[4702]: I1203 11:40:41.335813 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mntnr" event={"ID":"42fd8676-57b9-4350-8ef4-99559467ca2c","Type":"ContainerDied","Data":"1742b5f8ea62331c4d4bf7ef6080835dedc8e7053235cf3e5d9f1af241b1e56d"} Dec 03 11:40:41 crc kubenswrapper[4702]: I1203 11:40:41.341649 4702 generic.go:334] "Generic (PLEG): container finished" podID="71382f3d-76c9-42f2-b00b-900954de4257" containerID="b9947c8d14fb588de2494aa2203b68b64c32846d1d94f93a8ec0be38e4837fb9" exitCode=0 Dec 03 11:40:41 crc kubenswrapper[4702]: I1203 11:40:41.341711 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gh8m" event={"ID":"71382f3d-76c9-42f2-b00b-900954de4257","Type":"ContainerDied","Data":"b9947c8d14fb588de2494aa2203b68b64c32846d1d94f93a8ec0be38e4837fb9"} Dec 03 11:40:43 crc kubenswrapper[4702]: I1203 11:40:43.384874 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mntnr" event={"ID":"42fd8676-57b9-4350-8ef4-99559467ca2c","Type":"ContainerStarted","Data":"abee002e01e31a66589ecd25155885a72be8e732fb70747445de60cca6aaee4d"} Dec 03 11:40:43 crc kubenswrapper[4702]: I1203 11:40:43.401803 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gh8m" 
event={"ID":"71382f3d-76c9-42f2-b00b-900954de4257","Type":"ContainerStarted","Data":"6c7829324c65194ad6c8652fb6d3472667e285efa93f277b2687d3a5db739a65"} Dec 03 11:40:43 crc kubenswrapper[4702]: I1203 11:40:43.479251 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7gh8m" podStartSLOduration=3.864386884 podStartE2EDuration="6.479221777s" podCreationTimestamp="2025-12-03 11:40:37 +0000 UTC" firstStartedPulling="2025-12-03 11:40:39.314175271 +0000 UTC m=+2223.150103735" lastFinishedPulling="2025-12-03 11:40:41.929010164 +0000 UTC m=+2225.764938628" observedRunningTime="2025-12-03 11:40:43.454775482 +0000 UTC m=+2227.290703966" watchObservedRunningTime="2025-12-03 11:40:43.479221777 +0000 UTC m=+2227.315150241" Dec 03 11:40:47 crc kubenswrapper[4702]: E1203 11:40:47.122980 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42fd8676_57b9_4350_8ef4_99559467ca2c.slice/crio-abee002e01e31a66589ecd25155885a72be8e732fb70747445de60cca6aaee4d.scope\": RecentStats: unable to find data in memory cache]" Dec 03 11:40:47 crc kubenswrapper[4702]: I1203 11:40:47.140473 4702 generic.go:334] "Generic (PLEG): container finished" podID="42fd8676-57b9-4350-8ef4-99559467ca2c" containerID="abee002e01e31a66589ecd25155885a72be8e732fb70747445de60cca6aaee4d" exitCode=0 Dec 03 11:40:47 crc kubenswrapper[4702]: I1203 11:40:47.140525 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mntnr" event={"ID":"42fd8676-57b9-4350-8ef4-99559467ca2c","Type":"ContainerDied","Data":"abee002e01e31a66589ecd25155885a72be8e732fb70747445de60cca6aaee4d"} Dec 03 11:40:47 crc kubenswrapper[4702]: I1203 11:40:47.974029 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7gh8m" Dec 03 11:40:47 crc kubenswrapper[4702]: I1203 11:40:47.974570 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7gh8m" Dec 03 11:40:48 crc kubenswrapper[4702]: I1203 11:40:48.036990 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7gh8m" Dec 03 11:40:48 crc kubenswrapper[4702]: E1203 11:40:48.108008 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42fd8676_57b9_4350_8ef4_99559467ca2c.slice/crio-069fa4b787ef09744a0ffa0f9158eb4fc655023fac45160bc77e9915078d3207.scope\": RecentStats: unable to find data in memory cache]" Dec 03 11:40:48 crc kubenswrapper[4702]: I1203 11:40:48.219418 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7gh8m" Dec 03 11:40:49 crc kubenswrapper[4702]: I1203 11:40:49.168682 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mntnr" event={"ID":"42fd8676-57b9-4350-8ef4-99559467ca2c","Type":"ContainerStarted","Data":"069fa4b787ef09744a0ffa0f9158eb4fc655023fac45160bc77e9915078d3207"} Dec 03 11:40:49 crc kubenswrapper[4702]: I1203 11:40:49.200455 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mntnr" podStartSLOduration=4.668994837 podStartE2EDuration="11.200427706s" 
podCreationTimestamp="2025-12-03 11:40:38 +0000 UTC" firstStartedPulling="2025-12-03 11:40:41.338884145 +0000 UTC m=+2225.174812609" lastFinishedPulling="2025-12-03 11:40:47.870317014 +0000 UTC m=+2231.706245478" observedRunningTime="2025-12-03 11:40:49.189523136 +0000 UTC m=+2233.025451600" watchObservedRunningTime="2025-12-03 11:40:49.200427706 +0000 UTC m=+2233.036356170" Dec 03 11:40:50 crc kubenswrapper[4702]: I1203 11:40:50.024309 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gh8m"] Dec 03 11:40:50 crc kubenswrapper[4702]: I1203 11:40:50.180147 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7gh8m" podUID="71382f3d-76c9-42f2-b00b-900954de4257" containerName="registry-server" containerID="cri-o://6c7829324c65194ad6c8652fb6d3472667e285efa93f277b2687d3a5db739a65" gracePeriod=2 Dec 03 11:40:50 crc kubenswrapper[4702]: I1203 11:40:50.757334 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7gh8m" Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.127148 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9tq5\" (UniqueName: \"kubernetes.io/projected/71382f3d-76c9-42f2-b00b-900954de4257-kube-api-access-m9tq5\") pod \"71382f3d-76c9-42f2-b00b-900954de4257\" (UID: \"71382f3d-76c9-42f2-b00b-900954de4257\") " Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.127354 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71382f3d-76c9-42f2-b00b-900954de4257-catalog-content\") pod \"71382f3d-76c9-42f2-b00b-900954de4257\" (UID: \"71382f3d-76c9-42f2-b00b-900954de4257\") " Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.127409 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71382f3d-76c9-42f2-b00b-900954de4257-utilities\") pod \"71382f3d-76c9-42f2-b00b-900954de4257\" (UID: \"71382f3d-76c9-42f2-b00b-900954de4257\") " Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.132198 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71382f3d-76c9-42f2-b00b-900954de4257-utilities" (OuterVolumeSpecName: "utilities") pod "71382f3d-76c9-42f2-b00b-900954de4257" (UID: "71382f3d-76c9-42f2-b00b-900954de4257"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.135702 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71382f3d-76c9-42f2-b00b-900954de4257-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.145434 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71382f3d-76c9-42f2-b00b-900954de4257-kube-api-access-m9tq5" (OuterVolumeSpecName: "kube-api-access-m9tq5") pod "71382f3d-76c9-42f2-b00b-900954de4257" (UID: "71382f3d-76c9-42f2-b00b-900954de4257"). InnerVolumeSpecName "kube-api-access-m9tq5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.171159 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71382f3d-76c9-42f2-b00b-900954de4257-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71382f3d-76c9-42f2-b00b-900954de4257" (UID: "71382f3d-76c9-42f2-b00b-900954de4257"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.205381 4702 generic.go:334] "Generic (PLEG): container finished" podID="71382f3d-76c9-42f2-b00b-900954de4257" containerID="6c7829324c65194ad6c8652fb6d3472667e285efa93f277b2687d3a5db739a65" exitCode=0 Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.205502 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gh8m" event={"ID":"71382f3d-76c9-42f2-b00b-900954de4257","Type":"ContainerDied","Data":"6c7829324c65194ad6c8652fb6d3472667e285efa93f277b2687d3a5db739a65"} Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.205553 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7gh8m" event={"ID":"71382f3d-76c9-42f2-b00b-900954de4257","Type":"ContainerDied","Data":"66f6fbb009fcdfe1640727efb04a9be49e5cbe32da993bd5a95e1b4f7b9d13ed"} Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.205581 4702 scope.go:117] "RemoveContainer" containerID="6c7829324c65194ad6c8652fb6d3472667e285efa93f277b2687d3a5db739a65" Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.205991 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7gh8m" Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.233718 4702 scope.go:117] "RemoveContainer" containerID="b9947c8d14fb588de2494aa2203b68b64c32846d1d94f93a8ec0be38e4837fb9" Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.244171 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9tq5\" (UniqueName: \"kubernetes.io/projected/71382f3d-76c9-42f2-b00b-900954de4257-kube-api-access-m9tq5\") on node \"crc\" DevicePath \"\"" Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.244219 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71382f3d-76c9-42f2-b00b-900954de4257-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.261854 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gh8m"] Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.279816 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7gh8m"] Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.281289 4702 scope.go:117] "RemoveContainer" containerID="6d1768e9920158c6c9b2b0c04f524ba1da68098aa6f9031fe04f63d0d46d3893" Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.339255 4702 scope.go:117] "RemoveContainer" containerID="6c7829324c65194ad6c8652fb6d3472667e285efa93f277b2687d3a5db739a65" Dec 03 11:40:51 crc kubenswrapper[4702]: E1203 11:40:51.342947 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c7829324c65194ad6c8652fb6d3472667e285efa93f277b2687d3a5db739a65\": container with ID starting with 
6c7829324c65194ad6c8652fb6d3472667e285efa93f277b2687d3a5db739a65 not found: ID does not exist" containerID="6c7829324c65194ad6c8652fb6d3472667e285efa93f277b2687d3a5db739a65" Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.342991 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c7829324c65194ad6c8652fb6d3472667e285efa93f277b2687d3a5db739a65"} err="failed to get container status \"6c7829324c65194ad6c8652fb6d3472667e285efa93f277b2687d3a5db739a65\": rpc error: code = NotFound desc = could not find container \"6c7829324c65194ad6c8652fb6d3472667e285efa93f277b2687d3a5db739a65\": container with ID starting with 6c7829324c65194ad6c8652fb6d3472667e285efa93f277b2687d3a5db739a65 not found: ID does not exist" Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.343019 4702 scope.go:117] "RemoveContainer" containerID="b9947c8d14fb588de2494aa2203b68b64c32846d1d94f93a8ec0be38e4837fb9" Dec 03 11:40:51 crc kubenswrapper[4702]: E1203 11:40:51.343554 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9947c8d14fb588de2494aa2203b68b64c32846d1d94f93a8ec0be38e4837fb9\": container with ID starting with b9947c8d14fb588de2494aa2203b68b64c32846d1d94f93a8ec0be38e4837fb9 not found: ID does not exist" containerID="b9947c8d14fb588de2494aa2203b68b64c32846d1d94f93a8ec0be38e4837fb9" Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.343622 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9947c8d14fb588de2494aa2203b68b64c32846d1d94f93a8ec0be38e4837fb9"} err="failed to get container status \"b9947c8d14fb588de2494aa2203b68b64c32846d1d94f93a8ec0be38e4837fb9\": rpc error: code = NotFound desc = could not find container \"b9947c8d14fb588de2494aa2203b68b64c32846d1d94f93a8ec0be38e4837fb9\": container with ID starting with b9947c8d14fb588de2494aa2203b68b64c32846d1d94f93a8ec0be38e4837fb9 not found: ID does not exist" Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.343662 4702 scope.go:117] "RemoveContainer" containerID="6d1768e9920158c6c9b2b0c04f524ba1da68098aa6f9031fe04f63d0d46d3893" Dec 03 11:40:51 crc kubenswrapper[4702]: E1203 11:40:51.344131 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d1768e9920158c6c9b2b0c04f524ba1da68098aa6f9031fe04f63d0d46d3893\": container with ID starting with 6d1768e9920158c6c9b2b0c04f524ba1da68098aa6f9031fe04f63d0d46d3893 not found: ID does not exist" containerID="6d1768e9920158c6c9b2b0c04f524ba1da68098aa6f9031fe04f63d0d46d3893" Dec 03 11:40:51 crc kubenswrapper[4702]: I1203 11:40:51.344166 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d1768e9920158c6c9b2b0c04f524ba1da68098aa6f9031fe04f63d0d46d3893"} err="failed to get container status \"6d1768e9920158c6c9b2b0c04f524ba1da68098aa6f9031fe04f63d0d46d3893\": rpc error: code = NotFound desc = could not find container \"6d1768e9920158c6c9b2b0c04f524ba1da68098aa6f9031fe04f63d0d46d3893\": container with ID starting with 6d1768e9920158c6c9b2b0c04f524ba1da68098aa6f9031fe04f63d0d46d3893 not found: ID does not exist" Dec 03 11:40:52 crc kubenswrapper[4702]: I1203 11:40:52.946178 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71382f3d-76c9-42f2-b00b-900954de4257" path="/var/lib/kubelet/pods/71382f3d-76c9-42f2-b00b-900954de4257/volumes" Dec 03 11:40:55 crc kubenswrapper[4702]: I1203 11:40:55.907784 
4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:40:55 crc kubenswrapper[4702]: I1203 11:40:55.909344 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:40:55 crc kubenswrapper[4702]: I1203 11:40:55.909575 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:40:55 crc kubenswrapper[4702]: I1203 11:40:55.910884 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39905d80302a00685e55c3f28d423b218acb74c324103c6641753cc0d7704b72"} pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 11:40:55 crc kubenswrapper[4702]: I1203 11:40:55.911024 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" containerID="cri-o://39905d80302a00685e55c3f28d423b218acb74c324103c6641753cc0d7704b72" gracePeriod=600 Dec 03 11:40:56 crc kubenswrapper[4702]: I1203 11:40:56.510275 4702 generic.go:334] "Generic (PLEG): container finished" podID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerID="39905d80302a00685e55c3f28d423b218acb74c324103c6641753cc0d7704b72" exitCode=0 Dec 03 11:40:56 crc kubenswrapper[4702]: I1203 11:40:56.510466 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerDied","Data":"39905d80302a00685e55c3f28d423b218acb74c324103c6641753cc0d7704b72"} Dec 03 11:40:56 crc kubenswrapper[4702]: I1203 11:40:56.510608 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21"} Dec 03 11:40:56 crc kubenswrapper[4702]: I1203 11:40:56.510638 4702 scope.go:117] "RemoveContainer" containerID="ae1388315359006b4f32603fb41717332ead513f4e98ea33ad44df2ec0abde03" Dec 03 11:40:59 crc kubenswrapper[4702]: I1203 11:40:59.126639 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mntnr" Dec 03 11:40:59 crc kubenswrapper[4702]: I1203 11:40:59.127290 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mntnr" Dec 03 11:40:59 crc kubenswrapper[4702]: I1203 11:40:59.181292 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mntnr" Dec 03 11:40:59 crc kubenswrapper[4702]: I1203 11:40:59.601331 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-mntnr" Dec 03 11:40:59 crc kubenswrapper[4702]: I1203 11:40:59.659906 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mntnr"] Dec 03 11:41:01 crc kubenswrapper[4702]: I1203 11:41:01.570567 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mntnr" podUID="42fd8676-57b9-4350-8ef4-99559467ca2c" containerName="registry-server" containerID="cri-o://069fa4b787ef09744a0ffa0f9158eb4fc655023fac45160bc77e9915078d3207" gracePeriod=2 Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.215330 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mntnr" Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.385874 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r6b2\" (UniqueName: \"kubernetes.io/projected/42fd8676-57b9-4350-8ef4-99559467ca2c-kube-api-access-7r6b2\") pod \"42fd8676-57b9-4350-8ef4-99559467ca2c\" (UID: \"42fd8676-57b9-4350-8ef4-99559467ca2c\") " Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.385965 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42fd8676-57b9-4350-8ef4-99559467ca2c-catalog-content\") pod \"42fd8676-57b9-4350-8ef4-99559467ca2c\" (UID: \"42fd8676-57b9-4350-8ef4-99559467ca2c\") " Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.386136 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42fd8676-57b9-4350-8ef4-99559467ca2c-utilities\") pod \"42fd8676-57b9-4350-8ef4-99559467ca2c\" (UID: \"42fd8676-57b9-4350-8ef4-99559467ca2c\") " Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.387256 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42fd8676-57b9-4350-8ef4-99559467ca2c-utilities" (OuterVolumeSpecName: "utilities") pod "42fd8676-57b9-4350-8ef4-99559467ca2c" (UID: "42fd8676-57b9-4350-8ef4-99559467ca2c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.397767 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42fd8676-57b9-4350-8ef4-99559467ca2c-kube-api-access-7r6b2" (OuterVolumeSpecName: "kube-api-access-7r6b2") pod "42fd8676-57b9-4350-8ef4-99559467ca2c" (UID: "42fd8676-57b9-4350-8ef4-99559467ca2c"). InnerVolumeSpecName "kube-api-access-7r6b2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.447838 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42fd8676-57b9-4350-8ef4-99559467ca2c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42fd8676-57b9-4350-8ef4-99559467ca2c" (UID: "42fd8676-57b9-4350-8ef4-99559467ca2c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.489353 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42fd8676-57b9-4350-8ef4-99559467ca2c-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.489396 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r6b2\" (UniqueName: \"kubernetes.io/projected/42fd8676-57b9-4350-8ef4-99559467ca2c-kube-api-access-7r6b2\") on node \"crc\" DevicePath \"\"" Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.489415 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42fd8676-57b9-4350-8ef4-99559467ca2c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.583602 4702 generic.go:334] "Generic (PLEG): container finished" podID="42fd8676-57b9-4350-8ef4-99559467ca2c" containerID="069fa4b787ef09744a0ffa0f9158eb4fc655023fac45160bc77e9915078d3207" exitCode=0 Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.583656 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mntnr" event={"ID":"42fd8676-57b9-4350-8ef4-99559467ca2c","Type":"ContainerDied","Data":"069fa4b787ef09744a0ffa0f9158eb4fc655023fac45160bc77e9915078d3207"} Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.583691 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mntnr" event={"ID":"42fd8676-57b9-4350-8ef4-99559467ca2c","Type":"ContainerDied","Data":"587218b1eafc815b3689c8fe5b3489d4d420f6d8576b25831145f64baa2baee3"} Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.583716 4702 scope.go:117] "RemoveContainer" containerID="069fa4b787ef09744a0ffa0f9158eb4fc655023fac45160bc77e9915078d3207" Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.583970 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mntnr" Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.633192 4702 scope.go:117] "RemoveContainer" containerID="abee002e01e31a66589ecd25155885a72be8e732fb70747445de60cca6aaee4d" Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.642850 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mntnr"] Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.656325 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mntnr"] Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.663011 4702 scope.go:117] "RemoveContainer" containerID="1742b5f8ea62331c4d4bf7ef6080835dedc8e7053235cf3e5d9f1af241b1e56d" Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.713399 4702 scope.go:117] "RemoveContainer" containerID="069fa4b787ef09744a0ffa0f9158eb4fc655023fac45160bc77e9915078d3207" Dec 03 11:41:02 crc kubenswrapper[4702]: E1203 11:41:02.713886 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"069fa4b787ef09744a0ffa0f9158eb4fc655023fac45160bc77e9915078d3207\": container with ID starting with 069fa4b787ef09744a0ffa0f9158eb4fc655023fac45160bc77e9915078d3207 not found: ID does not exist" containerID="069fa4b787ef09744a0ffa0f9158eb4fc655023fac45160bc77e9915078d3207" Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.713919 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"069fa4b787ef09744a0ffa0f9158eb4fc655023fac45160bc77e9915078d3207"} err="failed to get container status \"069fa4b787ef09744a0ffa0f9158eb4fc655023fac45160bc77e9915078d3207\": rpc error: code = NotFound desc = could not find container \"069fa4b787ef09744a0ffa0f9158eb4fc655023fac45160bc77e9915078d3207\": container with ID starting with 069fa4b787ef09744a0ffa0f9158eb4fc655023fac45160bc77e9915078d3207 not found: ID does not exist" Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.713941 4702 scope.go:117] "RemoveContainer" containerID="abee002e01e31a66589ecd25155885a72be8e732fb70747445de60cca6aaee4d" Dec 03 11:41:02 crc kubenswrapper[4702]: E1203 11:41:02.714373 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abee002e01e31a66589ecd25155885a72be8e732fb70747445de60cca6aaee4d\": container with ID starting with abee002e01e31a66589ecd25155885a72be8e732fb70747445de60cca6aaee4d not found: ID does not exist" containerID="abee002e01e31a66589ecd25155885a72be8e732fb70747445de60cca6aaee4d" Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.714400 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abee002e01e31a66589ecd25155885a72be8e732fb70747445de60cca6aaee4d"} err="failed to get container status \"abee002e01e31a66589ecd25155885a72be8e732fb70747445de60cca6aaee4d\": rpc error: code = NotFound desc = could not find container \"abee002e01e31a66589ecd25155885a72be8e732fb70747445de60cca6aaee4d\": container with ID starting with abee002e01e31a66589ecd25155885a72be8e732fb70747445de60cca6aaee4d not found: ID does not exist" Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.714415 4702 scope.go:117] "RemoveContainer" containerID="1742b5f8ea62331c4d4bf7ef6080835dedc8e7053235cf3e5d9f1af241b1e56d" Dec 03 11:41:02 crc kubenswrapper[4702]: E1203 11:41:02.714800 4702 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1742b5f8ea62331c4d4bf7ef6080835dedc8e7053235cf3e5d9f1af241b1e56d\": container with ID starting with 1742b5f8ea62331c4d4bf7ef6080835dedc8e7053235cf3e5d9f1af241b1e56d not found: ID does not exist" containerID="1742b5f8ea62331c4d4bf7ef6080835dedc8e7053235cf3e5d9f1af241b1e56d" Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.714826 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1742b5f8ea62331c4d4bf7ef6080835dedc8e7053235cf3e5d9f1af241b1e56d"} err="failed to get container status \"1742b5f8ea62331c4d4bf7ef6080835dedc8e7053235cf3e5d9f1af241b1e56d\": rpc error: code = NotFound desc = could not find container \"1742b5f8ea62331c4d4bf7ef6080835dedc8e7053235cf3e5d9f1af241b1e56d\": container with ID starting with 1742b5f8ea62331c4d4bf7ef6080835dedc8e7053235cf3e5d9f1af241b1e56d not found: ID does not exist" Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.945689 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42fd8676-57b9-4350-8ef4-99559467ca2c" path="/var/lib/kubelet/pods/42fd8676-57b9-4350-8ef4-99559467ca2c/volumes" Dec 03 11:41:02 crc kubenswrapper[4702]: I1203 11:41:02.982850 4702 scope.go:117] "RemoveContainer" containerID="f4c7366329e9fbc8693b7c9cda24f6a5889f0b5aaba9acb1380cd579109f0f7f" Dec 03 11:41:03 crc kubenswrapper[4702]: I1203 11:41:03.011193 4702 scope.go:117] "RemoveContainer" containerID="39d9a6c7bef6a87823d2e98dc4833752629526b9bfd845df0aa453d7de14260f" Dec 03 11:41:03 crc kubenswrapper[4702]: I1203 11:41:03.134113 4702 scope.go:117] "RemoveContainer" containerID="5509dd1193f09ef67b047350644f34200351689344ac2325ad460db94c52caf3" Dec 03 11:41:03 crc kubenswrapper[4702]: I1203 11:41:03.239220 4702 scope.go:117] "RemoveContainer" containerID="c68dde3062d8980ea81d1539226a9d5aa4e10038b93c8095bc55487691b6bf57" Dec 03 11:41:44 crc kubenswrapper[4702]: I1203 11:41:44.058270 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-ht9tr"] Dec 03 11:41:44 crc kubenswrapper[4702]: I1203 11:41:44.071935 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-ht9tr"] Dec 03 11:41:44 crc kubenswrapper[4702]: I1203 11:41:44.946403 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45a1ae25-65af-406f-8c00-4f7b228b2876" path="/var/lib/kubelet/pods/45a1ae25-65af-406f-8c00-4f7b228b2876/volumes" Dec 03 11:41:45 crc kubenswrapper[4702]: I1203 11:41:45.043924 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-jt55p"] Dec 03 11:41:45 crc kubenswrapper[4702]: I1203 11:41:45.054873 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-gbrfb"] Dec 03 11:41:45 crc kubenswrapper[4702]: I1203 11:41:45.068532 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-gbrfb"] Dec 03 11:41:45 crc kubenswrapper[4702]: I1203 11:41:45.078737 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-jt55p"] Dec 03 11:41:46 crc kubenswrapper[4702]: I1203 11:41:46.042680 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-579b-account-create-update-28tr7"] Dec 03 11:41:46 crc kubenswrapper[4702]: I1203 11:41:46.054010 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-579b-account-create-update-28tr7"] Dec 03 11:41:46 crc kubenswrapper[4702]: I1203 11:41:46.949255 4702 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="610ca85f-6b9d-4275-852e-aa9424ce4066" path="/var/lib/kubelet/pods/610ca85f-6b9d-4275-852e-aa9424ce4066/volumes" Dec 03 11:41:46 crc kubenswrapper[4702]: I1203 11:41:46.951142 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f" path="/var/lib/kubelet/pods/bb65a3fb-d9e8-4c50-98c1-7c33a0442c1f/volumes" Dec 03 11:41:46 crc kubenswrapper[4702]: I1203 11:41:46.955518 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db35177e-78d7-4761-a3c9-cd6bfafce10b" path="/var/lib/kubelet/pods/db35177e-78d7-4761-a3c9-cd6bfafce10b/volumes" Dec 03 11:41:47 crc kubenswrapper[4702]: I1203 11:41:47.047947 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-0687-account-create-update-zw8qt"] Dec 03 11:41:47 crc kubenswrapper[4702]: I1203 11:41:47.063478 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-bf50-account-create-update-8zqck"] Dec 03 11:41:47 crc kubenswrapper[4702]: I1203 11:41:47.073233 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-bf50-account-create-update-8zqck"] Dec 03 11:41:47 crc kubenswrapper[4702]: I1203 11:41:47.082864 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-0687-account-create-update-zw8qt"] Dec 03 11:41:48 crc kubenswrapper[4702]: I1203 11:41:48.945720 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2616bfea-114e-4447-bed8-719c64679287" path="/var/lib/kubelet/pods/2616bfea-114e-4447-bed8-719c64679287/volumes" Dec 03 11:41:48 crc kubenswrapper[4702]: I1203 11:41:48.948025 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46c8d995-9bc9-464f-b388-aebbef2024fd" path="/var/lib/kubelet/pods/46c8d995-9bc9-464f-b388-aebbef2024fd/volumes" Dec 03 11:42:03 crc kubenswrapper[4702]: I1203 11:42:03.489090 4702 scope.go:117] "RemoveContainer" containerID="d9f0317e197fd7b4d06bb06090e5cf52a72b0704df67fc3269816ecef8e01ab7" Dec 03 11:42:03 crc kubenswrapper[4702]: I1203 11:42:03.549107 4702 scope.go:117] "RemoveContainer" containerID="2707f5a014f744787676f29c42e6d836b6e70ed12c7407fa7077df56de18d061" Dec 03 11:42:03 crc kubenswrapper[4702]: I1203 11:42:03.599138 4702 scope.go:117] "RemoveContainer" containerID="27ba3fa90071b8ed7866541539dfabec595a888342c39ed65b0729593413d739" Dec 03 11:42:03 crc kubenswrapper[4702]: I1203 11:42:03.650591 4702 scope.go:117] "RemoveContainer" containerID="11330477ddf2a7c05ecc4e4365e5881a85cf6fc3b86296d279f712222fd902cf" Dec 03 11:42:03 crc kubenswrapper[4702]: I1203 11:42:03.706565 4702 scope.go:117] "RemoveContainer" containerID="107ab4c5ef257ec43e883f5e4db934ae5b0a7825608942734485f754d7595c3c" Dec 03 11:42:03 crc kubenswrapper[4702]: I1203 11:42:03.780804 4702 scope.go:117] "RemoveContainer" containerID="125ad852b7afb7358079f8c98a362f30cf5ddc4e950f8cf0220fa283814cbbb2" Dec 03 11:42:40 crc kubenswrapper[4702]: I1203 11:42:40.063805 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-5b4f-account-create-update-x275r"] Dec 03 11:42:40 crc kubenswrapper[4702]: I1203 11:42:40.077406 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-5b4f-account-create-update-x275r"] Dec 03 11:42:40 crc kubenswrapper[4702]: I1203 11:42:40.944302 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18020584-2205-4e9e-a713-79af70a8a84b" 
path="/var/lib/kubelet/pods/18020584-2205-4e9e-a713-79af70a8a84b/volumes" Dec 03 11:42:42 crc kubenswrapper[4702]: I1203 11:42:42.030304 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-rl959"] Dec 03 11:42:42 crc kubenswrapper[4702]: I1203 11:42:42.046896 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-rl959"] Dec 03 11:42:43 crc kubenswrapper[4702]: I1203 11:42:43.027314 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20ad57fb-2b09-47ee-9352-989843fd2b29" path="/var/lib/kubelet/pods/20ad57fb-2b09-47ee-9352-989843fd2b29/volumes" Dec 03 11:42:46 crc kubenswrapper[4702]: I1203 11:42:46.040495 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pn6bt"] Dec 03 11:42:46 crc kubenswrapper[4702]: I1203 11:42:46.056163 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-pn6bt"] Dec 03 11:42:46 crc kubenswrapper[4702]: I1203 11:42:46.956581 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b22aedf-6076-4262-9607-2b26e09f77a0" path="/var/lib/kubelet/pods/0b22aedf-6076-4262-9607-2b26e09f77a0/volumes" Dec 03 11:43:00 crc kubenswrapper[4702]: I1203 11:43:00.374716 4702 generic.go:334] "Generic (PLEG): container finished" podID="e6a36b67-a7da-4684-8b3a-57735f2e4c8d" containerID="c1d719d2c2af4e33cfbf2df5db8f3d7f6ef9af024124d40d82b94a2af9f5ba12" exitCode=0 Dec 03 11:43:00 crc kubenswrapper[4702]: I1203 11:43:00.375252 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr" event={"ID":"e6a36b67-a7da-4684-8b3a-57735f2e4c8d","Type":"ContainerDied","Data":"c1d719d2c2af4e33cfbf2df5db8f3d7f6ef9af024124d40d82b94a2af9f5ba12"} Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.048039 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.206637 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6a36b67-a7da-4684-8b3a-57735f2e4c8d-inventory\") pod \"e6a36b67-a7da-4684-8b3a-57735f2e4c8d\" (UID: \"e6a36b67-a7da-4684-8b3a-57735f2e4c8d\") " Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.206997 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6a36b67-a7da-4684-8b3a-57735f2e4c8d-ssh-key\") pod \"e6a36b67-a7da-4684-8b3a-57735f2e4c8d\" (UID: \"e6a36b67-a7da-4684-8b3a-57735f2e4c8d\") " Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.207178 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rbl7\" (UniqueName: \"kubernetes.io/projected/e6a36b67-a7da-4684-8b3a-57735f2e4c8d-kube-api-access-8rbl7\") pod \"e6a36b67-a7da-4684-8b3a-57735f2e4c8d\" (UID: \"e6a36b67-a7da-4684-8b3a-57735f2e4c8d\") " Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.214134 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6a36b67-a7da-4684-8b3a-57735f2e4c8d-kube-api-access-8rbl7" (OuterVolumeSpecName: "kube-api-access-8rbl7") pod "e6a36b67-a7da-4684-8b3a-57735f2e4c8d" (UID: "e6a36b67-a7da-4684-8b3a-57735f2e4c8d"). InnerVolumeSpecName "kube-api-access-8rbl7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.243483 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6a36b67-a7da-4684-8b3a-57735f2e4c8d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e6a36b67-a7da-4684-8b3a-57735f2e4c8d" (UID: "e6a36b67-a7da-4684-8b3a-57735f2e4c8d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.260274 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6a36b67-a7da-4684-8b3a-57735f2e4c8d-inventory" (OuterVolumeSpecName: "inventory") pod "e6a36b67-a7da-4684-8b3a-57735f2e4c8d" (UID: "e6a36b67-a7da-4684-8b3a-57735f2e4c8d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.310519 4702 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6a36b67-a7da-4684-8b3a-57735f2e4c8d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.310565 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rbl7\" (UniqueName: \"kubernetes.io/projected/e6a36b67-a7da-4684-8b3a-57735f2e4c8d-kube-api-access-8rbl7\") on node \"crc\" DevicePath \"\"" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.310592 4702 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6a36b67-a7da-4684-8b3a-57735f2e4c8d-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.398923 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr" event={"ID":"e6a36b67-a7da-4684-8b3a-57735f2e4c8d","Type":"ContainerDied","Data":"09e77122697bac538a52399b41612e938ce474626f06a32f325028b07582c119"} Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.398993 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09e77122697bac538a52399b41612e938ce474626f06a32f325028b07582c119" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.398988 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.508185 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl"] Dec 03 11:43:02 crc kubenswrapper[4702]: E1203 11:43:02.509079 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71382f3d-76c9-42f2-b00b-900954de4257" containerName="registry-server" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.509124 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="71382f3d-76c9-42f2-b00b-900954de4257" containerName="registry-server" Dec 03 11:43:02 crc kubenswrapper[4702]: E1203 11:43:02.509144 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42fd8676-57b9-4350-8ef4-99559467ca2c" containerName="registry-server" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.509153 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="42fd8676-57b9-4350-8ef4-99559467ca2c" containerName="registry-server" Dec 03 11:43:02 crc kubenswrapper[4702]: E1203 11:43:02.509179 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71382f3d-76c9-42f2-b00b-900954de4257" containerName="extract-utilities" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.509187 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="71382f3d-76c9-42f2-b00b-900954de4257" containerName="extract-utilities" Dec 03 11:43:02 crc kubenswrapper[4702]: E1203 11:43:02.509204 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6a36b67-a7da-4684-8b3a-57735f2e4c8d" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.509215 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6a36b67-a7da-4684-8b3a-57735f2e4c8d" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 03 11:43:02 crc kubenswrapper[4702]: E1203 11:43:02.509227 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71382f3d-76c9-42f2-b00b-900954de4257" containerName="extract-content" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.509235 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="71382f3d-76c9-42f2-b00b-900954de4257" containerName="extract-content" Dec 03 11:43:02 crc kubenswrapper[4702]: E1203 11:43:02.509246 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42fd8676-57b9-4350-8ef4-99559467ca2c" containerName="extract-content" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.509254 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="42fd8676-57b9-4350-8ef4-99559467ca2c" containerName="extract-content" Dec 03 11:43:02 crc kubenswrapper[4702]: E1203 11:43:02.509265 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42fd8676-57b9-4350-8ef4-99559467ca2c" containerName="extract-utilities" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.509273 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="42fd8676-57b9-4350-8ef4-99559467ca2c" containerName="extract-utilities" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.509604 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="42fd8676-57b9-4350-8ef4-99559467ca2c" containerName="registry-server" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.509649 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="71382f3d-76c9-42f2-b00b-900954de4257" containerName="registry-server" 
Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.509673 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6a36b67-a7da-4684-8b3a-57735f2e4c8d" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.510955 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.515390 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zmjkp" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.515920 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.516130 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.516305 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.529228 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl"] Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.623110 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7656\" (UniqueName: \"kubernetes.io/projected/b236cac1-567b-4e23-9823-861d30c1793d-kube-api-access-d7656\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl\" (UID: \"b236cac1-567b-4e23-9823-861d30c1793d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.623448 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b236cac1-567b-4e23-9823-861d30c1793d-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl\" (UID: \"b236cac1-567b-4e23-9823-861d30c1793d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.623494 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b236cac1-567b-4e23-9823-861d30c1793d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl\" (UID: \"b236cac1-567b-4e23-9823-861d30c1793d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.729496 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7656\" (UniqueName: \"kubernetes.io/projected/b236cac1-567b-4e23-9823-861d30c1793d-kube-api-access-d7656\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl\" (UID: \"b236cac1-567b-4e23-9823-861d30c1793d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.730526 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b236cac1-567b-4e23-9823-861d30c1793d-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl\" (UID: 
\"b236cac1-567b-4e23-9823-861d30c1793d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.730616 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b236cac1-567b-4e23-9823-861d30c1793d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl\" (UID: \"b236cac1-567b-4e23-9823-861d30c1793d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.736696 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b236cac1-567b-4e23-9823-861d30c1793d-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl\" (UID: \"b236cac1-567b-4e23-9823-861d30c1793d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.737620 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b236cac1-567b-4e23-9823-861d30c1793d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl\" (UID: \"b236cac1-567b-4e23-9823-861d30c1793d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.755039 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7656\" (UniqueName: \"kubernetes.io/projected/b236cac1-567b-4e23-9823-861d30c1793d-kube-api-access-d7656\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl\" (UID: \"b236cac1-567b-4e23-9823-861d30c1793d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl" Dec 03 11:43:02 crc kubenswrapper[4702]: I1203 11:43:02.849322 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl" Dec 03 11:43:03 crc kubenswrapper[4702]: I1203 11:43:03.659408 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl"] Dec 03 11:43:03 crc kubenswrapper[4702]: I1203 11:43:03.681406 4702 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 11:43:03 crc kubenswrapper[4702]: I1203 11:43:03.976566 4702 scope.go:117] "RemoveContainer" containerID="3b2d6d6c143376a96beaf16742b9ed198193bb858bb3383c6e0613a78072e8e9" Dec 03 11:43:04 crc kubenswrapper[4702]: I1203 11:43:04.118594 4702 scope.go:117] "RemoveContainer" containerID="1d06b30e4b03a4b4198c18c1b3af601dd6906046baed9495a5e36caeb4640ef4" Dec 03 11:43:04 crc kubenswrapper[4702]: I1203 11:43:04.222932 4702 scope.go:117] "RemoveContainer" containerID="79e75be8f30d2d00ac399bc3dde03ba15783529297fb25d86097e04ecd0f91d4" Dec 03 11:43:04 crc kubenswrapper[4702]: I1203 11:43:04.440151 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl" event={"ID":"b236cac1-567b-4e23-9823-861d30c1793d","Type":"ContainerStarted","Data":"f2e2dd41051ec986946f54d8c32ad73adb788c9bdf37508908a616570df4d9a7"} Dec 03 11:43:05 crc kubenswrapper[4702]: I1203 11:43:05.458820 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl" event={"ID":"b236cac1-567b-4e23-9823-861d30c1793d","Type":"ContainerStarted","Data":"13f3e2937127c5ef8a909f73a6e752cbc31fc4265c8f6f661d4181d4b9be6da7"} Dec 03 11:43:05 crc kubenswrapper[4702]: I1203 11:43:05.492458 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl" podStartSLOduration=2.8782862099999997 podStartE2EDuration="3.492429673s" podCreationTimestamp="2025-12-03 11:43:02 +0000 UTC" firstStartedPulling="2025-12-03 11:43:03.680916165 +0000 UTC m=+2367.516844629" lastFinishedPulling="2025-12-03 11:43:04.295059628 +0000 UTC m=+2368.130988092" observedRunningTime="2025-12-03 11:43:05.47895298 +0000 UTC m=+2369.314881444" watchObservedRunningTime="2025-12-03 11:43:05.492429673 +0000 UTC m=+2369.328358137" Dec 03 11:43:16 crc kubenswrapper[4702]: I1203 11:43:16.046691 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-dq5nh"] Dec 03 11:43:16 crc kubenswrapper[4702]: I1203 11:43:16.059459 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-dq5nh"] Dec 03 11:43:16 crc kubenswrapper[4702]: I1203 11:43:16.943232 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f" path="/var/lib/kubelet/pods/f58b2d1b-a9b6-4b50-bdf2-98da41b4d26f/volumes" Dec 03 11:43:18 crc kubenswrapper[4702]: I1203 11:43:18.083261 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jzcdc"] Dec 03 11:43:18 crc kubenswrapper[4702]: I1203 11:43:18.108368 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-jzcdc"] Dec 03 11:43:18 crc kubenswrapper[4702]: I1203 11:43:18.943349 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0099aaeb-f11f-492e-8962-e06a72b6878d" path="/var/lib/kubelet/pods/0099aaeb-f11f-492e-8962-e06a72b6878d/volumes" Dec 03 11:43:25 crc 
kubenswrapper[4702]: I1203 11:43:25.907861 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:43:25 crc kubenswrapper[4702]: I1203 11:43:25.908413 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:43:55 crc kubenswrapper[4702]: I1203 11:43:55.907783 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:43:55 crc kubenswrapper[4702]: I1203 11:43:55.909665 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:44:04 crc kubenswrapper[4702]: I1203 11:44:04.353742 4702 scope.go:117] "RemoveContainer" containerID="ab56ea1883615d9d6d926896a3e7135ece8877e2eb076527bbf64795372caad9" Dec 03 11:44:04 crc kubenswrapper[4702]: I1203 11:44:04.402246 4702 scope.go:117] "RemoveContainer" containerID="382162c87abd8e28697a3509b316313c53439cda66e41bbf8b631b0f466b2893" Dec 03 11:44:11 crc kubenswrapper[4702]: I1203 11:44:11.059204 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-24b4q"] Dec 03 11:44:11 crc kubenswrapper[4702]: I1203 11:44:11.072290 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-24b4q"] Dec 03 11:44:12 crc kubenswrapper[4702]: I1203 11:44:12.950046 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36eb0771-1605-478d-aad3-44f0e6f0932b" path="/var/lib/kubelet/pods/36eb0771-1605-478d-aad3-44f0e6f0932b/volumes" Dec 03 11:44:21 crc kubenswrapper[4702]: I1203 11:44:21.433748 4702 generic.go:334] "Generic (PLEG): container finished" podID="b236cac1-567b-4e23-9823-861d30c1793d" containerID="13f3e2937127c5ef8a909f73a6e752cbc31fc4265c8f6f661d4181d4b9be6da7" exitCode=0 Dec 03 11:44:21 crc kubenswrapper[4702]: I1203 11:44:21.433795 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl" event={"ID":"b236cac1-567b-4e23-9823-861d30c1793d","Type":"ContainerDied","Data":"13f3e2937127c5ef8a909f73a6e752cbc31fc4265c8f6f661d4181d4b9be6da7"} Dec 03 11:44:22 crc kubenswrapper[4702]: I1203 11:44:22.983021 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl" Dec 03 11:44:22 crc kubenswrapper[4702]: I1203 11:44:22.993251 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7656\" (UniqueName: \"kubernetes.io/projected/b236cac1-567b-4e23-9823-861d30c1793d-kube-api-access-d7656\") pod \"b236cac1-567b-4e23-9823-861d30c1793d\" (UID: \"b236cac1-567b-4e23-9823-861d30c1793d\") " Dec 03 11:44:22 crc kubenswrapper[4702]: I1203 11:44:22.993324 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b236cac1-567b-4e23-9823-861d30c1793d-ssh-key\") pod \"b236cac1-567b-4e23-9823-861d30c1793d\" (UID: \"b236cac1-567b-4e23-9823-861d30c1793d\") " Dec 03 11:44:22 crc kubenswrapper[4702]: I1203 11:44:22.993385 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b236cac1-567b-4e23-9823-861d30c1793d-inventory\") pod \"b236cac1-567b-4e23-9823-861d30c1793d\" (UID: \"b236cac1-567b-4e23-9823-861d30c1793d\") " Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.001439 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b236cac1-567b-4e23-9823-861d30c1793d-kube-api-access-d7656" (OuterVolumeSpecName: "kube-api-access-d7656") pod "b236cac1-567b-4e23-9823-861d30c1793d" (UID: "b236cac1-567b-4e23-9823-861d30c1793d"). InnerVolumeSpecName "kube-api-access-d7656". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.042028 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b236cac1-567b-4e23-9823-861d30c1793d-inventory" (OuterVolumeSpecName: "inventory") pod "b236cac1-567b-4e23-9823-861d30c1793d" (UID: "b236cac1-567b-4e23-9823-861d30c1793d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.051400 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b236cac1-567b-4e23-9823-861d30c1793d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b236cac1-567b-4e23-9823-861d30c1793d" (UID: "b236cac1-567b-4e23-9823-861d30c1793d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.096430 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7656\" (UniqueName: \"kubernetes.io/projected/b236cac1-567b-4e23-9823-861d30c1793d-kube-api-access-d7656\") on node \"crc\" DevicePath \"\"" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.096474 4702 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b236cac1-567b-4e23-9823-861d30c1793d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.096485 4702 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b236cac1-567b-4e23-9823-861d30c1793d-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.469912 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl" event={"ID":"b236cac1-567b-4e23-9823-861d30c1793d","Type":"ContainerDied","Data":"f2e2dd41051ec986946f54d8c32ad73adb788c9bdf37508908a616570df4d9a7"} Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.470276 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2e2dd41051ec986946f54d8c32ad73adb788c9bdf37508908a616570df4d9a7" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.469958 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.630790 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs"] Dec 03 11:44:23 crc kubenswrapper[4702]: E1203 11:44:23.631832 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b236cac1-567b-4e23-9823-861d30c1793d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.631987 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="b236cac1-567b-4e23-9823-861d30c1793d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.632473 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="b236cac1-567b-4e23-9823-861d30c1793d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.633735 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.636891 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.642547 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zmjkp" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.642551 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.643023 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.646393 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs"] Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.813590 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1cb3e758-d316-42dd-97a7-c2fe38a57158-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs\" (UID: \"1cb3e758-d316-42dd-97a7-c2fe38a57158\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.823962 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1cb3e758-d316-42dd-97a7-c2fe38a57158-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs\" (UID: \"1cb3e758-d316-42dd-97a7-c2fe38a57158\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.824030 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kns85\" (UniqueName: \"kubernetes.io/projected/1cb3e758-d316-42dd-97a7-c2fe38a57158-kube-api-access-kns85\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs\" (UID: \"1cb3e758-d316-42dd-97a7-c2fe38a57158\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.926880 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1cb3e758-d316-42dd-97a7-c2fe38a57158-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs\" (UID: \"1cb3e758-d316-42dd-97a7-c2fe38a57158\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.926950 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kns85\" (UniqueName: \"kubernetes.io/projected/1cb3e758-d316-42dd-97a7-c2fe38a57158-kube-api-access-kns85\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs\" (UID: \"1cb3e758-d316-42dd-97a7-c2fe38a57158\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.927096 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1cb3e758-d316-42dd-97a7-c2fe38a57158-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs\" (UID: \"1cb3e758-d316-42dd-97a7-c2fe38a57158\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.933089 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1cb3e758-d316-42dd-97a7-c2fe38a57158-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs\" (UID: \"1cb3e758-d316-42dd-97a7-c2fe38a57158\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.934899 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1cb3e758-d316-42dd-97a7-c2fe38a57158-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs\" (UID: \"1cb3e758-d316-42dd-97a7-c2fe38a57158\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.951392 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kns85\" (UniqueName: \"kubernetes.io/projected/1cb3e758-d316-42dd-97a7-c2fe38a57158-kube-api-access-kns85\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs\" (UID: \"1cb3e758-d316-42dd-97a7-c2fe38a57158\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs" Dec 03 11:44:23 crc kubenswrapper[4702]: I1203 11:44:23.955971 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs" Dec 03 11:44:24 crc kubenswrapper[4702]: I1203 11:44:24.396839 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs"] Dec 03 11:44:24 crc kubenswrapper[4702]: W1203 11:44:24.411226 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cb3e758_d316_42dd_97a7_c2fe38a57158.slice/crio-3eb1e4c5457149dc8665b3f4767682e51497a971c064df6eb68efb8301c195ab WatchSource:0}: Error finding container 3eb1e4c5457149dc8665b3f4767682e51497a971c064df6eb68efb8301c195ab: Status 404 returned error can't find the container with id 3eb1e4c5457149dc8665b3f4767682e51497a971c064df6eb68efb8301c195ab Dec 03 11:44:24 crc kubenswrapper[4702]: I1203 11:44:24.483487 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs" event={"ID":"1cb3e758-d316-42dd-97a7-c2fe38a57158","Type":"ContainerStarted","Data":"3eb1e4c5457149dc8665b3f4767682e51497a971c064df6eb68efb8301c195ab"} Dec 03 11:44:25 crc kubenswrapper[4702]: I1203 11:44:25.495456 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs" event={"ID":"1cb3e758-d316-42dd-97a7-c2fe38a57158","Type":"ContainerStarted","Data":"709ca9aca04acb380f1f3beabe5be144713141837144083b0f34bc98374c9145"} Dec 03 11:44:25 crc kubenswrapper[4702]: I1203 11:44:25.526467 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs" podStartSLOduration=1.84328698 podStartE2EDuration="2.526431166s" podCreationTimestamp="2025-12-03 11:44:23 +0000 UTC" firstStartedPulling="2025-12-03 11:44:24.41480818 +0000 UTC m=+2448.250736644" 
lastFinishedPulling="2025-12-03 11:44:25.097952366 +0000 UTC m=+2448.933880830" observedRunningTime="2025-12-03 11:44:25.508557768 +0000 UTC m=+2449.344486232" watchObservedRunningTime="2025-12-03 11:44:25.526431166 +0000 UTC m=+2449.362359630" Dec 03 11:44:25 crc kubenswrapper[4702]: I1203 11:44:25.908324 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:44:25 crc kubenswrapper[4702]: I1203 11:44:25.908394 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:44:25 crc kubenswrapper[4702]: I1203 11:44:25.908455 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:44:25 crc kubenswrapper[4702]: I1203 11:44:25.909775 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21"} pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 11:44:25 crc kubenswrapper[4702]: I1203 11:44:25.909905 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" containerID="cri-o://f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21" gracePeriod=600 Dec 03 11:44:26 crc kubenswrapper[4702]: I1203 11:44:26.509504 4702 generic.go:334] "Generic (PLEG): container finished" podID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21" exitCode=0 Dec 03 11:44:26 crc kubenswrapper[4702]: I1203 11:44:26.509608 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerDied","Data":"f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21"} Dec 03 11:44:26 crc kubenswrapper[4702]: I1203 11:44:26.509928 4702 scope.go:117] "RemoveContainer" containerID="39905d80302a00685e55c3f28d423b218acb74c324103c6641753cc0d7704b72" Dec 03 11:44:26 crc kubenswrapper[4702]: E1203 11:44:26.559746 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:44:27 crc kubenswrapper[4702]: I1203 11:44:27.522703 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21" Dec 03 11:44:27 crc kubenswrapper[4702]: E1203 11:44:27.523095 4702 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:44:30 crc kubenswrapper[4702]: I1203 11:44:30.557180 4702 generic.go:334] "Generic (PLEG): container finished" podID="1cb3e758-d316-42dd-97a7-c2fe38a57158" containerID="709ca9aca04acb380f1f3beabe5be144713141837144083b0f34bc98374c9145" exitCode=0 Dec 03 11:44:30 crc kubenswrapper[4702]: I1203 11:44:30.557288 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs" event={"ID":"1cb3e758-d316-42dd-97a7-c2fe38a57158","Type":"ContainerDied","Data":"709ca9aca04acb380f1f3beabe5be144713141837144083b0f34bc98374c9145"} Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.214929 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.344438 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1cb3e758-d316-42dd-97a7-c2fe38a57158-inventory\") pod \"1cb3e758-d316-42dd-97a7-c2fe38a57158\" (UID: \"1cb3e758-d316-42dd-97a7-c2fe38a57158\") " Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.344742 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1cb3e758-d316-42dd-97a7-c2fe38a57158-ssh-key\") pod \"1cb3e758-d316-42dd-97a7-c2fe38a57158\" (UID: \"1cb3e758-d316-42dd-97a7-c2fe38a57158\") " Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.344900 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kns85\" (UniqueName: \"kubernetes.io/projected/1cb3e758-d316-42dd-97a7-c2fe38a57158-kube-api-access-kns85\") pod \"1cb3e758-d316-42dd-97a7-c2fe38a57158\" (UID: \"1cb3e758-d316-42dd-97a7-c2fe38a57158\") " Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.361592 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cb3e758-d316-42dd-97a7-c2fe38a57158-kube-api-access-kns85" (OuterVolumeSpecName: "kube-api-access-kns85") pod "1cb3e758-d316-42dd-97a7-c2fe38a57158" (UID: "1cb3e758-d316-42dd-97a7-c2fe38a57158"). InnerVolumeSpecName "kube-api-access-kns85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.419909 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cb3e758-d316-42dd-97a7-c2fe38a57158-inventory" (OuterVolumeSpecName: "inventory") pod "1cb3e758-d316-42dd-97a7-c2fe38a57158" (UID: "1cb3e758-d316-42dd-97a7-c2fe38a57158"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.439878 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cb3e758-d316-42dd-97a7-c2fe38a57158-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1cb3e758-d316-42dd-97a7-c2fe38a57158" (UID: "1cb3e758-d316-42dd-97a7-c2fe38a57158"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.452828 4702 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1cb3e758-d316-42dd-97a7-c2fe38a57158-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.452858 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kns85\" (UniqueName: \"kubernetes.io/projected/1cb3e758-d316-42dd-97a7-c2fe38a57158-kube-api-access-kns85\") on node \"crc\" DevicePath \"\"" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.452869 4702 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1cb3e758-d316-42dd-97a7-c2fe38a57158-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.579827 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs" event={"ID":"1cb3e758-d316-42dd-97a7-c2fe38a57158","Type":"ContainerDied","Data":"3eb1e4c5457149dc8665b3f4767682e51497a971c064df6eb68efb8301c195ab"} Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.579876 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3eb1e4c5457149dc8665b3f4767682e51497a971c064df6eb68efb8301c195ab" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.579884 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.673917 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-fchg5"] Dec 03 11:44:32 crc kubenswrapper[4702]: E1203 11:44:32.675030 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cb3e758-d316-42dd-97a7-c2fe38a57158" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.675138 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cb3e758-d316-42dd-97a7-c2fe38a57158" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.675605 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cb3e758-d316-42dd-97a7-c2fe38a57158" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.676928 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fchg5" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.679172 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.680081 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.680257 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zmjkp" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.685467 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.687168 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-fchg5"] Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.861014 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6c2b0167-387a-48e4-9931-1869990ede5e-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fchg5\" (UID: \"6c2b0167-387a-48e4-9931-1869990ede5e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fchg5" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.861135 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c2b0167-387a-48e4-9931-1869990ede5e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fchg5\" (UID: \"6c2b0167-387a-48e4-9931-1869990ede5e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fchg5" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.861255 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w96xs\" (UniqueName: \"kubernetes.io/projected/6c2b0167-387a-48e4-9931-1869990ede5e-kube-api-access-w96xs\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fchg5\" (UID: \"6c2b0167-387a-48e4-9931-1869990ede5e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fchg5" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.963642 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c2b0167-387a-48e4-9931-1869990ede5e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fchg5\" (UID: \"6c2b0167-387a-48e4-9931-1869990ede5e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fchg5" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.963820 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w96xs\" (UniqueName: \"kubernetes.io/projected/6c2b0167-387a-48e4-9931-1869990ede5e-kube-api-access-w96xs\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fchg5\" (UID: \"6c2b0167-387a-48e4-9931-1869990ede5e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fchg5" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.963901 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6c2b0167-387a-48e4-9931-1869990ede5e-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fchg5\" (UID: 
\"6c2b0167-387a-48e4-9931-1869990ede5e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fchg5" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.968799 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c2b0167-387a-48e4-9931-1869990ede5e-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fchg5\" (UID: \"6c2b0167-387a-48e4-9931-1869990ede5e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fchg5" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.969540 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6c2b0167-387a-48e4-9931-1869990ede5e-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fchg5\" (UID: \"6c2b0167-387a-48e4-9931-1869990ede5e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fchg5" Dec 03 11:44:32 crc kubenswrapper[4702]: I1203 11:44:32.990899 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w96xs\" (UniqueName: \"kubernetes.io/projected/6c2b0167-387a-48e4-9931-1869990ede5e-kube-api-access-w96xs\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fchg5\" (UID: \"6c2b0167-387a-48e4-9931-1869990ede5e\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fchg5" Dec 03 11:44:33 crc kubenswrapper[4702]: I1203 11:44:33.010725 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fchg5" Dec 03 11:44:33 crc kubenswrapper[4702]: I1203 11:44:33.552168 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-fchg5"] Dec 03 11:44:33 crc kubenswrapper[4702]: I1203 11:44:33.592727 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fchg5" event={"ID":"6c2b0167-387a-48e4-9931-1869990ede5e","Type":"ContainerStarted","Data":"86d0979388411c806b4378eeb7a3cfdd7e81e35233a103ae8df7ad9d56f92076"} Dec 03 11:44:35 crc kubenswrapper[4702]: I1203 11:44:35.618364 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fchg5" event={"ID":"6c2b0167-387a-48e4-9931-1869990ede5e","Type":"ContainerStarted","Data":"3395348823ba68dbefca89972035a0021b003133c62c4f88029bca0bcb2d39e4"} Dec 03 11:44:35 crc kubenswrapper[4702]: I1203 11:44:35.636068 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fchg5" podStartSLOduration=3.042306752 podStartE2EDuration="3.636047514s" podCreationTimestamp="2025-12-03 11:44:32 +0000 UTC" firstStartedPulling="2025-12-03 11:44:33.561422101 +0000 UTC m=+2457.397350565" lastFinishedPulling="2025-12-03 11:44:34.155162863 +0000 UTC m=+2457.991091327" observedRunningTime="2025-12-03 11:44:35.6355793 +0000 UTC m=+2459.471507764" watchObservedRunningTime="2025-12-03 11:44:35.636047514 +0000 UTC m=+2459.471975978" Dec 03 11:44:38 crc kubenswrapper[4702]: I1203 11:44:38.928539 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21" Dec 03 11:44:38 crc kubenswrapper[4702]: E1203 11:44:38.929392 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:44:53 crc kubenswrapper[4702]: I1203 11:44:53.928716 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21" Dec 03 11:44:53 crc kubenswrapper[4702]: E1203 11:44:53.929737 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:45:00 crc kubenswrapper[4702]: I1203 11:45:00.222807 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412705-zsdl5"] Dec 03 11:45:00 crc kubenswrapper[4702]: I1203 11:45:00.225375 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412705-zsdl5" Dec 03 11:45:00 crc kubenswrapper[4702]: I1203 11:45:00.227671 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 11:45:00 crc kubenswrapper[4702]: I1203 11:45:00.232007 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 11:45:00 crc kubenswrapper[4702]: I1203 11:45:00.248183 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412705-zsdl5"] Dec 03 11:45:00 crc kubenswrapper[4702]: I1203 11:45:00.380454 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10dc69dd-f84f-456f-b7af-6289897508f9-config-volume\") pod \"collect-profiles-29412705-zsdl5\" (UID: \"10dc69dd-f84f-456f-b7af-6289897508f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412705-zsdl5" Dec 03 11:45:00 crc kubenswrapper[4702]: I1203 11:45:00.380533 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10dc69dd-f84f-456f-b7af-6289897508f9-secret-volume\") pod \"collect-profiles-29412705-zsdl5\" (UID: \"10dc69dd-f84f-456f-b7af-6289897508f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412705-zsdl5" Dec 03 11:45:00 crc kubenswrapper[4702]: I1203 11:45:00.381840 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp8qn\" (UniqueName: \"kubernetes.io/projected/10dc69dd-f84f-456f-b7af-6289897508f9-kube-api-access-tp8qn\") pod \"collect-profiles-29412705-zsdl5\" (UID: \"10dc69dd-f84f-456f-b7af-6289897508f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412705-zsdl5" Dec 03 11:45:00 crc kubenswrapper[4702]: I1203 11:45:00.485697 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10dc69dd-f84f-456f-b7af-6289897508f9-config-volume\") pod \"collect-profiles-29412705-zsdl5\" 
(UID: \"10dc69dd-f84f-456f-b7af-6289897508f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412705-zsdl5" Dec 03 11:45:00 crc kubenswrapper[4702]: I1203 11:45:00.485846 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10dc69dd-f84f-456f-b7af-6289897508f9-secret-volume\") pod \"collect-profiles-29412705-zsdl5\" (UID: \"10dc69dd-f84f-456f-b7af-6289897508f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412705-zsdl5" Dec 03 11:45:00 crc kubenswrapper[4702]: I1203 11:45:00.486055 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp8qn\" (UniqueName: \"kubernetes.io/projected/10dc69dd-f84f-456f-b7af-6289897508f9-kube-api-access-tp8qn\") pod \"collect-profiles-29412705-zsdl5\" (UID: \"10dc69dd-f84f-456f-b7af-6289897508f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412705-zsdl5" Dec 03 11:45:00 crc kubenswrapper[4702]: I1203 11:45:00.486643 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10dc69dd-f84f-456f-b7af-6289897508f9-config-volume\") pod \"collect-profiles-29412705-zsdl5\" (UID: \"10dc69dd-f84f-456f-b7af-6289897508f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412705-zsdl5" Dec 03 11:45:00 crc kubenswrapper[4702]: I1203 11:45:00.501407 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10dc69dd-f84f-456f-b7af-6289897508f9-secret-volume\") pod \"collect-profiles-29412705-zsdl5\" (UID: \"10dc69dd-f84f-456f-b7af-6289897508f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412705-zsdl5" Dec 03 11:45:00 crc kubenswrapper[4702]: I1203 11:45:00.508597 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp8qn\" (UniqueName: \"kubernetes.io/projected/10dc69dd-f84f-456f-b7af-6289897508f9-kube-api-access-tp8qn\") pod \"collect-profiles-29412705-zsdl5\" (UID: \"10dc69dd-f84f-456f-b7af-6289897508f9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412705-zsdl5" Dec 03 11:45:00 crc kubenswrapper[4702]: I1203 11:45:00.556971 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412705-zsdl5"
Dec 03 11:45:01 crc kubenswrapper[4702]: I1203 11:45:01.267003 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412705-zsdl5"]
Dec 03 11:45:02 crc kubenswrapper[4702]: I1203 11:45:02.249063 4702 generic.go:334] "Generic (PLEG): container finished" podID="10dc69dd-f84f-456f-b7af-6289897508f9" containerID="6eeca76c4f6234cc177b88036a61469bd0a0f23a018c94775739582c9d86e78d" exitCode=0
Dec 03 11:45:02 crc kubenswrapper[4702]: I1203 11:45:02.249197 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412705-zsdl5" event={"ID":"10dc69dd-f84f-456f-b7af-6289897508f9","Type":"ContainerDied","Data":"6eeca76c4f6234cc177b88036a61469bd0a0f23a018c94775739582c9d86e78d"}
Dec 03 11:45:02 crc kubenswrapper[4702]: I1203 11:45:02.249415 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412705-zsdl5" event={"ID":"10dc69dd-f84f-456f-b7af-6289897508f9","Type":"ContainerStarted","Data":"0dc388a33d7508b527edcbfc75e2f65f779e2c54de9f3777699795cf331b7eae"}
Dec 03 11:45:03 crc kubenswrapper[4702]: I1203 11:45:03.845694 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412705-zsdl5"
Dec 03 11:45:04 crc kubenswrapper[4702]: I1203 11:45:04.035134 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp8qn\" (UniqueName: \"kubernetes.io/projected/10dc69dd-f84f-456f-b7af-6289897508f9-kube-api-access-tp8qn\") pod \"10dc69dd-f84f-456f-b7af-6289897508f9\" (UID: \"10dc69dd-f84f-456f-b7af-6289897508f9\") "
Dec 03 11:45:04 crc kubenswrapper[4702]: I1203 11:45:04.035404 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10dc69dd-f84f-456f-b7af-6289897508f9-secret-volume\") pod \"10dc69dd-f84f-456f-b7af-6289897508f9\" (UID: \"10dc69dd-f84f-456f-b7af-6289897508f9\") "
Dec 03 11:45:04 crc kubenswrapper[4702]: I1203 11:45:04.035772 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10dc69dd-f84f-456f-b7af-6289897508f9-config-volume\") pod \"10dc69dd-f84f-456f-b7af-6289897508f9\" (UID: \"10dc69dd-f84f-456f-b7af-6289897508f9\") "
Dec 03 11:45:04 crc kubenswrapper[4702]: I1203 11:45:04.036891 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10dc69dd-f84f-456f-b7af-6289897508f9-config-volume" (OuterVolumeSpecName: "config-volume") pod "10dc69dd-f84f-456f-b7af-6289897508f9" (UID: "10dc69dd-f84f-456f-b7af-6289897508f9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 11:45:04 crc kubenswrapper[4702]: I1203 11:45:04.246680 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10dc69dd-f84f-456f-b7af-6289897508f9-kube-api-access-tp8qn" (OuterVolumeSpecName: "kube-api-access-tp8qn") pod "10dc69dd-f84f-456f-b7af-6289897508f9" (UID: "10dc69dd-f84f-456f-b7af-6289897508f9"). InnerVolumeSpecName "kube-api-access-tp8qn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:45:04 crc kubenswrapper[4702]: I1203 11:45:04.260982 4702 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/10dc69dd-f84f-456f-b7af-6289897508f9-config-volume\") on node \"crc\" DevicePath \"\""
Dec 03 11:45:04 crc kubenswrapper[4702]: I1203 11:45:04.261038 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp8qn\" (UniqueName: \"kubernetes.io/projected/10dc69dd-f84f-456f-b7af-6289897508f9-kube-api-access-tp8qn\") on node \"crc\" DevicePath \"\""
Dec 03 11:45:04 crc kubenswrapper[4702]: I1203 11:45:04.268162 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10dc69dd-f84f-456f-b7af-6289897508f9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "10dc69dd-f84f-456f-b7af-6289897508f9" (UID: "10dc69dd-f84f-456f-b7af-6289897508f9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:45:04 crc kubenswrapper[4702]: I1203 11:45:04.277333 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412705-zsdl5" event={"ID":"10dc69dd-f84f-456f-b7af-6289897508f9","Type":"ContainerDied","Data":"0dc388a33d7508b527edcbfc75e2f65f779e2c54de9f3777699795cf331b7eae"}
Dec 03 11:45:04 crc kubenswrapper[4702]: I1203 11:45:04.277403 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dc388a33d7508b527edcbfc75e2f65f779e2c54de9f3777699795cf331b7eae"
Dec 03 11:45:04 crc kubenswrapper[4702]: I1203 11:45:04.277515 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412705-zsdl5"
Dec 03 11:45:04 crc kubenswrapper[4702]: I1203 11:45:04.363498 4702 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/10dc69dd-f84f-456f-b7af-6289897508f9-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 03 11:45:04 crc kubenswrapper[4702]: I1203 11:45:04.509955 4702 scope.go:117] "RemoveContainer" containerID="03a377a86e254cfdced08b0b9895fb8194a6a6b0393febbb1ba40e496277b4d3"
Dec 03 11:45:05 crc kubenswrapper[4702]: I1203 11:45:05.356499 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n"]
Dec 03 11:45:05 crc kubenswrapper[4702]: I1203 11:45:05.370307 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412660-4lj4n"]
Dec 03 11:45:05 crc kubenswrapper[4702]: I1203 11:45:05.929070 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21"
Dec 03 11:45:05 crc kubenswrapper[4702]: E1203 11:45:05.929586 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
Dec 03 11:45:06 crc kubenswrapper[4702]: I1203 11:45:06.945368 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77a69d25-2384-466b-b284-e36e979597b4" path="/var/lib/kubelet/pods/77a69d25-2384-466b-b284-e36e979597b4/volumes"
Dec 03 11:45:16 crc kubenswrapper[4702]: I1203 11:45:16.485461 4702 generic.go:334] "Generic (PLEG): container finished" podID="6c2b0167-387a-48e4-9931-1869990ede5e" containerID="3395348823ba68dbefca89972035a0021b003133c62c4f88029bca0bcb2d39e4" exitCode=0
Dec 03 11:45:16 crc kubenswrapper[4702]: I1203 11:45:16.485579 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fchg5" event={"ID":"6c2b0167-387a-48e4-9931-1869990ede5e","Type":"ContainerDied","Data":"3395348823ba68dbefca89972035a0021b003133c62c4f88029bca0bcb2d39e4"}
Dec 03 11:45:18 crc kubenswrapper[4702]: I1203 11:45:18.493540 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fchg5"
Dec 03 11:45:18 crc kubenswrapper[4702]: I1203 11:45:18.512618 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fchg5" event={"ID":"6c2b0167-387a-48e4-9931-1869990ede5e","Type":"ContainerDied","Data":"86d0979388411c806b4378eeb7a3cfdd7e81e35233a103ae8df7ad9d56f92076"}
Dec 03 11:45:18 crc kubenswrapper[4702]: I1203 11:45:18.512721 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86d0979388411c806b4378eeb7a3cfdd7e81e35233a103ae8df7ad9d56f92076"
Dec 03 11:45:18 crc kubenswrapper[4702]: I1203 11:45:18.512875 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fchg5"
Dec 03 11:45:18 crc kubenswrapper[4702]: I1203 11:45:18.620626 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smqtz"]
Dec 03 11:45:18 crc kubenswrapper[4702]: E1203 11:45:18.621590 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10dc69dd-f84f-456f-b7af-6289897508f9" containerName="collect-profiles"
Dec 03 11:45:18 crc kubenswrapper[4702]: I1203 11:45:18.621623 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="10dc69dd-f84f-456f-b7af-6289897508f9" containerName="collect-profiles"
Dec 03 11:45:18 crc kubenswrapper[4702]: E1203 11:45:18.621661 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c2b0167-387a-48e4-9931-1869990ede5e" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Dec 03 11:45:18 crc kubenswrapper[4702]: I1203 11:45:18.621671 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c2b0167-387a-48e4-9931-1869990ede5e" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Dec 03 11:45:18 crc kubenswrapper[4702]: I1203 11:45:18.621946 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="10dc69dd-f84f-456f-b7af-6289897508f9" containerName="collect-profiles"
Dec 03 11:45:18 crc kubenswrapper[4702]: I1203 11:45:18.621981 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c2b0167-387a-48e4-9931-1869990ede5e" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Dec 03 11:45:18 crc kubenswrapper[4702]: I1203 11:45:18.623198 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smqtz"
Dec 03 11:45:19 crc kubenswrapper[4702]: I1203 11:45:19.007396 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smqtz"]
Dec 03 11:45:19 crc kubenswrapper[4702]: I1203 11:45:19.010085 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w96xs\" (UniqueName: \"kubernetes.io/projected/6c2b0167-387a-48e4-9931-1869990ede5e-kube-api-access-w96xs\") pod \"6c2b0167-387a-48e4-9931-1869990ede5e\" (UID: \"6c2b0167-387a-48e4-9931-1869990ede5e\") "
Dec 03 11:45:19 crc kubenswrapper[4702]: I1203 11:45:19.010265 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6c2b0167-387a-48e4-9931-1869990ede5e-ssh-key\") pod \"6c2b0167-387a-48e4-9931-1869990ede5e\" (UID: \"6c2b0167-387a-48e4-9931-1869990ede5e\") "
Dec 03 11:45:19 crc kubenswrapper[4702]: I1203 11:45:19.010316 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c2b0167-387a-48e4-9931-1869990ede5e-inventory\") pod \"6c2b0167-387a-48e4-9931-1869990ede5e\" (UID: \"6c2b0167-387a-48e4-9931-1869990ede5e\") "
Dec 03 11:45:19 crc kubenswrapper[4702]: I1203 11:45:19.045460 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c2b0167-387a-48e4-9931-1869990ede5e-kube-api-access-w96xs" (OuterVolumeSpecName: "kube-api-access-w96xs") pod "6c2b0167-387a-48e4-9931-1869990ede5e" (UID: "6c2b0167-387a-48e4-9931-1869990ede5e"). InnerVolumeSpecName "kube-api-access-w96xs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:45:19 crc kubenswrapper[4702]: I1203 11:45:19.116745 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/961e241a-3dbc-4c96-afcd-6b768d0322db-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-smqtz\" (UID: \"961e241a-3dbc-4c96-afcd-6b768d0322db\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smqtz"
Dec 03 11:45:19 crc kubenswrapper[4702]: I1203 11:45:19.117163 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbrzt\" (UniqueName: \"kubernetes.io/projected/961e241a-3dbc-4c96-afcd-6b768d0322db-kube-api-access-xbrzt\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-smqtz\" (UID: \"961e241a-3dbc-4c96-afcd-6b768d0322db\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smqtz"
Dec 03 11:45:19 crc kubenswrapper[4702]: I1203 11:45:19.117378 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/961e241a-3dbc-4c96-afcd-6b768d0322db-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-smqtz\" (UID: \"961e241a-3dbc-4c96-afcd-6b768d0322db\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smqtz"
Dec 03 11:45:19 crc kubenswrapper[4702]: I1203 11:45:19.118457 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w96xs\" (UniqueName: \"kubernetes.io/projected/6c2b0167-387a-48e4-9931-1869990ede5e-kube-api-access-w96xs\") on node \"crc\" DevicePath \"\""
Dec 03 11:45:19 crc kubenswrapper[4702]: I1203 11:45:19.125527 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c2b0167-387a-48e4-9931-1869990ede5e-inventory" (OuterVolumeSpecName: "inventory") pod "6c2b0167-387a-48e4-9931-1869990ede5e" (UID: "6c2b0167-387a-48e4-9931-1869990ede5e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:45:19 crc kubenswrapper[4702]: I1203 11:45:19.149420 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c2b0167-387a-48e4-9931-1869990ede5e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6c2b0167-387a-48e4-9931-1869990ede5e" (UID: "6c2b0167-387a-48e4-9931-1869990ede5e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:45:19 crc kubenswrapper[4702]: I1203 11:45:19.220895 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/961e241a-3dbc-4c96-afcd-6b768d0322db-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-smqtz\" (UID: \"961e241a-3dbc-4c96-afcd-6b768d0322db\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smqtz"
Dec 03 11:45:19 crc kubenswrapper[4702]: I1203 11:45:19.220988 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbrzt\" (UniqueName: \"kubernetes.io/projected/961e241a-3dbc-4c96-afcd-6b768d0322db-kube-api-access-xbrzt\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-smqtz\" (UID: \"961e241a-3dbc-4c96-afcd-6b768d0322db\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smqtz"
Dec 03 11:45:19 crc kubenswrapper[4702]: I1203 11:45:19.221052 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/961e241a-3dbc-4c96-afcd-6b768d0322db-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-smqtz\" (UID: \"961e241a-3dbc-4c96-afcd-6b768d0322db\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smqtz"
Dec 03 11:45:19 crc kubenswrapper[4702]: I1203 11:45:19.221340 4702 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c2b0167-387a-48e4-9931-1869990ede5e-inventory\") on node \"crc\" DevicePath \"\""
Dec 03 11:45:19 crc kubenswrapper[4702]: I1203 11:45:19.221368 4702 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6c2b0167-387a-48e4-9931-1869990ede5e-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 03 11:45:19 crc kubenswrapper[4702]: I1203 11:45:19.224732 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/961e241a-3dbc-4c96-afcd-6b768d0322db-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-smqtz\" (UID: \"961e241a-3dbc-4c96-afcd-6b768d0322db\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smqtz"
Dec 03 11:45:19 crc kubenswrapper[4702]: I1203 11:45:19.225181 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/961e241a-3dbc-4c96-afcd-6b768d0322db-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-smqtz\" (UID: \"961e241a-3dbc-4c96-afcd-6b768d0322db\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smqtz"
Dec 03 11:45:19 crc kubenswrapper[4702]: I1203 11:45:19.239827 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbrzt\" (UniqueName: \"kubernetes.io/projected/961e241a-3dbc-4c96-afcd-6b768d0322db-kube-api-access-xbrzt\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-smqtz\" (UID: \"961e241a-3dbc-4c96-afcd-6b768d0322db\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smqtz"
Dec 03 11:45:19 crc kubenswrapper[4702]: I1203 11:45:19.303544 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smqtz"
Dec 03 11:45:19 crc kubenswrapper[4702]: I1203 11:45:19.928734 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21"
Dec 03 11:45:19 crc kubenswrapper[4702]: E1203 11:45:19.930629 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
Dec 03 11:45:20 crc kubenswrapper[4702]: I1203 11:45:20.177092 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smqtz"]
Dec 03 11:45:20 crc kubenswrapper[4702]: I1203 11:45:20.844376 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smqtz" event={"ID":"961e241a-3dbc-4c96-afcd-6b768d0322db","Type":"ContainerStarted","Data":"e8fb89faff19c63491b6af716974302aa5c8abef5d7bc2854e257ba01c399219"}
Dec 03 11:45:21 crc kubenswrapper[4702]: I1203 11:45:21.865012 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smqtz" event={"ID":"961e241a-3dbc-4c96-afcd-6b768d0322db","Type":"ContainerStarted","Data":"e244c497c83bb7545be43bcc28d4e2d02428b2b3f0d59b35380d5d8f9940d070"}
Dec 03 11:45:21 crc kubenswrapper[4702]: I1203 11:45:21.887986 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smqtz" podStartSLOduration=2.876667117 podStartE2EDuration="3.887957219s" podCreationTimestamp="2025-12-03 11:45:18 +0000 UTC" firstStartedPulling="2025-12-03 11:45:20.183216159 +0000 UTC m=+2504.019144623" lastFinishedPulling="2025-12-03 11:45:21.194506251 +0000 UTC m=+2505.030434725" observedRunningTime="2025-12-03 11:45:21.884228103 +0000 UTC m=+2505.720156567" watchObservedRunningTime="2025-12-03 11:45:21.887957219 +0000 UTC m=+2505.723885673"
Dec 03 11:45:32 crc kubenswrapper[4702]: I1203 11:45:32.928910 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21"
Dec 03 11:45:32 crc kubenswrapper[4702]: E1203 11:45:32.929741 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
Dec 03 11:45:37 crc kubenswrapper[4702]: I1203 11:45:37.050348 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-gq8tr"]
Dec 03 11:45:37 crc kubenswrapper[4702]: I1203 11:45:37.063381 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-gq8tr"]
Dec 03 11:45:38 crc kubenswrapper[4702]: I1203 11:45:38.944145 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cf7ffe2-0a74-42cd-ab56-1a65d1317f54" path="/var/lib/kubelet/pods/8cf7ffe2-0a74-42cd-ab56-1a65d1317f54/volumes"
Dec 03 11:45:45 crc kubenswrapper[4702]: I1203 11:45:45.153648 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21"
Dec 03 11:45:45 crc kubenswrapper[4702]: E1203 11:45:45.157835 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
Dec 03 11:45:56 crc kubenswrapper[4702]: I1203 11:45:56.936588 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21"
Dec 03 11:45:56 crc kubenswrapper[4702]: E1203 11:45:56.937475 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
Dec 03 11:46:04 crc kubenswrapper[4702]: I1203 11:46:04.591248 4702 scope.go:117] "RemoveContainer" containerID="b1abdb7d1dde37410aaaab35ec091adbaa25c05e540afd9716f5f7c13ddcd219"
Dec 03 11:46:04 crc kubenswrapper[4702]: I1203 11:46:04.636842 4702 scope.go:117] "RemoveContainer" containerID="f504e6a99f3fd97afe3603b3b2ca8f616014af94e2da3e3beb060139b03aaf1c"
Dec 03 11:46:08 crc kubenswrapper[4702]: I1203 11:46:08.929077 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21"
Dec 03 11:46:08 crc kubenswrapper[4702]: E1203 11:46:08.930201 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
Dec 03 11:46:14 crc kubenswrapper[4702]: I1203 11:46:14.647123 4702 generic.go:334] "Generic (PLEG): container finished" podID="961e241a-3dbc-4c96-afcd-6b768d0322db" containerID="e244c497c83bb7545be43bcc28d4e2d02428b2b3f0d59b35380d5d8f9940d070" exitCode=0
Dec 03 11:46:14 crc kubenswrapper[4702]: I1203 11:46:14.647209 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smqtz" event={"ID":"961e241a-3dbc-4c96-afcd-6b768d0322db","Type":"ContainerDied","Data":"e244c497c83bb7545be43bcc28d4e2d02428b2b3f0d59b35380d5d8f9940d070"}
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.245268 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smqtz"
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.267612 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/961e241a-3dbc-4c96-afcd-6b768d0322db-ssh-key\") pod \"961e241a-3dbc-4c96-afcd-6b768d0322db\" (UID: \"961e241a-3dbc-4c96-afcd-6b768d0322db\") "
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.268053 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbrzt\" (UniqueName: \"kubernetes.io/projected/961e241a-3dbc-4c96-afcd-6b768d0322db-kube-api-access-xbrzt\") pod \"961e241a-3dbc-4c96-afcd-6b768d0322db\" (UID: \"961e241a-3dbc-4c96-afcd-6b768d0322db\") "
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.268159 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/961e241a-3dbc-4c96-afcd-6b768d0322db-inventory\") pod \"961e241a-3dbc-4c96-afcd-6b768d0322db\" (UID: \"961e241a-3dbc-4c96-afcd-6b768d0322db\") "
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.280035 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/961e241a-3dbc-4c96-afcd-6b768d0322db-kube-api-access-xbrzt" (OuterVolumeSpecName: "kube-api-access-xbrzt") pod "961e241a-3dbc-4c96-afcd-6b768d0322db" (UID: "961e241a-3dbc-4c96-afcd-6b768d0322db"). InnerVolumeSpecName "kube-api-access-xbrzt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.335146 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/961e241a-3dbc-4c96-afcd-6b768d0322db-inventory" (OuterVolumeSpecName: "inventory") pod "961e241a-3dbc-4c96-afcd-6b768d0322db" (UID: "961e241a-3dbc-4c96-afcd-6b768d0322db"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.335214 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/961e241a-3dbc-4c96-afcd-6b768d0322db-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "961e241a-3dbc-4c96-afcd-6b768d0322db" (UID: "961e241a-3dbc-4c96-afcd-6b768d0322db"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.370392 4702 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/961e241a-3dbc-4c96-afcd-6b768d0322db-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.370425 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbrzt\" (UniqueName: \"kubernetes.io/projected/961e241a-3dbc-4c96-afcd-6b768d0322db-kube-api-access-xbrzt\") on node \"crc\" DevicePath \"\""
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.370436 4702 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/961e241a-3dbc-4c96-afcd-6b768d0322db-inventory\") on node \"crc\" DevicePath \"\""
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.678977 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smqtz" event={"ID":"961e241a-3dbc-4c96-afcd-6b768d0322db","Type":"ContainerDied","Data":"e8fb89faff19c63491b6af716974302aa5c8abef5d7bc2854e257ba01c399219"}
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.679085 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8fb89faff19c63491b6af716974302aa5c8abef5d7bc2854e257ba01c399219"
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.679025 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-smqtz"
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.769794 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-l4zf4"]
Dec 03 11:46:16 crc kubenswrapper[4702]: E1203 11:46:16.770678 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="961e241a-3dbc-4c96-afcd-6b768d0322db" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.770698 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="961e241a-3dbc-4c96-afcd-6b768d0322db" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.770950 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="961e241a-3dbc-4c96-afcd-6b768d0322db" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.772246 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-l4zf4"
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.776312 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.776497 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zmjkp"
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.776624 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.776801 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.790087 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-l4zf4"]
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.881796 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4b86b10a-a947-4b55-b723-5542cd398eaf-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-l4zf4\" (UID: \"4b86b10a-a947-4b55-b723-5542cd398eaf\") " pod="openstack/ssh-known-hosts-edpm-deployment-l4zf4"
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.881901 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4b86b10a-a947-4b55-b723-5542cd398eaf-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-l4zf4\" (UID: \"4b86b10a-a947-4b55-b723-5542cd398eaf\") " pod="openstack/ssh-known-hosts-edpm-deployment-l4zf4"
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.881967 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8bsl\" (UniqueName: \"kubernetes.io/projected/4b86b10a-a947-4b55-b723-5542cd398eaf-kube-api-access-j8bsl\") pod \"ssh-known-hosts-edpm-deployment-l4zf4\" (UID: \"4b86b10a-a947-4b55-b723-5542cd398eaf\") " pod="openstack/ssh-known-hosts-edpm-deployment-l4zf4"
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.985102 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4b86b10a-a947-4b55-b723-5542cd398eaf-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-l4zf4\" (UID: \"4b86b10a-a947-4b55-b723-5542cd398eaf\") " pod="openstack/ssh-known-hosts-edpm-deployment-l4zf4"
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.985263 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4b86b10a-a947-4b55-b723-5542cd398eaf-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-l4zf4\" (UID: \"4b86b10a-a947-4b55-b723-5542cd398eaf\") " pod="openstack/ssh-known-hosts-edpm-deployment-l4zf4"
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.985360 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8bsl\" (UniqueName: \"kubernetes.io/projected/4b86b10a-a947-4b55-b723-5542cd398eaf-kube-api-access-j8bsl\") pod \"ssh-known-hosts-edpm-deployment-l4zf4\" (UID: \"4b86b10a-a947-4b55-b723-5542cd398eaf\") " pod="openstack/ssh-known-hosts-edpm-deployment-l4zf4"
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.991105 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4b86b10a-a947-4b55-b723-5542cd398eaf-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-l4zf4\" (UID: \"4b86b10a-a947-4b55-b723-5542cd398eaf\") " pod="openstack/ssh-known-hosts-edpm-deployment-l4zf4"
Dec 03 11:46:16 crc kubenswrapper[4702]: I1203 11:46:16.996832 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4b86b10a-a947-4b55-b723-5542cd398eaf-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-l4zf4\" (UID: \"4b86b10a-a947-4b55-b723-5542cd398eaf\") " pod="openstack/ssh-known-hosts-edpm-deployment-l4zf4"
Dec 03 11:46:17 crc kubenswrapper[4702]: I1203 11:46:17.003259 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8bsl\" (UniqueName: \"kubernetes.io/projected/4b86b10a-a947-4b55-b723-5542cd398eaf-kube-api-access-j8bsl\") pod \"ssh-known-hosts-edpm-deployment-l4zf4\" (UID: \"4b86b10a-a947-4b55-b723-5542cd398eaf\") " pod="openstack/ssh-known-hosts-edpm-deployment-l4zf4"
Dec 03 11:46:17 crc kubenswrapper[4702]: I1203 11:46:17.136529 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-l4zf4"
Dec 03 11:46:17 crc kubenswrapper[4702]: I1203 11:46:17.875337 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-l4zf4"]
Dec 03 11:46:18 crc kubenswrapper[4702]: I1203 11:46:18.707357 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-l4zf4" event={"ID":"4b86b10a-a947-4b55-b723-5542cd398eaf","Type":"ContainerStarted","Data":"81a5635835599af6716fed8760274f1cd9c8f8fe2786edd4b227868c2058febb"}
Dec 03 11:46:20 crc kubenswrapper[4702]: I1203 11:46:20.736086 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-l4zf4" event={"ID":"4b86b10a-a947-4b55-b723-5542cd398eaf","Type":"ContainerStarted","Data":"64644b832a7152680e4b0d1396da7aee9e2b5c3ee71b968f1d3109e75305fbc7"}
Dec 03 11:46:20 crc kubenswrapper[4702]: I1203 11:46:20.776661 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-l4zf4" podStartSLOduration=2.906926126 podStartE2EDuration="4.776612608s" podCreationTimestamp="2025-12-03 11:46:16 +0000 UTC" firstStartedPulling="2025-12-03 11:46:17.883084847 +0000 UTC m=+2561.719013311" lastFinishedPulling="2025-12-03 11:46:19.752771329 +0000 UTC m=+2563.588699793" observedRunningTime="2025-12-03 11:46:20.754251641 +0000 UTC m=+2564.590180115" watchObservedRunningTime="2025-12-03 11:46:20.776612608 +0000 UTC m=+2564.612541082"
Dec 03 11:46:22 crc kubenswrapper[4702]: I1203 11:46:22.045639 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-6tgbz"]
Dec 03 11:46:22 crc kubenswrapper[4702]: I1203 11:46:22.056788 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-6tgbz"]
Dec 03 11:46:22 crc kubenswrapper[4702]: I1203 11:46:22.999572 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82a7b79c-25b3-4bd5-ab40-d9654c454997" path="/var/lib/kubelet/pods/82a7b79c-25b3-4bd5-ab40-d9654c454997/volumes"
Dec 03 11:46:23 crc kubenswrapper[4702]: I1203 11:46:23.928900 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21"
Dec 03 11:46:23 crc kubenswrapper[4702]: E1203 11:46:23.929289 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
Dec 03 11:46:27 crc kubenswrapper[4702]: I1203 11:46:27.847268 4702 generic.go:334] "Generic (PLEG): container finished" podID="4b86b10a-a947-4b55-b723-5542cd398eaf" containerID="64644b832a7152680e4b0d1396da7aee9e2b5c3ee71b968f1d3109e75305fbc7" exitCode=0
Dec 03 11:46:27 crc kubenswrapper[4702]: I1203 11:46:27.847335 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-l4zf4" event={"ID":"4b86b10a-a947-4b55-b723-5542cd398eaf","Type":"ContainerDied","Data":"64644b832a7152680e4b0d1396da7aee9e2b5c3ee71b968f1d3109e75305fbc7"}
Dec 03 11:46:29 crc kubenswrapper[4702]: I1203 11:46:29.369855 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-l4zf4"
Dec 03 11:46:29 crc kubenswrapper[4702]: I1203 11:46:29.478186 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4b86b10a-a947-4b55-b723-5542cd398eaf-ssh-key-openstack-edpm-ipam\") pod \"4b86b10a-a947-4b55-b723-5542cd398eaf\" (UID: \"4b86b10a-a947-4b55-b723-5542cd398eaf\") "
Dec 03 11:46:29 crc kubenswrapper[4702]: I1203 11:46:29.478488 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8bsl\" (UniqueName: \"kubernetes.io/projected/4b86b10a-a947-4b55-b723-5542cd398eaf-kube-api-access-j8bsl\") pod \"4b86b10a-a947-4b55-b723-5542cd398eaf\" (UID: \"4b86b10a-a947-4b55-b723-5542cd398eaf\") "
Dec 03 11:46:29 crc kubenswrapper[4702]: I1203 11:46:29.479359 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4b86b10a-a947-4b55-b723-5542cd398eaf-inventory-0\") pod \"4b86b10a-a947-4b55-b723-5542cd398eaf\" (UID: \"4b86b10a-a947-4b55-b723-5542cd398eaf\") "
Dec 03 11:46:29 crc kubenswrapper[4702]: I1203 11:46:29.484240 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b86b10a-a947-4b55-b723-5542cd398eaf-kube-api-access-j8bsl" (OuterVolumeSpecName: "kube-api-access-j8bsl") pod "4b86b10a-a947-4b55-b723-5542cd398eaf" (UID: "4b86b10a-a947-4b55-b723-5542cd398eaf"). InnerVolumeSpecName "kube-api-access-j8bsl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:46:29 crc kubenswrapper[4702]: I1203 11:46:29.520374 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b86b10a-a947-4b55-b723-5542cd398eaf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4b86b10a-a947-4b55-b723-5542cd398eaf" (UID: "4b86b10a-a947-4b55-b723-5542cd398eaf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:46:29 crc kubenswrapper[4702]: I1203 11:46:29.521644 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b86b10a-a947-4b55-b723-5542cd398eaf-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "4b86b10a-a947-4b55-b723-5542cd398eaf" (UID: "4b86b10a-a947-4b55-b723-5542cd398eaf"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:46:29 crc kubenswrapper[4702]: I1203 11:46:29.582710 4702 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4b86b10a-a947-4b55-b723-5542cd398eaf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Dec 03 11:46:29 crc kubenswrapper[4702]: I1203 11:46:29.582778 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8bsl\" (UniqueName: \"kubernetes.io/projected/4b86b10a-a947-4b55-b723-5542cd398eaf-kube-api-access-j8bsl\") on node \"crc\" DevicePath \"\""
Dec 03 11:46:29 crc kubenswrapper[4702]: I1203 11:46:29.582797 4702 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4b86b10a-a947-4b55-b723-5542cd398eaf-inventory-0\") on node \"crc\" DevicePath \"\""
Dec 03 11:46:29 crc kubenswrapper[4702]: I1203 11:46:29.878251 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-l4zf4" event={"ID":"4b86b10a-a947-4b55-b723-5542cd398eaf","Type":"ContainerDied","Data":"81a5635835599af6716fed8760274f1cd9c8f8fe2786edd4b227868c2058febb"}
Dec 03 11:46:29 crc kubenswrapper[4702]: I1203 11:46:29.878323 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81a5635835599af6716fed8760274f1cd9c8f8fe2786edd4b227868c2058febb"
Dec 03 11:46:29 crc kubenswrapper[4702]: I1203 11:46:29.878359 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-l4zf4"
Dec 03 11:46:29 crc kubenswrapper[4702]: I1203 11:46:29.999906 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-k9m29"]
Dec 03 11:46:30 crc kubenswrapper[4702]: E1203 11:46:30.000617 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b86b10a-a947-4b55-b723-5542cd398eaf" containerName="ssh-known-hosts-edpm-deployment"
Dec 03 11:46:30 crc kubenswrapper[4702]: I1203 11:46:30.000643 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b86b10a-a947-4b55-b723-5542cd398eaf" containerName="ssh-known-hosts-edpm-deployment"
Dec 03 11:46:30 crc kubenswrapper[4702]: I1203 11:46:30.000935 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b86b10a-a947-4b55-b723-5542cd398eaf" containerName="ssh-known-hosts-edpm-deployment"
Dec 03 11:46:30 crc kubenswrapper[4702]: I1203 11:46:30.002073 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-k9m29"
Dec 03 11:46:30 crc kubenswrapper[4702]: I1203 11:46:30.004145 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 03 11:46:30 crc kubenswrapper[4702]: I1203 11:46:30.004607 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 03 11:46:30 crc kubenswrapper[4702]: I1203 11:46:30.004710 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zmjkp"
Dec 03 11:46:30 crc kubenswrapper[4702]: I1203 11:46:30.004753 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 03 11:46:30 crc kubenswrapper[4702]: I1203 11:46:30.013784 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-k9m29"]
Dec 03 11:46:30 crc kubenswrapper[4702]: I1203 11:46:30.043018 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a34690c3-fe9d-4dac-841c-07298c80c0e8-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-k9m29\" (UID: \"a34690c3-fe9d-4dac-841c-07298c80c0e8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-k9m29"
Dec 03 11:46:30 crc kubenswrapper[4702]: I1203 11:46:30.043160 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a34690c3-fe9d-4dac-841c-07298c80c0e8-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-k9m29\" (UID: \"a34690c3-fe9d-4dac-841c-07298c80c0e8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-k9m29"
Dec 03 11:46:30 crc kubenswrapper[4702]: I1203 11:46:30.043868 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxnph\" (UniqueName: \"kubernetes.io/projected/a34690c3-fe9d-4dac-841c-07298c80c0e8-kube-api-access-kxnph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-k9m29\" (UID: \"a34690c3-fe9d-4dac-841c-07298c80c0e8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-k9m29"
Dec 03 11:46:30 crc kubenswrapper[4702]: I1203 11:46:30.144996 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a34690c3-fe9d-4dac-841c-07298c80c0e8-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-k9m29\" (UID: \"a34690c3-fe9d-4dac-841c-07298c80c0e8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-k9m29"
Dec 03 11:46:30 crc kubenswrapper[4702]: I1203 11:46:30.145186 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxnph\" (UniqueName: \"kubernetes.io/projected/a34690c3-fe9d-4dac-841c-07298c80c0e8-kube-api-access-kxnph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-k9m29\" (UID: \"a34690c3-fe9d-4dac-841c-07298c80c0e8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-k9m29"
Dec 03 11:46:30 crc kubenswrapper[4702]: I1203 11:46:30.145301 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a34690c3-fe9d-4dac-841c-07298c80c0e8-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-k9m29\" (UID: \"a34690c3-fe9d-4dac-841c-07298c80c0e8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-k9m29"
Dec 03 11:46:30 crc kubenswrapper[4702]: I1203 11:46:30.150328 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a34690c3-fe9d-4dac-841c-07298c80c0e8-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-k9m29\" (UID: \"a34690c3-fe9d-4dac-841c-07298c80c0e8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-k9m29"
Dec 03 11:46:30 crc kubenswrapper[4702]: I1203 11:46:30.154835 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a34690c3-fe9d-4dac-841c-07298c80c0e8-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-k9m29\" (UID: \"a34690c3-fe9d-4dac-841c-07298c80c0e8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-k9m29"
Dec 03 11:46:30 crc kubenswrapper[4702]: I1203 11:46:30.176362 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxnph\" (UniqueName: \"kubernetes.io/projected/a34690c3-fe9d-4dac-841c-07298c80c0e8-kube-api-access-kxnph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-k9m29\" (UID: \"a34690c3-fe9d-4dac-841c-07298c80c0e8\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-k9m29"
Dec 03 11:46:30 crc kubenswrapper[4702]: I1203 11:46:30.327019 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-k9m29"
Dec 03 11:46:30 crc kubenswrapper[4702]: I1203 11:46:30.960133 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-k9m29"]
Dec 03 11:46:31 crc kubenswrapper[4702]: I1203 11:46:31.898889 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-k9m29" event={"ID":"a34690c3-fe9d-4dac-841c-07298c80c0e8","Type":"ContainerStarted","Data":"b3222c45cd140d66632335548cfaf0c9b7ad69ddd5af84927930cfff145f0fe2"}
Dec 03 11:46:32 crc kubenswrapper[4702]: I1203 11:46:32.917213 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-k9m29" event={"ID":"a34690c3-fe9d-4dac-841c-07298c80c0e8","Type":"ContainerStarted","Data":"5ff2c0c2b35f6ce52d8b84466e42d07ddd0276fd880fa034600a1542e96f018f"}
Dec 03 11:46:32 crc kubenswrapper[4702]: I1203 11:46:32.949149 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-k9m29" podStartSLOduration=2.8686180439999998 podStartE2EDuration="3.949125025s" podCreationTimestamp="2025-12-03 11:46:29 +0000 UTC" firstStartedPulling="2025-12-03 11:46:31.009276406 +0000 UTC m=+2574.845204870" lastFinishedPulling="2025-12-03 11:46:32.089783387 +0000 UTC m=+2575.925711851" observedRunningTime="2025-12-03 11:46:32.937844314 +0000 UTC m=+2576.773772778" watchObservedRunningTime="2025-12-03 11:46:32.949125025 +0000 UTC m=+2576.785053489"
Dec 03 11:46:36 crc kubenswrapper[4702]: I1203 11:46:36.940867 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21"
Dec 03 11:46:36 crc kubenswrapper[4702]: E1203 11:46:36.941614 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
Dec 03 11:46:42 crc kubenswrapper[4702]: I1203 11:46:42.043424 4702 generic.go:334] "Generic (PLEG): container finished" podID="a34690c3-fe9d-4dac-841c-07298c80c0e8" containerID="5ff2c0c2b35f6ce52d8b84466e42d07ddd0276fd880fa034600a1542e96f018f" exitCode=0
Dec 03 11:46:42 crc kubenswrapper[4702]: I1203 11:46:42.043487 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-k9m29" event={"ID":"a34690c3-fe9d-4dac-841c-07298c80c0e8","Type":"ContainerDied","Data":"5ff2c0c2b35f6ce52d8b84466e42d07ddd0276fd880fa034600a1542e96f018f"}
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.609959 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-k9m29"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.710264 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k2b7f"]
Dec 03 11:46:48 crc kubenswrapper[4702]: E1203 11:46:43.710880 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a34690c3-fe9d-4dac-841c-07298c80c0e8" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.710897 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="a34690c3-fe9d-4dac-841c-07298c80c0e8" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.711203 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="a34690c3-fe9d-4dac-841c-07298c80c0e8" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.713125 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k2b7f"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.727425 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k2b7f"]
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.809379 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a34690c3-fe9d-4dac-841c-07298c80c0e8-ssh-key\") pod \"a34690c3-fe9d-4dac-841c-07298c80c0e8\" (UID: \"a34690c3-fe9d-4dac-841c-07298c80c0e8\") "
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.809914 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a34690c3-fe9d-4dac-841c-07298c80c0e8-inventory\") pod \"a34690c3-fe9d-4dac-841c-07298c80c0e8\" (UID: \"a34690c3-fe9d-4dac-841c-07298c80c0e8\") "
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.810006 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxnph\" (UniqueName: \"kubernetes.io/projected/a34690c3-fe9d-4dac-841c-07298c80c0e8-kube-api-access-kxnph\") pod \"a34690c3-fe9d-4dac-841c-07298c80c0e8\" (UID: \"a34690c3-fe9d-4dac-841c-07298c80c0e8\") "
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.810406 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51782d60-42d3-47e2-aadb-dee3c07d518c-catalog-content\") pod \"redhat-operators-k2b7f\" (UID: \"51782d60-42d3-47e2-aadb-dee3c07d518c\") " pod="openshift-marketplace/redhat-operators-k2b7f"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.810522 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmz4s\" (UniqueName: \"kubernetes.io/projected/51782d60-42d3-47e2-aadb-dee3c07d518c-kube-api-access-lmz4s\") pod \"redhat-operators-k2b7f\" (UID: \"51782d60-42d3-47e2-aadb-dee3c07d518c\") " pod="openshift-marketplace/redhat-operators-k2b7f"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.810548 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51782d60-42d3-47e2-aadb-dee3c07d518c-utilities\") pod \"redhat-operators-k2b7f\" (UID: \"51782d60-42d3-47e2-aadb-dee3c07d518c\") " pod="openshift-marketplace/redhat-operators-k2b7f"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.814624 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a34690c3-fe9d-4dac-841c-07298c80c0e8-kube-api-access-kxnph" (OuterVolumeSpecName: "kube-api-access-kxnph") pod "a34690c3-fe9d-4dac-841c-07298c80c0e8" (UID: "a34690c3-fe9d-4dac-841c-07298c80c0e8"). InnerVolumeSpecName "kube-api-access-kxnph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.845527 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a34690c3-fe9d-4dac-841c-07298c80c0e8-inventory" (OuterVolumeSpecName: "inventory") pod "a34690c3-fe9d-4dac-841c-07298c80c0e8" (UID: "a34690c3-fe9d-4dac-841c-07298c80c0e8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.847690 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a34690c3-fe9d-4dac-841c-07298c80c0e8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a34690c3-fe9d-4dac-841c-07298c80c0e8" (UID: "a34690c3-fe9d-4dac-841c-07298c80c0e8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.912517 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51782d60-42d3-47e2-aadb-dee3c07d518c-catalog-content\") pod \"redhat-operators-k2b7f\" (UID: \"51782d60-42d3-47e2-aadb-dee3c07d518c\") " pod="openshift-marketplace/redhat-operators-k2b7f"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.912632 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmz4s\" (UniqueName: \"kubernetes.io/projected/51782d60-42d3-47e2-aadb-dee3c07d518c-kube-api-access-lmz4s\") pod \"redhat-operators-k2b7f\" (UID: \"51782d60-42d3-47e2-aadb-dee3c07d518c\") " pod="openshift-marketplace/redhat-operators-k2b7f"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.912655 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51782d60-42d3-47e2-aadb-dee3c07d518c-utilities\") pod \"redhat-operators-k2b7f\" (UID: \"51782d60-42d3-47e2-aadb-dee3c07d518c\") " pod="openshift-marketplace/redhat-operators-k2b7f"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.913267 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kxnph\" (UniqueName: \"kubernetes.io/projected/a34690c3-fe9d-4dac-841c-07298c80c0e8-kube-api-access-kxnph\") on node \"crc\" DevicePath \"\""
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.913356 4702 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a34690c3-fe9d-4dac-841c-07298c80c0e8-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.913389 4702 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a34690c3-fe9d-4dac-841c-07298c80c0e8-inventory\") on node \"crc\" DevicePath \"\""
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.913537 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51782d60-42d3-47e2-aadb-dee3c07d518c-catalog-content\") pod \"redhat-operators-k2b7f\" (UID: \"51782d60-42d3-47e2-aadb-dee3c07d518c\") " pod="openshift-marketplace/redhat-operators-k2b7f"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.913829 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51782d60-42d3-47e2-aadb-dee3c07d518c-utilities\") pod \"redhat-operators-k2b7f\" (UID: \"51782d60-42d3-47e2-aadb-dee3c07d518c\") " pod="openshift-marketplace/redhat-operators-k2b7f"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:43.932265 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmz4s\" (UniqueName: \"kubernetes.io/projected/51782d60-42d3-47e2-aadb-dee3c07d518c-kube-api-access-lmz4s\") pod \"redhat-operators-k2b7f\" (UID: \"51782d60-42d3-47e2-aadb-dee3c07d518c\") " pod="openshift-marketplace/redhat-operators-k2b7f"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:44.042047 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k2b7f"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:44.083569 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-k9m29" event={"ID":"a34690c3-fe9d-4dac-841c-07298c80c0e8","Type":"ContainerDied","Data":"b3222c45cd140d66632335548cfaf0c9b7ad69ddd5af84927930cfff145f0fe2"}
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:44.083607 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3222c45cd140d66632335548cfaf0c9b7ad69ddd5af84927930cfff145f0fe2"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:44.083648 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-k9m29"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:44.190996 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25"]
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:44.216090 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25"]
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:44.216244 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:44.221283 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:44.221446 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:44.221559 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:44.222998 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zmjkp"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:44.223139 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftcnf\" (UniqueName: \"kubernetes.io/projected/62a60a41-35bc-45db-91c0-feb1ec993942-kube-api-access-ftcnf\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25\" (UID: \"62a60a41-35bc-45db-91c0-feb1ec993942\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:44.223183 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62a60a41-35bc-45db-91c0-feb1ec993942-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25\" (UID: \"62a60a41-35bc-45db-91c0-feb1ec993942\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:44.223358 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62a60a41-35bc-45db-91c0-feb1ec993942-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25\" (UID: \"62a60a41-35bc-45db-91c0-feb1ec993942\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:44.325639 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftcnf\" (UniqueName: \"kubernetes.io/projected/62a60a41-35bc-45db-91c0-feb1ec993942-kube-api-access-ftcnf\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25\" (UID: \"62a60a41-35bc-45db-91c0-feb1ec993942\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:44.325723 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62a60a41-35bc-45db-91c0-feb1ec993942-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25\" (UID: \"62a60a41-35bc-45db-91c0-feb1ec993942\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:44.326134 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62a60a41-35bc-45db-91c0-feb1ec993942-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25\" (UID: \"62a60a41-35bc-45db-91c0-feb1ec993942\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:44.331203 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62a60a41-35bc-45db-91c0-feb1ec993942-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25\" (UID: \"62a60a41-35bc-45db-91c0-feb1ec993942\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:44.331398 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62a60a41-35bc-45db-91c0-feb1ec993942-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25\" (UID: \"62a60a41-35bc-45db-91c0-feb1ec993942\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:44.348614 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftcnf\" (UniqueName: \"kubernetes.io/projected/62a60a41-35bc-45db-91c0-feb1ec993942-kube-api-access-ftcnf\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25\" (UID: \"62a60a41-35bc-45db-91c0-feb1ec993942\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:44.542421 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:48.928549 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21"
Dec 03 11:46:48 crc kubenswrapper[4702]: E1203 11:46:48.929489 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
Dec 03 11:46:48 crc kubenswrapper[4702]: I1203 11:46:48.985039 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k2b7f"]
Dec 03 11:46:49 crc kubenswrapper[4702]: I1203 11:46:49.090457 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25"]
Dec 03 11:46:49 crc kubenswrapper[4702]: W1203 11:46:49.103394 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62a60a41_35bc_45db_91c0_feb1ec993942.slice/crio-7932881a49fbbf64de63e5052a3e5cf3cc556aad48b20cd905ba573813af6250 WatchSource:0}: Error finding container 7932881a49fbbf64de63e5052a3e5cf3cc556aad48b20cd905ba573813af6250: Status 404 returned error can't find the container with id 7932881a49fbbf64de63e5052a3e5cf3cc556aad48b20cd905ba573813af6250
Dec 03 11:46:49 crc kubenswrapper[4702]: I1203 11:46:49.152860 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2b7f" event={"ID":"51782d60-42d3-47e2-aadb-dee3c07d518c","Type":"ContainerStarted","Data":"e6d9950ea3f7aeef216542d973a10b4dbb17cee47611df5b81d76f9ee66e3ab9"}
Dec 03 11:46:49 crc kubenswrapper[4702]: I1203 11:46:49.154850 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25" event={"ID":"62a60a41-35bc-45db-91c0-feb1ec993942","Type":"ContainerStarted","Data":"7932881a49fbbf64de63e5052a3e5cf3cc556aad48b20cd905ba573813af6250"}
Dec 03 11:46:50 crc kubenswrapper[4702]: I1203 11:46:50.166656 4702 generic.go:334] "Generic (PLEG): container finished" podID="51782d60-42d3-47e2-aadb-dee3c07d518c" containerID="c1120e15c0459df05fcc9b072c22d4614495ba84026d3a9a60829d06f61560ff" exitCode=0
Dec 03 11:46:50 crc kubenswrapper[4702]: I1203 11:46:50.166712 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2b7f" event={"ID":"51782d60-42d3-47e2-aadb-dee3c07d518c","Type":"ContainerDied","Data":"c1120e15c0459df05fcc9b072c22d4614495ba84026d3a9a60829d06f61560ff"}
Dec 03 11:46:51 crc kubenswrapper[4702]: I1203 11:46:51.181594 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25" event={"ID":"62a60a41-35bc-45db-91c0-feb1ec993942","Type":"ContainerStarted","Data":"2c2be6bdb3457570f4889572f425a2e41a32ec9c1a16c625ef7128598af3cfd8"}
Dec 03 11:46:51 crc kubenswrapper[4702]: I1203 11:46:51.202229 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25" podStartSLOduration=6.004079188 podStartE2EDuration="7.202208103s" podCreationTimestamp="2025-12-03 11:46:44 +0000 UTC" firstStartedPulling="2025-12-03 11:46:49.107658416 +0000 UTC m=+2592.943586880" lastFinishedPulling="2025-12-03 11:46:50.305787331 +0000 UTC m=+2594.141715795" observedRunningTime="2025-12-03 11:46:51.200327969 +0000 UTC m=+2595.036256463" watchObservedRunningTime="2025-12-03 11:46:51.202208103 +0000 UTC m=+2595.038136567"
Dec 03 11:46:52 crc kubenswrapper[4702]: I1203 11:46:52.194362 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2b7f" event={"ID":"51782d60-42d3-47e2-aadb-dee3c07d518c","Type":"ContainerStarted","Data":"9782e0b7d05a04fcf9a211578bb1851d1e7f076a0cb5f1412758a079a71c4933"}
Dec 03 11:46:56 crc kubenswrapper[4702]: E1203 11:46:56.978605 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51782d60_42d3_47e2_aadb_dee3c07d518c.slice/crio-conmon-9782e0b7d05a04fcf9a211578bb1851d1e7f076a0cb5f1412758a079a71c4933.scope\": RecentStats: unable to find data in memory cache]"
Dec 03 11:46:57 crc kubenswrapper[4702]: I1203 11:46:57.259211 4702 generic.go:334] "Generic (PLEG): container finished" podID="51782d60-42d3-47e2-aadb-dee3c07d518c" containerID="9782e0b7d05a04fcf9a211578bb1851d1e7f076a0cb5f1412758a079a71c4933" exitCode=0
Dec 03 11:46:57 crc kubenswrapper[4702]: I1203 11:46:57.259268 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2b7f" event={"ID":"51782d60-42d3-47e2-aadb-dee3c07d518c","Type":"ContainerDied","Data":"9782e0b7d05a04fcf9a211578bb1851d1e7f076a0cb5f1412758a079a71c4933"}
Dec 03 11:46:58 crc kubenswrapper[4702]: I1203 11:46:58.305260 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2b7f" event={"ID":"51782d60-42d3-47e2-aadb-dee3c07d518c","Type":"ContainerStarted","Data":"f428e8765abcf924706a1b1da9c4e2dad3fb412e4bf72d2389082437bf2a8e17"}
Dec 03 11:46:58 crc kubenswrapper[4702]: I1203 11:46:58.332335 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k2b7f" podStartSLOduration=7.680912852 podStartE2EDuration="15.332311814s" podCreationTimestamp="2025-12-03 11:46:43 +0000 UTC" firstStartedPulling="2025-12-03 11:46:50.230782177 +0000 UTC m=+2594.066710641" lastFinishedPulling="2025-12-03 11:46:57.882181139 +0000 UTC m=+2601.718109603" observedRunningTime="2025-12-03 11:46:58.32374731 +0000 UTC m=+2602.159675774" watchObservedRunningTime="2025-12-03 11:46:58.332311814 +0000 UTC m=+2602.168240268"
Dec 03 11:47:01 crc kubenswrapper[4702]: I1203 11:47:01.928255 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21"
Dec 03 11:47:01 crc kubenswrapper[4702]: E1203 11:47:01.929448 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
Dec 03 11:47:02 crc kubenswrapper[4702]: I1203 11:47:02.352509 4702 generic.go:334] "Generic (PLEG): container finished" podID="62a60a41-35bc-45db-91c0-feb1ec993942" containerID="2c2be6bdb3457570f4889572f425a2e41a32ec9c1a16c625ef7128598af3cfd8" exitCode=0
Dec 03 11:47:02 crc
kubenswrapper[4702]: I1203 11:47:02.352571 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25" event={"ID":"62a60a41-35bc-45db-91c0-feb1ec993942","Type":"ContainerDied","Data":"2c2be6bdb3457570f4889572f425a2e41a32ec9c1a16c625ef7128598af3cfd8"} Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.042668 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k2b7f" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.042952 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k2b7f" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.066684 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.186495 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62a60a41-35bc-45db-91c0-feb1ec993942-ssh-key\") pod \"62a60a41-35bc-45db-91c0-feb1ec993942\" (UID: \"62a60a41-35bc-45db-91c0-feb1ec993942\") " Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.186935 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftcnf\" (UniqueName: \"kubernetes.io/projected/62a60a41-35bc-45db-91c0-feb1ec993942-kube-api-access-ftcnf\") pod \"62a60a41-35bc-45db-91c0-feb1ec993942\" (UID: \"62a60a41-35bc-45db-91c0-feb1ec993942\") " Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.187034 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62a60a41-35bc-45db-91c0-feb1ec993942-inventory\") pod \"62a60a41-35bc-45db-91c0-feb1ec993942\" (UID: \"62a60a41-35bc-45db-91c0-feb1ec993942\") " Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.202276 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62a60a41-35bc-45db-91c0-feb1ec993942-kube-api-access-ftcnf" (OuterVolumeSpecName: "kube-api-access-ftcnf") pod "62a60a41-35bc-45db-91c0-feb1ec993942" (UID: "62a60a41-35bc-45db-91c0-feb1ec993942"). InnerVolumeSpecName "kube-api-access-ftcnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.242015 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62a60a41-35bc-45db-91c0-feb1ec993942-inventory" (OuterVolumeSpecName: "inventory") pod "62a60a41-35bc-45db-91c0-feb1ec993942" (UID: "62a60a41-35bc-45db-91c0-feb1ec993942"). InnerVolumeSpecName "inventory". 
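Note: the two "Observed pod startup duration" tracker entries above are internally consistent: podStartE2EDuration runs from podCreationTimestamp to watchObservedRunningTime, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling - firstStartedPulling). For the reboot-os pod: 7.202208103s - 1.198128915s = 6.004079188s. A short check in Go, with the timestamps copied verbatim from the log:

```go
// Verify podStartSLOduration = podStartE2EDuration - image-pull time,
// using the reboot-os pod's tracker entry above.
package main

import (
	"fmt"
	"time"
)

const layout = "2006-01-02 15:04:05.999999999 -0700 MST" // Go's time.Time.String() format

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-12-03 11:46:44 +0000 UTC")   // podCreationTimestamp
	firstPull := mustParse("2025-12-03 11:46:49.107658416 +0000 UTC")
	lastPull := mustParse("2025-12-03 11:46:50.305787331 +0000 UTC")
	observed := mustParse("2025-12-03 11:46:51.202208103 +0000 UTC") // watchObservedRunningTime

	e2e := observed.Sub(created)
	slo := e2e - lastPull.Sub(firstPull) // SLO duration excludes pull time
	fmt.Println(e2e, slo)                // 7.202208103s 6.004079188s, matching the logged values
}
```

The redhat-operators-k2b7f entry satisfies the same relation: 15.332311814s - 7.651398962s = 7.680912852s.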
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.367331 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftcnf\" (UniqueName: \"kubernetes.io/projected/62a60a41-35bc-45db-91c0-feb1ec993942-kube-api-access-ftcnf\") on node \"crc\" DevicePath \"\"" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.367378 4702 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/62a60a41-35bc-45db-91c0-feb1ec993942-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.368795 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62a60a41-35bc-45db-91c0-feb1ec993942-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "62a60a41-35bc-45db-91c0-feb1ec993942" (UID: "62a60a41-35bc-45db-91c0-feb1ec993942"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.404544 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25" event={"ID":"62a60a41-35bc-45db-91c0-feb1ec993942","Type":"ContainerDied","Data":"7932881a49fbbf64de63e5052a3e5cf3cc556aad48b20cd905ba573813af6250"} Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.404596 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7932881a49fbbf64de63e5052a3e5cf3cc556aad48b20cd905ba573813af6250" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.404674 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.472081 4702 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/62a60a41-35bc-45db-91c0-feb1ec993942-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.495091 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt"] Dec 03 11:47:04 crc kubenswrapper[4702]: E1203 11:47:04.495643 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a60a41-35bc-45db-91c0-feb1ec993942" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.495662 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a60a41-35bc-45db-91c0-feb1ec993942" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.495969 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="62a60a41-35bc-45db-91c0-feb1ec993942" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.496934 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.502525 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.502631 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.503314 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.503364 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.503480 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.503574 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.503609 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.503876 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zmjkp" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.503963 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.522358 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt"] Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.677955 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.678085 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.678131 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.678180 4702 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.678208 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.678239 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.678340 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.678385 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.678425 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.678448 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.678506 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.678544 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcdfb\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-kube-api-access-rcdfb\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.678571 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.678649 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.678683 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.678710 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.776610 4702 scope.go:117] "RemoveContainer" containerID="5a478d692d47e14d886868adc7feb0d40d44c76b680c276ad3b4f414b268636a" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.780783 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: 
I1203 11:47:04.780820 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.780875 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.780904 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcdfb\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-kube-api-access-rcdfb\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.780927 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.780980 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.781015 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.781036 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.781080 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.781124 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.781154 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.781188 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.781207 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.781229 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.781319 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.781353 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.786389 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.786791 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.788922 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.791532 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.792463 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.792623 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.793019 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.794015 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.796206 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.801006 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcdfb\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-kube-api-access-rcdfb\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.803876 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.804496 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.805459 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.813706 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.815555 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.821443 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-njkvt\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:04 crc kubenswrapper[4702]: I1203 11:47:04.838291 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:05 crc kubenswrapper[4702]: I1203 11:47:05.099138 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k2b7f" podUID="51782d60-42d3-47e2-aadb-dee3c07d518c" containerName="registry-server" probeResult="failure" output=< Dec 03 11:47:05 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 11:47:05 crc kubenswrapper[4702]: > Dec 03 11:47:05 crc kubenswrapper[4702]: I1203 11:47:05.446536 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt"] Dec 03 11:47:06 crc kubenswrapper[4702]: I1203 11:47:06.428747 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" event={"ID":"4064b583-7ee6-4ca3-9720-77129e43d3b9","Type":"ContainerStarted","Data":"2c1c4880271e3198063251ac948b7fd2035e255b409b34ba12a9bb1fcbf22fca"} Dec 03 11:47:06 crc kubenswrapper[4702]: I1203 11:47:06.429137 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" event={"ID":"4064b583-7ee6-4ca3-9720-77129e43d3b9","Type":"ContainerStarted","Data":"855a130162e00d5035a19e68103f12ec76f32ac39477e23ec5c6191a6e8e6b5f"} Dec 03 11:47:06 crc kubenswrapper[4702]: I1203 11:47:06.466634 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" podStartSLOduration=1.934758714 podStartE2EDuration="2.466617175s" podCreationTimestamp="2025-12-03 11:47:04 +0000 UTC" firstStartedPulling="2025-12-03 11:47:05.447202273 +0000 UTC m=+2609.283130737" lastFinishedPulling="2025-12-03 11:47:05.979060734 +0000 UTC m=+2609.814989198" observedRunningTime="2025-12-03 11:47:06.461716766 +0000 UTC m=+2610.297645230" watchObservedRunningTime="2025-12-03 11:47:06.466617175 +0000 UTC m=+2610.302545639" Dec 03 11:47:14 crc kubenswrapper[4702]: I1203 11:47:14.106935 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k2b7f" Dec 03 11:47:14 crc kubenswrapper[4702]: I1203 11:47:14.180846 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k2b7f" Dec 03 11:47:14 crc kubenswrapper[4702]: I1203 11:47:14.890863 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k2b7f"] Dec 03 11:47:15 crc kubenswrapper[4702]: I1203 11:47:15.703804 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k2b7f" podUID="51782d60-42d3-47e2-aadb-dee3c07d518c" containerName="registry-server" 
containerID="cri-o://f428e8765abcf924706a1b1da9c4e2dad3fb412e4bf72d2389082437bf2a8e17" gracePeriod=2 Dec 03 11:47:15 crc kubenswrapper[4702]: I1203 11:47:15.928831 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21" Dec 03 11:47:15 crc kubenswrapper[4702]: E1203 11:47:15.929604 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.218808 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k2b7f" Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.478910 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51782d60-42d3-47e2-aadb-dee3c07d518c-catalog-content\") pod \"51782d60-42d3-47e2-aadb-dee3c07d518c\" (UID: \"51782d60-42d3-47e2-aadb-dee3c07d518c\") " Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.479293 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmz4s\" (UniqueName: \"kubernetes.io/projected/51782d60-42d3-47e2-aadb-dee3c07d518c-kube-api-access-lmz4s\") pod \"51782d60-42d3-47e2-aadb-dee3c07d518c\" (UID: \"51782d60-42d3-47e2-aadb-dee3c07d518c\") " Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.479415 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51782d60-42d3-47e2-aadb-dee3c07d518c-utilities\") pod \"51782d60-42d3-47e2-aadb-dee3c07d518c\" (UID: \"51782d60-42d3-47e2-aadb-dee3c07d518c\") " Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.481177 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51782d60-42d3-47e2-aadb-dee3c07d518c-utilities" (OuterVolumeSpecName: "utilities") pod "51782d60-42d3-47e2-aadb-dee3c07d518c" (UID: "51782d60-42d3-47e2-aadb-dee3c07d518c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.494605 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51782d60-42d3-47e2-aadb-dee3c07d518c-kube-api-access-lmz4s" (OuterVolumeSpecName: "kube-api-access-lmz4s") pod "51782d60-42d3-47e2-aadb-dee3c07d518c" (UID: "51782d60-42d3-47e2-aadb-dee3c07d518c"). InnerVolumeSpecName "kube-api-access-lmz4s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.582993 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51782d60-42d3-47e2-aadb-dee3c07d518c-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.583030 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmz4s\" (UniqueName: \"kubernetes.io/projected/51782d60-42d3-47e2-aadb-dee3c07d518c-kube-api-access-lmz4s\") on node \"crc\" DevicePath \"\"" Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.592594 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51782d60-42d3-47e2-aadb-dee3c07d518c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51782d60-42d3-47e2-aadb-dee3c07d518c" (UID: "51782d60-42d3-47e2-aadb-dee3c07d518c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.685870 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51782d60-42d3-47e2-aadb-dee3c07d518c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.724055 4702 generic.go:334] "Generic (PLEG): container finished" podID="51782d60-42d3-47e2-aadb-dee3c07d518c" containerID="f428e8765abcf924706a1b1da9c4e2dad3fb412e4bf72d2389082437bf2a8e17" exitCode=0 Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.724181 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k2b7f" Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.724183 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2b7f" event={"ID":"51782d60-42d3-47e2-aadb-dee3c07d518c","Type":"ContainerDied","Data":"f428e8765abcf924706a1b1da9c4e2dad3fb412e4bf72d2389082437bf2a8e17"} Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.724597 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k2b7f" event={"ID":"51782d60-42d3-47e2-aadb-dee3c07d518c","Type":"ContainerDied","Data":"e6d9950ea3f7aeef216542d973a10b4dbb17cee47611df5b81d76f9ee66e3ab9"} Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.724645 4702 scope.go:117] "RemoveContainer" containerID="f428e8765abcf924706a1b1da9c4e2dad3fb412e4bf72d2389082437bf2a8e17" Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.771497 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k2b7f"] Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.776089 4702 scope.go:117] "RemoveContainer" containerID="9782e0b7d05a04fcf9a211578bb1851d1e7f076a0cb5f1412758a079a71c4933" Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.784168 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k2b7f"] Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.819478 4702 scope.go:117] "RemoveContainer" containerID="c1120e15c0459df05fcc9b072c22d4614495ba84026d3a9a60829d06f61560ff" Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.857073 4702 scope.go:117] "RemoveContainer" containerID="f428e8765abcf924706a1b1da9c4e2dad3fb412e4bf72d2389082437bf2a8e17" Dec 03 11:47:16 crc kubenswrapper[4702]: E1203 11:47:16.857436 4702 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f428e8765abcf924706a1b1da9c4e2dad3fb412e4bf72d2389082437bf2a8e17\": container with ID starting with f428e8765abcf924706a1b1da9c4e2dad3fb412e4bf72d2389082437bf2a8e17 not found: ID does not exist" containerID="f428e8765abcf924706a1b1da9c4e2dad3fb412e4bf72d2389082437bf2a8e17" Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.857469 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f428e8765abcf924706a1b1da9c4e2dad3fb412e4bf72d2389082437bf2a8e17"} err="failed to get container status \"f428e8765abcf924706a1b1da9c4e2dad3fb412e4bf72d2389082437bf2a8e17\": rpc error: code = NotFound desc = could not find container \"f428e8765abcf924706a1b1da9c4e2dad3fb412e4bf72d2389082437bf2a8e17\": container with ID starting with f428e8765abcf924706a1b1da9c4e2dad3fb412e4bf72d2389082437bf2a8e17 not found: ID does not exist" Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.857498 4702 scope.go:117] "RemoveContainer" containerID="9782e0b7d05a04fcf9a211578bb1851d1e7f076a0cb5f1412758a079a71c4933" Dec 03 11:47:16 crc kubenswrapper[4702]: E1203 11:47:16.857736 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9782e0b7d05a04fcf9a211578bb1851d1e7f076a0cb5f1412758a079a71c4933\": container with ID starting with 9782e0b7d05a04fcf9a211578bb1851d1e7f076a0cb5f1412758a079a71c4933 not found: ID does not exist" containerID="9782e0b7d05a04fcf9a211578bb1851d1e7f076a0cb5f1412758a079a71c4933" Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.857775 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9782e0b7d05a04fcf9a211578bb1851d1e7f076a0cb5f1412758a079a71c4933"} err="failed to get container status \"9782e0b7d05a04fcf9a211578bb1851d1e7f076a0cb5f1412758a079a71c4933\": rpc error: code = NotFound desc = could not find container \"9782e0b7d05a04fcf9a211578bb1851d1e7f076a0cb5f1412758a079a71c4933\": container with ID starting with 9782e0b7d05a04fcf9a211578bb1851d1e7f076a0cb5f1412758a079a71c4933 not found: ID does not exist" Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.857788 4702 scope.go:117] "RemoveContainer" containerID="c1120e15c0459df05fcc9b072c22d4614495ba84026d3a9a60829d06f61560ff" Dec 03 11:47:16 crc kubenswrapper[4702]: E1203 11:47:16.858075 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1120e15c0459df05fcc9b072c22d4614495ba84026d3a9a60829d06f61560ff\": container with ID starting with c1120e15c0459df05fcc9b072c22d4614495ba84026d3a9a60829d06f61560ff not found: ID does not exist" containerID="c1120e15c0459df05fcc9b072c22d4614495ba84026d3a9a60829d06f61560ff" Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.858093 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1120e15c0459df05fcc9b072c22d4614495ba84026d3a9a60829d06f61560ff"} err="failed to get container status \"c1120e15c0459df05fcc9b072c22d4614495ba84026d3a9a60829d06f61560ff\": rpc error: code = NotFound desc = could not find container \"c1120e15c0459df05fcc9b072c22d4614495ba84026d3a9a60829d06f61560ff\": container with ID starting with c1120e15c0459df05fcc9b072c22d4614495ba84026d3a9a60829d06f61560ff not found: ID does not exist" Dec 03 11:47:16 crc kubenswrapper[4702]: I1203 11:47:16.945988 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod 
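Note: the three "ContainerStatus from runtime service failed ... NotFound" errors above are a benign race: RemoveContainer looks up each of the pod's old container IDs after the runtime has already deleted them, so the follow-up DeleteContainer has nothing left to do. A sketch of the idempotent-removal pattern this implies; the types and helper are illustrative, not CRI-O's API:

```go
// Treat "not found" as success when removing an already-deleted container.
package main

import (
	"errors"
	"fmt"
)

var errNotFound = errors.New("NotFound")

// removeContainer stands in for a runtime call; the container is already
// gone, so the status lookup fails with NotFound.
func removeContainer(id string) error {
	return fmt.Errorf("failed to get container status %q: %w", id, errNotFound)
}

func main() {
	id := "f428e8765abcf924706a1b1da9c4e2dad3fb412e4bf72d2389082437bf2a8e17"
	if err := removeContainer(id); errors.Is(err, errNotFound) {
		// Already removed by a concurrent cleanup path: nothing left to do.
		fmt.Println("container already gone, treating removal as complete")
	} else if err != nil {
		fmt.Println("real failure:", err)
	}
}
```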
volumes dir" podUID="51782d60-42d3-47e2-aadb-dee3c07d518c" path="/var/lib/kubelet/pods/51782d60-42d3-47e2-aadb-dee3c07d518c/volumes" Dec 03 11:47:29 crc kubenswrapper[4702]: I1203 11:47:29.929005 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21" Dec 03 11:47:29 crc kubenswrapper[4702]: E1203 11:47:29.929885 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:47:44 crc kubenswrapper[4702]: I1203 11:47:44.929270 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21" Dec 03 11:47:44 crc kubenswrapper[4702]: E1203 11:47:44.930321 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:47:56 crc kubenswrapper[4702]: I1203 11:47:56.928992 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21" Dec 03 11:47:56 crc kubenswrapper[4702]: E1203 11:47:56.930308 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:47:57 crc kubenswrapper[4702]: I1203 11:47:57.630205 4702 generic.go:334] "Generic (PLEG): container finished" podID="4064b583-7ee6-4ca3-9720-77129e43d3b9" containerID="2c1c4880271e3198063251ac948b7fd2035e255b409b34ba12a9bb1fcbf22fca" exitCode=0 Dec 03 11:47:57 crc kubenswrapper[4702]: I1203 11:47:57.630273 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" event={"ID":"4064b583-7ee6-4ca3-9720-77129e43d3b9","Type":"ContainerDied","Data":"2c1c4880271e3198063251ac948b7fd2035e255b409b34ba12a9bb1fcbf22fca"} Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.200556 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.360471 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"4064b583-7ee6-4ca3-9720-77129e43d3b9\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.360655 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"4064b583-7ee6-4ca3-9720-77129e43d3b9\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.360701 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-neutron-metadata-combined-ca-bundle\") pod \"4064b583-7ee6-4ca3-9720-77129e43d3b9\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.360775 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-bootstrap-combined-ca-bundle\") pod \"4064b583-7ee6-4ca3-9720-77129e43d3b9\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.360912 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-ovn-default-certs-0\") pod \"4064b583-7ee6-4ca3-9720-77129e43d3b9\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.360943 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-nova-combined-ca-bundle\") pod \"4064b583-7ee6-4ca3-9720-77129e43d3b9\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.360980 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"4064b583-7ee6-4ca3-9720-77129e43d3b9\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.361017 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcdfb\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-kube-api-access-rcdfb\") pod \"4064b583-7ee6-4ca3-9720-77129e43d3b9\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.361061 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-repo-setup-combined-ca-bundle\") 
pod \"4064b583-7ee6-4ca3-9720-77129e43d3b9\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.361092 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-ssh-key\") pod \"4064b583-7ee6-4ca3-9720-77129e43d3b9\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.361144 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-inventory\") pod \"4064b583-7ee6-4ca3-9720-77129e43d3b9\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.361213 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-ovn-combined-ca-bundle\") pod \"4064b583-7ee6-4ca3-9720-77129e43d3b9\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.361259 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-telemetry-power-monitoring-combined-ca-bundle\") pod \"4064b583-7ee6-4ca3-9720-77129e43d3b9\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.361304 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"4064b583-7ee6-4ca3-9720-77129e43d3b9\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.361346 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-libvirt-combined-ca-bundle\") pod \"4064b583-7ee6-4ca3-9720-77129e43d3b9\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.361383 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-telemetry-combined-ca-bundle\") pod \"4064b583-7ee6-4ca3-9720-77129e43d3b9\" (UID: \"4064b583-7ee6-4ca3-9720-77129e43d3b9\") " Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.369600 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "4064b583-7ee6-4ca3-9720-77129e43d3b9" (UID: "4064b583-7ee6-4ca3-9720-77129e43d3b9"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.370767 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "4064b583-7ee6-4ca3-9720-77129e43d3b9" (UID: "4064b583-7ee6-4ca3-9720-77129e43d3b9"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.370854 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "4064b583-7ee6-4ca3-9720-77129e43d3b9" (UID: "4064b583-7ee6-4ca3-9720-77129e43d3b9"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.372001 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "4064b583-7ee6-4ca3-9720-77129e43d3b9" (UID: "4064b583-7ee6-4ca3-9720-77129e43d3b9"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.373539 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "4064b583-7ee6-4ca3-9720-77129e43d3b9" (UID: "4064b583-7ee6-4ca3-9720-77129e43d3b9"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.374750 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "4064b583-7ee6-4ca3-9720-77129e43d3b9" (UID: "4064b583-7ee6-4ca3-9720-77129e43d3b9"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.375457 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "4064b583-7ee6-4ca3-9720-77129e43d3b9" (UID: "4064b583-7ee6-4ca3-9720-77129e43d3b9"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.375491 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-kube-api-access-rcdfb" (OuterVolumeSpecName: "kube-api-access-rcdfb") pod "4064b583-7ee6-4ca3-9720-77129e43d3b9" (UID: "4064b583-7ee6-4ca3-9720-77129e43d3b9"). InnerVolumeSpecName "kube-api-access-rcdfb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.375711 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "4064b583-7ee6-4ca3-9720-77129e43d3b9" (UID: "4064b583-7ee6-4ca3-9720-77129e43d3b9"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.376373 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "4064b583-7ee6-4ca3-9720-77129e43d3b9" (UID: "4064b583-7ee6-4ca3-9720-77129e43d3b9"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.376410 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "4064b583-7ee6-4ca3-9720-77129e43d3b9" (UID: "4064b583-7ee6-4ca3-9720-77129e43d3b9"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.377426 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "4064b583-7ee6-4ca3-9720-77129e43d3b9" (UID: "4064b583-7ee6-4ca3-9720-77129e43d3b9"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.377834 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "4064b583-7ee6-4ca3-9720-77129e43d3b9" (UID: "4064b583-7ee6-4ca3-9720-77129e43d3b9"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.379121 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "4064b583-7ee6-4ca3-9720-77129e43d3b9" (UID: "4064b583-7ee6-4ca3-9720-77129e43d3b9"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.587528 4702 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.587572 4702 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.587590 4702 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.587605 4702 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.587617 4702 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.587628 4702 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.587639 4702 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.587653 4702 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.587667 4702 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.587678 4702 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.587690 4702 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.587701 4702 
reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.587712 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcdfb\" (UniqueName: \"kubernetes.io/projected/4064b583-7ee6-4ca3-9720-77129e43d3b9-kube-api-access-rcdfb\") on node \"crc\" DevicePath \"\"" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.587724 4702 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.595869 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4064b583-7ee6-4ca3-9720-77129e43d3b9" (UID: "4064b583-7ee6-4ca3-9720-77129e43d3b9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.606564 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-inventory" (OuterVolumeSpecName: "inventory") pod "4064b583-7ee6-4ca3-9720-77129e43d3b9" (UID: "4064b583-7ee6-4ca3-9720-77129e43d3b9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.654637 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" event={"ID":"4064b583-7ee6-4ca3-9720-77129e43d3b9","Type":"ContainerDied","Data":"855a130162e00d5035a19e68103f12ec76f32ac39477e23ec5c6191a6e8e6b5f"} Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.654687 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="855a130162e00d5035a19e68103f12ec76f32ac39477e23ec5c6191a6e8e6b5f" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.654738 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-njkvt" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.689377 4702 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.689427 4702 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4064b583-7ee6-4ca3-9720-77129e43d3b9-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.769341 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh"] Dec 03 11:47:59 crc kubenswrapper[4702]: E1203 11:47:59.769989 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4064b583-7ee6-4ca3-9720-77129e43d3b9" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.770024 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="4064b583-7ee6-4ca3-9720-77129e43d3b9" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 11:47:59 crc kubenswrapper[4702]: E1203 11:47:59.770048 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51782d60-42d3-47e2-aadb-dee3c07d518c" containerName="registry-server" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.770054 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="51782d60-42d3-47e2-aadb-dee3c07d518c" containerName="registry-server" Dec 03 11:47:59 crc kubenswrapper[4702]: E1203 11:47:59.770091 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51782d60-42d3-47e2-aadb-dee3c07d518c" containerName="extract-content" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.770097 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="51782d60-42d3-47e2-aadb-dee3c07d518c" containerName="extract-content" Dec 03 11:47:59 crc kubenswrapper[4702]: E1203 11:47:59.770118 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51782d60-42d3-47e2-aadb-dee3c07d518c" containerName="extract-utilities" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.770124 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="51782d60-42d3-47e2-aadb-dee3c07d518c" containerName="extract-utilities" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.770393 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="4064b583-7ee6-4ca3-9720-77129e43d3b9" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.770426 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="51782d60-42d3-47e2-aadb-dee3c07d518c" containerName="registry-server" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.773529 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.775574 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.775884 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.776020 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.776401 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.776498 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zmjkp" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.783870 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh"] Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.796749 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jwdsh\" (UID: \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.796835 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2xpd\" (UniqueName: \"kubernetes.io/projected/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-kube-api-access-l2xpd\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jwdsh\" (UID: \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.796881 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jwdsh\" (UID: \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.796987 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jwdsh\" (UID: \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.797180 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jwdsh\" (UID: \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.899723 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jwdsh\" (UID: \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.899929 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jwdsh\" (UID: \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.900011 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jwdsh\" (UID: \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.900475 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2xpd\" (UniqueName: \"kubernetes.io/projected/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-kube-api-access-l2xpd\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jwdsh\" (UID: \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.900526 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jwdsh\" (UID: \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.900909 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jwdsh\" (UID: \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.904636 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jwdsh\" (UID: \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.904671 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jwdsh\" (UID: \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.916897 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jwdsh\" (UID: \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\") 
" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" Dec 03 11:47:59 crc kubenswrapper[4702]: I1203 11:47:59.922739 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2xpd\" (UniqueName: \"kubernetes.io/projected/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-kube-api-access-l2xpd\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-jwdsh\" (UID: \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" Dec 03 11:48:00 crc kubenswrapper[4702]: I1203 11:48:00.096667 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" Dec 03 11:48:00 crc kubenswrapper[4702]: I1203 11:48:00.747381 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh"] Dec 03 11:48:01 crc kubenswrapper[4702]: I1203 11:48:01.698770 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" event={"ID":"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4","Type":"ContainerStarted","Data":"f205ebccdd1fe6bcb82d38c068877aad923a636a63ef65e150aa06fa9d80ced6"} Dec 03 11:48:01 crc kubenswrapper[4702]: I1203 11:48:01.699221 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" event={"ID":"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4","Type":"ContainerStarted","Data":"b1db1e26079cafe8061c37cc66fee94be3993bc010994bc5212b9f7edccf007f"} Dec 03 11:48:01 crc kubenswrapper[4702]: I1203 11:48:01.726622 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" podStartSLOduration=2.114620472 podStartE2EDuration="2.726595485s" podCreationTimestamp="2025-12-03 11:47:59 +0000 UTC" firstStartedPulling="2025-12-03 11:48:00.763822954 +0000 UTC m=+2664.599751418" lastFinishedPulling="2025-12-03 11:48:01.375797957 +0000 UTC m=+2665.211726431" observedRunningTime="2025-12-03 11:48:01.720240694 +0000 UTC m=+2665.556169168" watchObservedRunningTime="2025-12-03 11:48:01.726595485 +0000 UTC m=+2665.562523949" Dec 03 11:48:09 crc kubenswrapper[4702]: I1203 11:48:09.929118 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21" Dec 03 11:48:09 crc kubenswrapper[4702]: E1203 11:48:09.929943 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:48:22 crc kubenswrapper[4702]: I1203 11:48:22.929793 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21" Dec 03 11:48:22 crc kubenswrapper[4702]: E1203 11:48:22.930571 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:48:34 crc 
kubenswrapper[4702]: I1203 11:48:34.928978 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21" Dec 03 11:48:34 crc kubenswrapper[4702]: E1203 11:48:34.929913 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:48:46 crc kubenswrapper[4702]: I1203 11:48:46.963358 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21" Dec 03 11:48:46 crc kubenswrapper[4702]: E1203 11:48:46.970889 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:49:01 crc kubenswrapper[4702]: I1203 11:49:01.929240 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21" Dec 03 11:49:01 crc kubenswrapper[4702]: E1203 11:49:01.931394 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:49:09 crc kubenswrapper[4702]: I1203 11:49:09.616024 4702 generic.go:334] "Generic (PLEG): container finished" podID="81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4" containerID="f205ebccdd1fe6bcb82d38c068877aad923a636a63ef65e150aa06fa9d80ced6" exitCode=0 Dec 03 11:49:09 crc kubenswrapper[4702]: I1203 11:49:09.616111 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" event={"ID":"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4","Type":"ContainerDied","Data":"f205ebccdd1fe6bcb82d38c068877aad923a636a63ef65e150aa06fa9d80ced6"} Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.569691 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.625234 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-ssh-key\") pod \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\" (UID: \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\") " Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.625432 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2xpd\" (UniqueName: \"kubernetes.io/projected/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-kube-api-access-l2xpd\") pod \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\" (UID: \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\") " Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.625638 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-inventory\") pod \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\" (UID: \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\") " Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.625672 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-ovncontroller-config-0\") pod \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\" (UID: \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\") " Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.625721 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-ovn-combined-ca-bundle\") pod \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\" (UID: \"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4\") " Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.635677 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4" (UID: "81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.642639 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-kube-api-access-l2xpd" (OuterVolumeSpecName: "kube-api-access-l2xpd") pod "81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4" (UID: "81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4"). InnerVolumeSpecName "kube-api-access-l2xpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.666402 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" event={"ID":"81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4","Type":"ContainerDied","Data":"b1db1e26079cafe8061c37cc66fee94be3993bc010994bc5212b9f7edccf007f"} Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.666472 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1db1e26079cafe8061c37cc66fee94be3993bc010994bc5212b9f7edccf007f" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.666547 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-jwdsh" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.687841 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4" (UID: "81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.711196 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4" (UID: "81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.713815 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-inventory" (OuterVolumeSpecName: "inventory") pod "81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4" (UID: "81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.731923 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2xpd\" (UniqueName: \"kubernetes.io/projected/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-kube-api-access-l2xpd\") on node \"crc\" DevicePath \"\"" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.731953 4702 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.731964 4702 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.731974 4702 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.731982 4702 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.831067 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg"] Dec 03 11:49:11 crc kubenswrapper[4702]: E1203 11:49:11.831930 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.831957 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.832278 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4" 
containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.833454 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.835960 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.836234 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.845585 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg"] Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.936233 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqxmm\" (UniqueName: \"kubernetes.io/projected/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-kube-api-access-pqxmm\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.936293 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.936324 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.936491 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.936566 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" Dec 03 11:49:11 crc kubenswrapper[4702]: I1203 11:49:11.936776 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-ssh-key\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" Dec 03 11:49:12 crc kubenswrapper[4702]: I1203 11:49:12.040525 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" Dec 03 11:49:12 crc kubenswrapper[4702]: I1203 11:49:12.040678 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" Dec 03 11:49:12 crc kubenswrapper[4702]: I1203 11:49:12.040822 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqxmm\" (UniqueName: \"kubernetes.io/projected/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-kube-api-access-pqxmm\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" Dec 03 11:49:12 crc kubenswrapper[4702]: I1203 11:49:12.040882 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" Dec 03 11:49:12 crc kubenswrapper[4702]: I1203 11:49:12.040939 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" Dec 03 11:49:12 crc kubenswrapper[4702]: I1203 11:49:12.041599 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" Dec 03 11:49:12 crc kubenswrapper[4702]: I1203 11:49:12.049433 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" Dec 03 11:49:12 crc kubenswrapper[4702]: I1203 11:49:12.050864 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" Dec 03 11:49:12 crc kubenswrapper[4702]: I1203 11:49:12.052484 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" Dec 03 11:49:12 crc kubenswrapper[4702]: I1203 11:49:12.053672 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" Dec 03 11:49:12 crc kubenswrapper[4702]: I1203 11:49:12.057687 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" Dec 03 11:49:12 crc kubenswrapper[4702]: I1203 11:49:12.063416 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqxmm\" (UniqueName: \"kubernetes.io/projected/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-kube-api-access-pqxmm\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" Dec 03 11:49:12 crc kubenswrapper[4702]: I1203 11:49:12.165976 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" Dec 03 11:49:12 crc kubenswrapper[4702]: I1203 11:49:12.782162 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg"] Dec 03 11:49:12 crc kubenswrapper[4702]: I1203 11:49:12.792083 4702 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 11:49:13 crc kubenswrapper[4702]: I1203 11:49:13.692111 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" event={"ID":"b7bb0467-80d8-4cb2-a515-94eba3b4acdc","Type":"ContainerStarted","Data":"1de71c80b450a943649122c9cfe4f92fb656cf42d166ff72e8383de0ead52f92"} Dec 03 11:49:13 crc kubenswrapper[4702]: I1203 11:49:13.692610 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" event={"ID":"b7bb0467-80d8-4cb2-a515-94eba3b4acdc","Type":"ContainerStarted","Data":"59042177d7ff021da6a5e32f35cf539f79e2b5e9f133792a0e6a5cdf023634ea"} Dec 03 11:49:13 crc kubenswrapper[4702]: I1203 11:49:13.723686 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" podStartSLOduration=2.238640284 podStartE2EDuration="2.723644848s" podCreationTimestamp="2025-12-03 11:49:11 +0000 UTC" firstStartedPulling="2025-12-03 11:49:12.791700116 +0000 UTC m=+2736.627628580" lastFinishedPulling="2025-12-03 11:49:13.27670468 +0000 UTC m=+2737.112633144" observedRunningTime="2025-12-03 11:49:13.712954504 +0000 UTC m=+2737.548882998" watchObservedRunningTime="2025-12-03 11:49:13.723644848 +0000 UTC m=+2737.559573322" Dec 03 11:49:15 crc kubenswrapper[4702]: I1203 11:49:15.929899 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21" Dec 03 11:49:15 crc kubenswrapper[4702]: E1203 11:49:15.930592 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:49:28 crc kubenswrapper[4702]: I1203 11:49:28.948335 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21" Dec 03 11:49:29 crc kubenswrapper[4702]: I1203 11:49:29.907583 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"b72e22f3021fc7c15497f9bdd6c2f451d0898c94753e6384c425597e057fad0e"} Dec 03 11:50:06 crc kubenswrapper[4702]: I1203 11:50:06.508005 4702 generic.go:334] "Generic (PLEG): container finished" podID="b7bb0467-80d8-4cb2-a515-94eba3b4acdc" containerID="1de71c80b450a943649122c9cfe4f92fb656cf42d166ff72e8383de0ead52f92" exitCode=0 Dec 03 11:50:06 crc kubenswrapper[4702]: I1203 11:50:06.508062 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" 
event={"ID":"b7bb0467-80d8-4cb2-a515-94eba3b4acdc","Type":"ContainerDied","Data":"1de71c80b450a943649122c9cfe4f92fb656cf42d166ff72e8383de0ead52f92"} Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.214668 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.397332 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqxmm\" (UniqueName: \"kubernetes.io/projected/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-kube-api-access-pqxmm\") pod \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.397692 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-ssh-key\") pod \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.397747 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-inventory\") pod \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.397921 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-neutron-metadata-combined-ca-bundle\") pod \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.398153 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-nova-metadata-neutron-config-0\") pod \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.398214 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-neutron-ovn-metadata-agent-neutron-config-0\") pod \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\" (UID: \"b7bb0467-80d8-4cb2-a515-94eba3b4acdc\") " Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.405402 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-kube-api-access-pqxmm" (OuterVolumeSpecName: "kube-api-access-pqxmm") pod "b7bb0467-80d8-4cb2-a515-94eba3b4acdc" (UID: "b7bb0467-80d8-4cb2-a515-94eba3b4acdc"). InnerVolumeSpecName "kube-api-access-pqxmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.405798 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "b7bb0467-80d8-4cb2-a515-94eba3b4acdc" (UID: "b7bb0467-80d8-4cb2-a515-94eba3b4acdc"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.440881 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "b7bb0467-80d8-4cb2-a515-94eba3b4acdc" (UID: "b7bb0467-80d8-4cb2-a515-94eba3b4acdc"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.442866 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "b7bb0467-80d8-4cb2-a515-94eba3b4acdc" (UID: "b7bb0467-80d8-4cb2-a515-94eba3b4acdc"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.449266 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-inventory" (OuterVolumeSpecName: "inventory") pod "b7bb0467-80d8-4cb2-a515-94eba3b4acdc" (UID: "b7bb0467-80d8-4cb2-a515-94eba3b4acdc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.460941 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b7bb0467-80d8-4cb2-a515-94eba3b4acdc" (UID: "b7bb0467-80d8-4cb2-a515-94eba3b4acdc"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.513953 4702 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.514012 4702 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.514033 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqxmm\" (UniqueName: \"kubernetes.io/projected/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-kube-api-access-pqxmm\") on node \"crc\" DevicePath \"\"" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.514051 4702 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.514067 4702 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.514086 4702 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7bb0467-80d8-4cb2-a515-94eba3b4acdc-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.709981 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" event={"ID":"b7bb0467-80d8-4cb2-a515-94eba3b4acdc","Type":"ContainerDied","Data":"59042177d7ff021da6a5e32f35cf539f79e2b5e9f133792a0e6a5cdf023634ea"} Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.710022 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59042177d7ff021da6a5e32f35cf539f79e2b5e9f133792a0e6a5cdf023634ea" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.710082 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.808702 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4"] Dec 03 11:50:08 crc kubenswrapper[4702]: E1203 11:50:08.809896 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7bb0467-80d8-4cb2-a515-94eba3b4acdc" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.809933 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7bb0467-80d8-4cb2-a515-94eba3b4acdc" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.810503 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7bb0467-80d8-4cb2-a515-94eba3b4acdc" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.812348 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.815178 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.815273 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.816539 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.818361 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zmjkp" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.825393 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.825555 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4"] Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.924097 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw4kb\" (UniqueName: \"kubernetes.io/projected/145c8ea5-5aaf-4017-8416-e346f9c95523-kube-api-access-nw4kb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4\" (UID: \"145c8ea5-5aaf-4017-8416-e346f9c95523\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.924263 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4\" (UID: \"145c8ea5-5aaf-4017-8416-e346f9c95523\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.924423 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4\" (UID: \"145c8ea5-5aaf-4017-8416-e346f9c95523\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.924540 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4\" (UID: \"145c8ea5-5aaf-4017-8416-e346f9c95523\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" Dec 03 11:50:08 crc kubenswrapper[4702]: I1203 11:50:08.924605 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4\" (UID: \"145c8ea5-5aaf-4017-8416-e346f9c95523\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" Dec 03 11:50:09 crc kubenswrapper[4702]: I1203 11:50:09.026987 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4\" (UID: \"145c8ea5-5aaf-4017-8416-e346f9c95523\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" Dec 03 11:50:09 crc kubenswrapper[4702]: I1203 11:50:09.027244 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw4kb\" (UniqueName: \"kubernetes.io/projected/145c8ea5-5aaf-4017-8416-e346f9c95523-kube-api-access-nw4kb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4\" (UID: \"145c8ea5-5aaf-4017-8416-e346f9c95523\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" Dec 03 11:50:09 crc kubenswrapper[4702]: I1203 11:50:09.027367 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4\" (UID: \"145c8ea5-5aaf-4017-8416-e346f9c95523\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" Dec 03 11:50:09 crc kubenswrapper[4702]: I1203 11:50:09.027540 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4\" (UID: \"145c8ea5-5aaf-4017-8416-e346f9c95523\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" Dec 03 11:50:09 crc kubenswrapper[4702]: I1203 11:50:09.027625 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4\" (UID: \"145c8ea5-5aaf-4017-8416-e346f9c95523\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" Dec 03 11:50:09 crc kubenswrapper[4702]: I1203 11:50:09.031825 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4\" (UID: \"145c8ea5-5aaf-4017-8416-e346f9c95523\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" Dec 03 11:50:09 crc kubenswrapper[4702]: I1203 11:50:09.032316 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4\" (UID: \"145c8ea5-5aaf-4017-8416-e346f9c95523\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" Dec 03 11:50:09 crc kubenswrapper[4702]: I1203 11:50:09.032419 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4\" (UID: \"145c8ea5-5aaf-4017-8416-e346f9c95523\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" Dec 03 11:50:09 crc kubenswrapper[4702]: I1203 11:50:09.049423 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-ssh-key\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4\" (UID: \"145c8ea5-5aaf-4017-8416-e346f9c95523\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" Dec 03 11:50:09 crc kubenswrapper[4702]: I1203 11:50:09.049646 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw4kb\" (UniqueName: \"kubernetes.io/projected/145c8ea5-5aaf-4017-8416-e346f9c95523-kube-api-access-nw4kb\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4\" (UID: \"145c8ea5-5aaf-4017-8416-e346f9c95523\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" Dec 03 11:50:09 crc kubenswrapper[4702]: I1203 11:50:09.389731 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" Dec 03 11:50:09 crc kubenswrapper[4702]: I1203 11:50:09.958308 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4"] Dec 03 11:50:10 crc kubenswrapper[4702]: I1203 11:50:10.852952 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" event={"ID":"145c8ea5-5aaf-4017-8416-e346f9c95523","Type":"ContainerStarted","Data":"ba0e5e63c91c1cd09366ea9e45a606ac75190020723f1e6bb9b16413b9f7b6cb"} Dec 03 11:50:11 crc kubenswrapper[4702]: I1203 11:50:11.975281 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" event={"ID":"145c8ea5-5aaf-4017-8416-e346f9c95523","Type":"ContainerStarted","Data":"2f3714366c1ae0dc0613982b2db13f94a760e3df14bcfb7f6242c6fb3a8ab162"} Dec 03 11:50:12 crc kubenswrapper[4702]: I1203 11:50:12.008106 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" podStartSLOduration=3.505590765 podStartE2EDuration="4.008082387s" podCreationTimestamp="2025-12-03 11:50:08 +0000 UTC" firstStartedPulling="2025-12-03 11:50:09.960124937 +0000 UTC m=+2793.796053391" lastFinishedPulling="2025-12-03 11:50:10.462616549 +0000 UTC m=+2794.298545013" observedRunningTime="2025-12-03 11:50:11.997564277 +0000 UTC m=+2795.833492751" watchObservedRunningTime="2025-12-03 11:50:12.008082387 +0000 UTC m=+2795.844010851" Dec 03 11:50:24 crc kubenswrapper[4702]: I1203 11:50:24.335440 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-k4jxc"] Dec 03 11:50:24 crc kubenswrapper[4702]: I1203 11:50:24.343322 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k4jxc" Dec 03 11:50:24 crc kubenswrapper[4702]: I1203 11:50:24.351275 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k4jxc"] Dec 03 11:50:24 crc kubenswrapper[4702]: I1203 11:50:24.406166 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f1d71de-7a6c-4169-a043-fe4d9d7ac13d-catalog-content\") pod \"certified-operators-k4jxc\" (UID: \"5f1d71de-7a6c-4169-a043-fe4d9d7ac13d\") " pod="openshift-marketplace/certified-operators-k4jxc" Dec 03 11:50:24 crc kubenswrapper[4702]: I1203 11:50:24.406514 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f1d71de-7a6c-4169-a043-fe4d9d7ac13d-utilities\") pod \"certified-operators-k4jxc\" (UID: \"5f1d71de-7a6c-4169-a043-fe4d9d7ac13d\") " pod="openshift-marketplace/certified-operators-k4jxc" Dec 03 11:50:24 crc kubenswrapper[4702]: I1203 11:50:24.406946 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sv8s\" (UniqueName: \"kubernetes.io/projected/5f1d71de-7a6c-4169-a043-fe4d9d7ac13d-kube-api-access-6sv8s\") pod \"certified-operators-k4jxc\" (UID: \"5f1d71de-7a6c-4169-a043-fe4d9d7ac13d\") " pod="openshift-marketplace/certified-operators-k4jxc" Dec 03 11:50:24 crc kubenswrapper[4702]: I1203 11:50:24.508783 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sv8s\" (UniqueName: \"kubernetes.io/projected/5f1d71de-7a6c-4169-a043-fe4d9d7ac13d-kube-api-access-6sv8s\") pod \"certified-operators-k4jxc\" (UID: \"5f1d71de-7a6c-4169-a043-fe4d9d7ac13d\") " pod="openshift-marketplace/certified-operators-k4jxc" Dec 03 11:50:24 crc kubenswrapper[4702]: I1203 11:50:24.508949 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f1d71de-7a6c-4169-a043-fe4d9d7ac13d-catalog-content\") pod \"certified-operators-k4jxc\" (UID: \"5f1d71de-7a6c-4169-a043-fe4d9d7ac13d\") " pod="openshift-marketplace/certified-operators-k4jxc" Dec 03 11:50:24 crc kubenswrapper[4702]: I1203 11:50:24.509064 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f1d71de-7a6c-4169-a043-fe4d9d7ac13d-utilities\") pod \"certified-operators-k4jxc\" (UID: \"5f1d71de-7a6c-4169-a043-fe4d9d7ac13d\") " pod="openshift-marketplace/certified-operators-k4jxc" Dec 03 11:50:24 crc kubenswrapper[4702]: I1203 11:50:24.510051 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f1d71de-7a6c-4169-a043-fe4d9d7ac13d-catalog-content\") pod \"certified-operators-k4jxc\" (UID: \"5f1d71de-7a6c-4169-a043-fe4d9d7ac13d\") " pod="openshift-marketplace/certified-operators-k4jxc" Dec 03 11:50:24 crc kubenswrapper[4702]: I1203 11:50:24.510070 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f1d71de-7a6c-4169-a043-fe4d9d7ac13d-utilities\") pod \"certified-operators-k4jxc\" (UID: \"5f1d71de-7a6c-4169-a043-fe4d9d7ac13d\") " pod="openshift-marketplace/certified-operators-k4jxc" Dec 03 11:50:24 crc kubenswrapper[4702]: I1203 11:50:24.528648 4702 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6sv8s\" (UniqueName: \"kubernetes.io/projected/5f1d71de-7a6c-4169-a043-fe4d9d7ac13d-kube-api-access-6sv8s\") pod \"certified-operators-k4jxc\" (UID: \"5f1d71de-7a6c-4169-a043-fe4d9d7ac13d\") " pod="openshift-marketplace/certified-operators-k4jxc" Dec 03 11:50:24 crc kubenswrapper[4702]: I1203 11:50:24.679592 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k4jxc" Dec 03 11:50:25 crc kubenswrapper[4702]: I1203 11:50:25.323593 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-k4jxc"] Dec 03 11:50:25 crc kubenswrapper[4702]: I1203 11:50:25.573864 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4jxc" event={"ID":"5f1d71de-7a6c-4169-a043-fe4d9d7ac13d","Type":"ContainerStarted","Data":"861e18e8026a7a851ddccb75355748c128611f17edbe2c61d37927c06622bdb6"} Dec 03 11:50:26 crc kubenswrapper[4702]: I1203 11:50:26.589507 4702 generic.go:334] "Generic (PLEG): container finished" podID="5f1d71de-7a6c-4169-a043-fe4d9d7ac13d" containerID="24feb22cba24f78c011db30b89e0bf1bca3f92517a27a9da718a210766db829e" exitCode=0 Dec 03 11:50:26 crc kubenswrapper[4702]: I1203 11:50:26.589633 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4jxc" event={"ID":"5f1d71de-7a6c-4169-a043-fe4d9d7ac13d","Type":"ContainerDied","Data":"24feb22cba24f78c011db30b89e0bf1bca3f92517a27a9da718a210766db829e"} Dec 03 11:50:28 crc kubenswrapper[4702]: I1203 11:50:28.620155 4702 generic.go:334] "Generic (PLEG): container finished" podID="5f1d71de-7a6c-4169-a043-fe4d9d7ac13d" containerID="2fd9749177f96d2c9acfafa79017db5d7e4bee28e77c71b746151533439ee61d" exitCode=0 Dec 03 11:50:28 crc kubenswrapper[4702]: I1203 11:50:28.620225 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4jxc" event={"ID":"5f1d71de-7a6c-4169-a043-fe4d9d7ac13d","Type":"ContainerDied","Data":"2fd9749177f96d2c9acfafa79017db5d7e4bee28e77c71b746151533439ee61d"} Dec 03 11:50:29 crc kubenswrapper[4702]: I1203 11:50:29.632424 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4jxc" event={"ID":"5f1d71de-7a6c-4169-a043-fe4d9d7ac13d","Type":"ContainerStarted","Data":"600770a4161cc4c3148c53791ef979e14e111ec3622722112e3b1053ee20a5df"} Dec 03 11:50:29 crc kubenswrapper[4702]: I1203 11:50:29.653847 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-k4jxc" podStartSLOduration=3.17354822 podStartE2EDuration="5.653808942s" podCreationTimestamp="2025-12-03 11:50:24 +0000 UTC" firstStartedPulling="2025-12-03 11:50:26.592437838 +0000 UTC m=+2810.428366332" lastFinishedPulling="2025-12-03 11:50:29.07269859 +0000 UTC m=+2812.908627054" observedRunningTime="2025-12-03 11:50:29.652563117 +0000 UTC m=+2813.488491591" watchObservedRunningTime="2025-12-03 11:50:29.653808942 +0000 UTC m=+2813.489737416" Dec 03 11:50:34 crc kubenswrapper[4702]: I1203 11:50:34.679796 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-k4jxc" Dec 03 11:50:34 crc kubenswrapper[4702]: I1203 11:50:34.680378 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-k4jxc" Dec 03 11:50:34 crc kubenswrapper[4702]: I1203 11:50:34.733627 4702 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-k4jxc" Dec 03 11:50:35 crc kubenswrapper[4702]: I1203 11:50:35.795196 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-k4jxc" Dec 03 11:50:35 crc kubenswrapper[4702]: I1203 11:50:35.913686 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k4jxc"] Dec 03 11:50:37 crc kubenswrapper[4702]: I1203 11:50:37.748931 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-k4jxc" podUID="5f1d71de-7a6c-4169-a043-fe4d9d7ac13d" containerName="registry-server" containerID="cri-o://600770a4161cc4c3148c53791ef979e14e111ec3622722112e3b1053ee20a5df" gracePeriod=2 Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.255892 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-k4jxc" Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.511829 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f1d71de-7a6c-4169-a043-fe4d9d7ac13d-utilities\") pod \"5f1d71de-7a6c-4169-a043-fe4d9d7ac13d\" (UID: \"5f1d71de-7a6c-4169-a043-fe4d9d7ac13d\") " Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.512468 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f1d71de-7a6c-4169-a043-fe4d9d7ac13d-catalog-content\") pod \"5f1d71de-7a6c-4169-a043-fe4d9d7ac13d\" (UID: \"5f1d71de-7a6c-4169-a043-fe4d9d7ac13d\") " Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.512684 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sv8s\" (UniqueName: \"kubernetes.io/projected/5f1d71de-7a6c-4169-a043-fe4d9d7ac13d-kube-api-access-6sv8s\") pod \"5f1d71de-7a6c-4169-a043-fe4d9d7ac13d\" (UID: \"5f1d71de-7a6c-4169-a043-fe4d9d7ac13d\") " Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.515772 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f1d71de-7a6c-4169-a043-fe4d9d7ac13d-utilities" (OuterVolumeSpecName: "utilities") pod "5f1d71de-7a6c-4169-a043-fe4d9d7ac13d" (UID: "5f1d71de-7a6c-4169-a043-fe4d9d7ac13d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.521509 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f1d71de-7a6c-4169-a043-fe4d9d7ac13d-kube-api-access-6sv8s" (OuterVolumeSpecName: "kube-api-access-6sv8s") pod "5f1d71de-7a6c-4169-a043-fe4d9d7ac13d" (UID: "5f1d71de-7a6c-4169-a043-fe4d9d7ac13d"). InnerVolumeSpecName "kube-api-access-6sv8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.567242 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f1d71de-7a6c-4169-a043-fe4d9d7ac13d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f1d71de-7a6c-4169-a043-fe4d9d7ac13d" (UID: "5f1d71de-7a6c-4169-a043-fe4d9d7ac13d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.616679 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f1d71de-7a6c-4169-a043-fe4d9d7ac13d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.616723 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sv8s\" (UniqueName: \"kubernetes.io/projected/5f1d71de-7a6c-4169-a043-fe4d9d7ac13d-kube-api-access-6sv8s\") on node \"crc\" DevicePath \"\"" Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.616738 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f1d71de-7a6c-4169-a043-fe4d9d7ac13d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.762633 4702 generic.go:334] "Generic (PLEG): container finished" podID="5f1d71de-7a6c-4169-a043-fe4d9d7ac13d" containerID="600770a4161cc4c3148c53791ef979e14e111ec3622722112e3b1053ee20a5df" exitCode=0 Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.762680 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4jxc" event={"ID":"5f1d71de-7a6c-4169-a043-fe4d9d7ac13d","Type":"ContainerDied","Data":"600770a4161cc4c3148c53791ef979e14e111ec3622722112e3b1053ee20a5df"} Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.762744 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-k4jxc" event={"ID":"5f1d71de-7a6c-4169-a043-fe4d9d7ac13d","Type":"ContainerDied","Data":"861e18e8026a7a851ddccb75355748c128611f17edbe2c61d37927c06622bdb6"} Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.762784 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-k4jxc" Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.762794 4702 scope.go:117] "RemoveContainer" containerID="600770a4161cc4c3148c53791ef979e14e111ec3622722112e3b1053ee20a5df" Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.796063 4702 scope.go:117] "RemoveContainer" containerID="2fd9749177f96d2c9acfafa79017db5d7e4bee28e77c71b746151533439ee61d" Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.810861 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-k4jxc"] Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.823278 4702 scope.go:117] "RemoveContainer" containerID="24feb22cba24f78c011db30b89e0bf1bca3f92517a27a9da718a210766db829e" Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.829026 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-k4jxc"] Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.875428 4702 scope.go:117] "RemoveContainer" containerID="600770a4161cc4c3148c53791ef979e14e111ec3622722112e3b1053ee20a5df" Dec 03 11:50:38 crc kubenswrapper[4702]: E1203 11:50:38.876165 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"600770a4161cc4c3148c53791ef979e14e111ec3622722112e3b1053ee20a5df\": container with ID starting with 600770a4161cc4c3148c53791ef979e14e111ec3622722112e3b1053ee20a5df not found: ID does not exist" containerID="600770a4161cc4c3148c53791ef979e14e111ec3622722112e3b1053ee20a5df" Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.876213 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"600770a4161cc4c3148c53791ef979e14e111ec3622722112e3b1053ee20a5df"} err="failed to get container status \"600770a4161cc4c3148c53791ef979e14e111ec3622722112e3b1053ee20a5df\": rpc error: code = NotFound desc = could not find container \"600770a4161cc4c3148c53791ef979e14e111ec3622722112e3b1053ee20a5df\": container with ID starting with 600770a4161cc4c3148c53791ef979e14e111ec3622722112e3b1053ee20a5df not found: ID does not exist" Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.876267 4702 scope.go:117] "RemoveContainer" containerID="2fd9749177f96d2c9acfafa79017db5d7e4bee28e77c71b746151533439ee61d" Dec 03 11:50:38 crc kubenswrapper[4702]: E1203 11:50:38.876731 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fd9749177f96d2c9acfafa79017db5d7e4bee28e77c71b746151533439ee61d\": container with ID starting with 2fd9749177f96d2c9acfafa79017db5d7e4bee28e77c71b746151533439ee61d not found: ID does not exist" containerID="2fd9749177f96d2c9acfafa79017db5d7e4bee28e77c71b746151533439ee61d" Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.876793 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fd9749177f96d2c9acfafa79017db5d7e4bee28e77c71b746151533439ee61d"} err="failed to get container status \"2fd9749177f96d2c9acfafa79017db5d7e4bee28e77c71b746151533439ee61d\": rpc error: code = NotFound desc = could not find container \"2fd9749177f96d2c9acfafa79017db5d7e4bee28e77c71b746151533439ee61d\": container with ID starting with 2fd9749177f96d2c9acfafa79017db5d7e4bee28e77c71b746151533439ee61d not found: ID does not exist" Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.876879 4702 scope.go:117] "RemoveContainer" 
containerID="24feb22cba24f78c011db30b89e0bf1bca3f92517a27a9da718a210766db829e" Dec 03 11:50:38 crc kubenswrapper[4702]: E1203 11:50:38.877876 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24feb22cba24f78c011db30b89e0bf1bca3f92517a27a9da718a210766db829e\": container with ID starting with 24feb22cba24f78c011db30b89e0bf1bca3f92517a27a9da718a210766db829e not found: ID does not exist" containerID="24feb22cba24f78c011db30b89e0bf1bca3f92517a27a9da718a210766db829e" Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.877920 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24feb22cba24f78c011db30b89e0bf1bca3f92517a27a9da718a210766db829e"} err="failed to get container status \"24feb22cba24f78c011db30b89e0bf1bca3f92517a27a9da718a210766db829e\": rpc error: code = NotFound desc = could not find container \"24feb22cba24f78c011db30b89e0bf1bca3f92517a27a9da718a210766db829e\": container with ID starting with 24feb22cba24f78c011db30b89e0bf1bca3f92517a27a9da718a210766db829e not found: ID does not exist" Dec 03 11:50:38 crc kubenswrapper[4702]: I1203 11:50:38.943396 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f1d71de-7a6c-4169-a043-fe4d9d7ac13d" path="/var/lib/kubelet/pods/5f1d71de-7a6c-4169-a043-fe4d9d7ac13d/volumes" Dec 03 11:51:31 crc kubenswrapper[4702]: I1203 11:51:31.971394 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wwt79"] Dec 03 11:51:31 crc kubenswrapper[4702]: E1203 11:51:31.972656 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f1d71de-7a6c-4169-a043-fe4d9d7ac13d" containerName="registry-server" Dec 03 11:51:31 crc kubenswrapper[4702]: I1203 11:51:31.972678 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f1d71de-7a6c-4169-a043-fe4d9d7ac13d" containerName="registry-server" Dec 03 11:51:31 crc kubenswrapper[4702]: E1203 11:51:31.972696 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f1d71de-7a6c-4169-a043-fe4d9d7ac13d" containerName="extract-utilities" Dec 03 11:51:31 crc kubenswrapper[4702]: I1203 11:51:31.972704 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f1d71de-7a6c-4169-a043-fe4d9d7ac13d" containerName="extract-utilities" Dec 03 11:51:31 crc kubenswrapper[4702]: E1203 11:51:31.972722 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f1d71de-7a6c-4169-a043-fe4d9d7ac13d" containerName="extract-content" Dec 03 11:51:31 crc kubenswrapper[4702]: I1203 11:51:31.972731 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f1d71de-7a6c-4169-a043-fe4d9d7ac13d" containerName="extract-content" Dec 03 11:51:31 crc kubenswrapper[4702]: I1203 11:51:31.973116 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f1d71de-7a6c-4169-a043-fe4d9d7ac13d" containerName="registry-server" Dec 03 11:51:31 crc kubenswrapper[4702]: I1203 11:51:31.975775 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wwt79" Dec 03 11:51:31 crc kubenswrapper[4702]: I1203 11:51:31.986251 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wwt79"] Dec 03 11:51:32 crc kubenswrapper[4702]: I1203 11:51:32.156224 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt8s5\" (UniqueName: \"kubernetes.io/projected/4f54bc61-e744-4030-831e-5e156bf71885-kube-api-access-bt8s5\") pod \"community-operators-wwt79\" (UID: \"4f54bc61-e744-4030-831e-5e156bf71885\") " pod="openshift-marketplace/community-operators-wwt79" Dec 03 11:51:32 crc kubenswrapper[4702]: I1203 11:51:32.156451 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f54bc61-e744-4030-831e-5e156bf71885-catalog-content\") pod \"community-operators-wwt79\" (UID: \"4f54bc61-e744-4030-831e-5e156bf71885\") " pod="openshift-marketplace/community-operators-wwt79" Dec 03 11:51:32 crc kubenswrapper[4702]: I1203 11:51:32.156593 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f54bc61-e744-4030-831e-5e156bf71885-utilities\") pod \"community-operators-wwt79\" (UID: \"4f54bc61-e744-4030-831e-5e156bf71885\") " pod="openshift-marketplace/community-operators-wwt79" Dec 03 11:51:32 crc kubenswrapper[4702]: I1203 11:51:32.260006 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f54bc61-e744-4030-831e-5e156bf71885-catalog-content\") pod \"community-operators-wwt79\" (UID: \"4f54bc61-e744-4030-831e-5e156bf71885\") " pod="openshift-marketplace/community-operators-wwt79" Dec 03 11:51:32 crc kubenswrapper[4702]: I1203 11:51:32.260125 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f54bc61-e744-4030-831e-5e156bf71885-utilities\") pod \"community-operators-wwt79\" (UID: \"4f54bc61-e744-4030-831e-5e156bf71885\") " pod="openshift-marketplace/community-operators-wwt79" Dec 03 11:51:32 crc kubenswrapper[4702]: I1203 11:51:32.260456 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt8s5\" (UniqueName: \"kubernetes.io/projected/4f54bc61-e744-4030-831e-5e156bf71885-kube-api-access-bt8s5\") pod \"community-operators-wwt79\" (UID: \"4f54bc61-e744-4030-831e-5e156bf71885\") " pod="openshift-marketplace/community-operators-wwt79" Dec 03 11:51:32 crc kubenswrapper[4702]: I1203 11:51:32.260804 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f54bc61-e744-4030-831e-5e156bf71885-utilities\") pod \"community-operators-wwt79\" (UID: \"4f54bc61-e744-4030-831e-5e156bf71885\") " pod="openshift-marketplace/community-operators-wwt79" Dec 03 11:51:32 crc kubenswrapper[4702]: I1203 11:51:32.260806 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f54bc61-e744-4030-831e-5e156bf71885-catalog-content\") pod \"community-operators-wwt79\" (UID: \"4f54bc61-e744-4030-831e-5e156bf71885\") " pod="openshift-marketplace/community-operators-wwt79" Dec 03 11:51:32 crc kubenswrapper[4702]: I1203 11:51:32.282415 4702 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bt8s5\" (UniqueName: \"kubernetes.io/projected/4f54bc61-e744-4030-831e-5e156bf71885-kube-api-access-bt8s5\") pod \"community-operators-wwt79\" (UID: \"4f54bc61-e744-4030-831e-5e156bf71885\") " pod="openshift-marketplace/community-operators-wwt79" Dec 03 11:51:32 crc kubenswrapper[4702]: I1203 11:51:32.299603 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wwt79" Dec 03 11:51:32 crc kubenswrapper[4702]: I1203 11:51:32.905772 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wwt79"] Dec 03 11:51:33 crc kubenswrapper[4702]: I1203 11:51:33.618435 4702 generic.go:334] "Generic (PLEG): container finished" podID="4f54bc61-e744-4030-831e-5e156bf71885" containerID="7113ef9fe9d463c07d2f09c3388191b609344393b6a7c96b6efa5af4301a7f99" exitCode=0 Dec 03 11:51:33 crc kubenswrapper[4702]: I1203 11:51:33.618707 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wwt79" event={"ID":"4f54bc61-e744-4030-831e-5e156bf71885","Type":"ContainerDied","Data":"7113ef9fe9d463c07d2f09c3388191b609344393b6a7c96b6efa5af4301a7f99"} Dec 03 11:51:33 crc kubenswrapper[4702]: I1203 11:51:33.618819 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wwt79" event={"ID":"4f54bc61-e744-4030-831e-5e156bf71885","Type":"ContainerStarted","Data":"6d201caaa0b10e107b91a363a44eaadc00caa67c3a5d4cbae8e55ea6cca04719"} Dec 03 11:51:37 crc kubenswrapper[4702]: I1203 11:51:37.678515 4702 generic.go:334] "Generic (PLEG): container finished" podID="4f54bc61-e744-4030-831e-5e156bf71885" containerID="12fadf6b13fe89b035418292940fd13dfa58c10d655a380421b9b2944b30ded4" exitCode=0 Dec 03 11:51:37 crc kubenswrapper[4702]: I1203 11:51:37.678631 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wwt79" event={"ID":"4f54bc61-e744-4030-831e-5e156bf71885","Type":"ContainerDied","Data":"12fadf6b13fe89b035418292940fd13dfa58c10d655a380421b9b2944b30ded4"} Dec 03 11:51:41 crc kubenswrapper[4702]: I1203 11:51:41.730408 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wwt79" event={"ID":"4f54bc61-e744-4030-831e-5e156bf71885","Type":"ContainerStarted","Data":"ecc1ddae63c13d2ee814ed45761e431d70bde3675db1db4a43049f5728862169"} Dec 03 11:51:41 crc kubenswrapper[4702]: I1203 11:51:41.760676 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wwt79" podStartSLOduration=3.264559104 podStartE2EDuration="10.760630005s" podCreationTimestamp="2025-12-03 11:51:31 +0000 UTC" firstStartedPulling="2025-12-03 11:51:33.621332271 +0000 UTC m=+2877.457260745" lastFinishedPulling="2025-12-03 11:51:41.117403182 +0000 UTC m=+2884.953331646" observedRunningTime="2025-12-03 11:51:41.756095806 +0000 UTC m=+2885.592024300" watchObservedRunningTime="2025-12-03 11:51:41.760630005 +0000 UTC m=+2885.596558499" Dec 03 11:51:42 crc kubenswrapper[4702]: I1203 11:51:42.300367 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wwt79" Dec 03 11:51:42 crc kubenswrapper[4702]: I1203 11:51:42.300750 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wwt79" Dec 03 11:51:43 crc kubenswrapper[4702]: I1203 11:51:43.360062 
4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-wwt79" podUID="4f54bc61-e744-4030-831e-5e156bf71885" containerName="registry-server" probeResult="failure" output=< Dec 03 11:51:43 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 11:51:43 crc kubenswrapper[4702]: > Dec 03 11:51:52 crc kubenswrapper[4702]: I1203 11:51:52.353477 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wwt79" Dec 03 11:51:52 crc kubenswrapper[4702]: I1203 11:51:52.417160 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wwt79" Dec 03 11:51:52 crc kubenswrapper[4702]: I1203 11:51:52.597448 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wwt79"] Dec 03 11:51:53 crc kubenswrapper[4702]: I1203 11:51:53.901565 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wwt79" podUID="4f54bc61-e744-4030-831e-5e156bf71885" containerName="registry-server" containerID="cri-o://ecc1ddae63c13d2ee814ed45761e431d70bde3675db1db4a43049f5728862169" gracePeriod=2 Dec 03 11:51:54 crc kubenswrapper[4702]: I1203 11:51:54.461915 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wwt79" Dec 03 11:51:54 crc kubenswrapper[4702]: I1203 11:51:54.508090 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f54bc61-e744-4030-831e-5e156bf71885-utilities\") pod \"4f54bc61-e744-4030-831e-5e156bf71885\" (UID: \"4f54bc61-e744-4030-831e-5e156bf71885\") " Dec 03 11:51:54 crc kubenswrapper[4702]: I1203 11:51:54.508240 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt8s5\" (UniqueName: \"kubernetes.io/projected/4f54bc61-e744-4030-831e-5e156bf71885-kube-api-access-bt8s5\") pod \"4f54bc61-e744-4030-831e-5e156bf71885\" (UID: \"4f54bc61-e744-4030-831e-5e156bf71885\") " Dec 03 11:51:54 crc kubenswrapper[4702]: I1203 11:51:54.508311 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f54bc61-e744-4030-831e-5e156bf71885-catalog-content\") pod \"4f54bc61-e744-4030-831e-5e156bf71885\" (UID: \"4f54bc61-e744-4030-831e-5e156bf71885\") " Dec 03 11:51:54 crc kubenswrapper[4702]: I1203 11:51:54.510495 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f54bc61-e744-4030-831e-5e156bf71885-utilities" (OuterVolumeSpecName: "utilities") pod "4f54bc61-e744-4030-831e-5e156bf71885" (UID: "4f54bc61-e744-4030-831e-5e156bf71885"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:51:54 crc kubenswrapper[4702]: I1203 11:51:54.522306 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f54bc61-e744-4030-831e-5e156bf71885-kube-api-access-bt8s5" (OuterVolumeSpecName: "kube-api-access-bt8s5") pod "4f54bc61-e744-4030-831e-5e156bf71885" (UID: "4f54bc61-e744-4030-831e-5e156bf71885"). InnerVolumeSpecName "kube-api-access-bt8s5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:51:54 crc kubenswrapper[4702]: I1203 11:51:54.575377 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f54bc61-e744-4030-831e-5e156bf71885-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f54bc61-e744-4030-831e-5e156bf71885" (UID: "4f54bc61-e744-4030-831e-5e156bf71885"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:51:54 crc kubenswrapper[4702]: I1203 11:51:54.609620 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt8s5\" (UniqueName: \"kubernetes.io/projected/4f54bc61-e744-4030-831e-5e156bf71885-kube-api-access-bt8s5\") on node \"crc\" DevicePath \"\"" Dec 03 11:51:54 crc kubenswrapper[4702]: I1203 11:51:54.609653 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f54bc61-e744-4030-831e-5e156bf71885-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:51:54 crc kubenswrapper[4702]: I1203 11:51:54.609663 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f54bc61-e744-4030-831e-5e156bf71885-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:51:54 crc kubenswrapper[4702]: I1203 11:51:54.916210 4702 generic.go:334] "Generic (PLEG): container finished" podID="4f54bc61-e744-4030-831e-5e156bf71885" containerID="ecc1ddae63c13d2ee814ed45761e431d70bde3675db1db4a43049f5728862169" exitCode=0 Dec 03 11:51:54 crc kubenswrapper[4702]: I1203 11:51:54.916266 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wwt79" event={"ID":"4f54bc61-e744-4030-831e-5e156bf71885","Type":"ContainerDied","Data":"ecc1ddae63c13d2ee814ed45761e431d70bde3675db1db4a43049f5728862169"} Dec 03 11:51:54 crc kubenswrapper[4702]: I1203 11:51:54.916284 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wwt79" Dec 03 11:51:54 crc kubenswrapper[4702]: I1203 11:51:54.916310 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wwt79" event={"ID":"4f54bc61-e744-4030-831e-5e156bf71885","Type":"ContainerDied","Data":"6d201caaa0b10e107b91a363a44eaadc00caa67c3a5d4cbae8e55ea6cca04719"} Dec 03 11:51:54 crc kubenswrapper[4702]: I1203 11:51:54.916344 4702 scope.go:117] "RemoveContainer" containerID="ecc1ddae63c13d2ee814ed45761e431d70bde3675db1db4a43049f5728862169" Dec 03 11:51:54 crc kubenswrapper[4702]: I1203 11:51:54.962169 4702 scope.go:117] "RemoveContainer" containerID="12fadf6b13fe89b035418292940fd13dfa58c10d655a380421b9b2944b30ded4" Dec 03 11:51:54 crc kubenswrapper[4702]: I1203 11:51:54.989849 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wwt79"] Dec 03 11:51:54 crc kubenswrapper[4702]: I1203 11:51:54.999272 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wwt79"] Dec 03 11:51:55 crc kubenswrapper[4702]: I1203 11:51:55.017907 4702 scope.go:117] "RemoveContainer" containerID="7113ef9fe9d463c07d2f09c3388191b609344393b6a7c96b6efa5af4301a7f99" Dec 03 11:51:55 crc kubenswrapper[4702]: I1203 11:51:55.087444 4702 scope.go:117] "RemoveContainer" containerID="ecc1ddae63c13d2ee814ed45761e431d70bde3675db1db4a43049f5728862169" Dec 03 11:51:55 crc kubenswrapper[4702]: E1203 11:51:55.090305 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecc1ddae63c13d2ee814ed45761e431d70bde3675db1db4a43049f5728862169\": container with ID starting with ecc1ddae63c13d2ee814ed45761e431d70bde3675db1db4a43049f5728862169 not found: ID does not exist" containerID="ecc1ddae63c13d2ee814ed45761e431d70bde3675db1db4a43049f5728862169" Dec 03 11:51:55 crc kubenswrapper[4702]: I1203 11:51:55.090363 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecc1ddae63c13d2ee814ed45761e431d70bde3675db1db4a43049f5728862169"} err="failed to get container status \"ecc1ddae63c13d2ee814ed45761e431d70bde3675db1db4a43049f5728862169\": rpc error: code = NotFound desc = could not find container \"ecc1ddae63c13d2ee814ed45761e431d70bde3675db1db4a43049f5728862169\": container with ID starting with ecc1ddae63c13d2ee814ed45761e431d70bde3675db1db4a43049f5728862169 not found: ID does not exist" Dec 03 11:51:55 crc kubenswrapper[4702]: I1203 11:51:55.090395 4702 scope.go:117] "RemoveContainer" containerID="12fadf6b13fe89b035418292940fd13dfa58c10d655a380421b9b2944b30ded4" Dec 03 11:51:55 crc kubenswrapper[4702]: E1203 11:51:55.090807 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12fadf6b13fe89b035418292940fd13dfa58c10d655a380421b9b2944b30ded4\": container with ID starting with 12fadf6b13fe89b035418292940fd13dfa58c10d655a380421b9b2944b30ded4 not found: ID does not exist" containerID="12fadf6b13fe89b035418292940fd13dfa58c10d655a380421b9b2944b30ded4" Dec 03 11:51:55 crc kubenswrapper[4702]: I1203 11:51:55.090839 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12fadf6b13fe89b035418292940fd13dfa58c10d655a380421b9b2944b30ded4"} err="failed to get container status \"12fadf6b13fe89b035418292940fd13dfa58c10d655a380421b9b2944b30ded4\": rpc error: code = NotFound desc = could not find 
container \"12fadf6b13fe89b035418292940fd13dfa58c10d655a380421b9b2944b30ded4\": container with ID starting with 12fadf6b13fe89b035418292940fd13dfa58c10d655a380421b9b2944b30ded4 not found: ID does not exist" Dec 03 11:51:55 crc kubenswrapper[4702]: I1203 11:51:55.090859 4702 scope.go:117] "RemoveContainer" containerID="7113ef9fe9d463c07d2f09c3388191b609344393b6a7c96b6efa5af4301a7f99" Dec 03 11:51:55 crc kubenswrapper[4702]: E1203 11:51:55.091084 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7113ef9fe9d463c07d2f09c3388191b609344393b6a7c96b6efa5af4301a7f99\": container with ID starting with 7113ef9fe9d463c07d2f09c3388191b609344393b6a7c96b6efa5af4301a7f99 not found: ID does not exist" containerID="7113ef9fe9d463c07d2f09c3388191b609344393b6a7c96b6efa5af4301a7f99" Dec 03 11:51:55 crc kubenswrapper[4702]: I1203 11:51:55.091107 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7113ef9fe9d463c07d2f09c3388191b609344393b6a7c96b6efa5af4301a7f99"} err="failed to get container status \"7113ef9fe9d463c07d2f09c3388191b609344393b6a7c96b6efa5af4301a7f99\": rpc error: code = NotFound desc = could not find container \"7113ef9fe9d463c07d2f09c3388191b609344393b6a7c96b6efa5af4301a7f99\": container with ID starting with 7113ef9fe9d463c07d2f09c3388191b609344393b6a7c96b6efa5af4301a7f99 not found: ID does not exist" Dec 03 11:51:55 crc kubenswrapper[4702]: I1203 11:51:55.908061 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:51:55 crc kubenswrapper[4702]: I1203 11:51:55.908146 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:51:56 crc kubenswrapper[4702]: I1203 11:51:56.947227 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f54bc61-e744-4030-831e-5e156bf71885" path="/var/lib/kubelet/pods/4f54bc61-e744-4030-831e-5e156bf71885/volumes" Dec 03 11:52:25 crc kubenswrapper[4702]: I1203 11:52:25.907918 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:52:25 crc kubenswrapper[4702]: I1203 11:52:25.908538 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:52:55 crc kubenswrapper[4702]: I1203 11:52:55.907826 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 
11:52:55 crc kubenswrapper[4702]: I1203 11:52:55.908345 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:52:55 crc kubenswrapper[4702]: I1203 11:52:55.908418 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:52:55 crc kubenswrapper[4702]: I1203 11:52:55.909436 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b72e22f3021fc7c15497f9bdd6c2f451d0898c94753e6384c425597e057fad0e"} pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 11:52:55 crc kubenswrapper[4702]: I1203 11:52:55.909509 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" containerID="cri-o://b72e22f3021fc7c15497f9bdd6c2f451d0898c94753e6384c425597e057fad0e" gracePeriod=600 Dec 03 11:52:56 crc kubenswrapper[4702]: E1203 11:52:56.186453 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2e03cb6_21dc_460c_a68e_17aafd79e258.slice/crio-b72e22f3021fc7c15497f9bdd6c2f451d0898c94753e6384c425597e057fad0e.scope\": RecentStats: unable to find data in memory cache]" Dec 03 11:52:56 crc kubenswrapper[4702]: I1203 11:52:56.766843 4702 generic.go:334] "Generic (PLEG): container finished" podID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerID="b72e22f3021fc7c15497f9bdd6c2f451d0898c94753e6384c425597e057fad0e" exitCode=0 Dec 03 11:52:56 crc kubenswrapper[4702]: I1203 11:52:56.766917 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerDied","Data":"b72e22f3021fc7c15497f9bdd6c2f451d0898c94753e6384c425597e057fad0e"} Dec 03 11:52:56 crc kubenswrapper[4702]: I1203 11:52:56.767191 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4"} Dec 03 11:52:56 crc kubenswrapper[4702]: I1203 11:52:56.767231 4702 scope.go:117] "RemoveContainer" containerID="f55e62cfcb8277d551297dc0f411a44b564d19d108c54bbc84fb839b2ea99f21" Dec 03 11:54:59 crc kubenswrapper[4702]: E1203 11:54:59.220471 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod145c8ea5_5aaf_4017_8416_e346f9c95523.slice/crio-2f3714366c1ae0dc0613982b2db13f94a760e3df14bcfb7f6242c6fb3a8ab162.scope\": RecentStats: unable to find data in memory cache]" Dec 03 11:54:59 crc kubenswrapper[4702]: I1203 11:54:59.745862 4702 generic.go:334] "Generic (PLEG): container finished" podID="145c8ea5-5aaf-4017-8416-e346f9c95523" 
containerID="2f3714366c1ae0dc0613982b2db13f94a760e3df14bcfb7f6242c6fb3a8ab162" exitCode=0 Dec 03 11:54:59 crc kubenswrapper[4702]: I1203 11:54:59.745974 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" event={"ID":"145c8ea5-5aaf-4017-8416-e346f9c95523","Type":"ContainerDied","Data":"2f3714366c1ae0dc0613982b2db13f94a760e3df14bcfb7f6242c6fb3a8ab162"} Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.598385 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.616316 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-libvirt-secret-0\") pod \"145c8ea5-5aaf-4017-8416-e346f9c95523\" (UID: \"145c8ea5-5aaf-4017-8416-e346f9c95523\") " Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.671224 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "145c8ea5-5aaf-4017-8416-e346f9c95523" (UID: "145c8ea5-5aaf-4017-8416-e346f9c95523"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.718539 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-ssh-key\") pod \"145c8ea5-5aaf-4017-8416-e346f9c95523\" (UID: \"145c8ea5-5aaf-4017-8416-e346f9c95523\") " Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.718677 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-libvirt-combined-ca-bundle\") pod \"145c8ea5-5aaf-4017-8416-e346f9c95523\" (UID: \"145c8ea5-5aaf-4017-8416-e346f9c95523\") " Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.719060 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-inventory\") pod \"145c8ea5-5aaf-4017-8416-e346f9c95523\" (UID: \"145c8ea5-5aaf-4017-8416-e346f9c95523\") " Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.719087 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw4kb\" (UniqueName: \"kubernetes.io/projected/145c8ea5-5aaf-4017-8416-e346f9c95523-kube-api-access-nw4kb\") pod \"145c8ea5-5aaf-4017-8416-e346f9c95523\" (UID: \"145c8ea5-5aaf-4017-8416-e346f9c95523\") " Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.719677 4702 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.725402 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/145c8ea5-5aaf-4017-8416-e346f9c95523-kube-api-access-nw4kb" (OuterVolumeSpecName: "kube-api-access-nw4kb") pod "145c8ea5-5aaf-4017-8416-e346f9c95523" (UID: "145c8ea5-5aaf-4017-8416-e346f9c95523"). InnerVolumeSpecName "kube-api-access-nw4kb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.729321 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "145c8ea5-5aaf-4017-8416-e346f9c95523" (UID: "145c8ea5-5aaf-4017-8416-e346f9c95523"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.764946 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-inventory" (OuterVolumeSpecName: "inventory") pod "145c8ea5-5aaf-4017-8416-e346f9c95523" (UID: "145c8ea5-5aaf-4017-8416-e346f9c95523"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.767041 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "145c8ea5-5aaf-4017-8416-e346f9c95523" (UID: "145c8ea5-5aaf-4017-8416-e346f9c95523"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.821677 4702 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.821721 4702 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.821738 4702 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/145c8ea5-5aaf-4017-8416-e346f9c95523-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.821750 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw4kb\" (UniqueName: \"kubernetes.io/projected/145c8ea5-5aaf-4017-8416-e346f9c95523-kube-api-access-nw4kb\") on node \"crc\" DevicePath \"\"" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.842340 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" event={"ID":"145c8ea5-5aaf-4017-8416-e346f9c95523","Type":"ContainerDied","Data":"ba0e5e63c91c1cd09366ea9e45a606ac75190020723f1e6bb9b16413b9f7b6cb"} Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.842422 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba0e5e63c91c1cd09366ea9e45a606ac75190020723f1e6bb9b16413b9f7b6cb" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.842500 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.915505 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr"] Dec 03 11:55:01 crc kubenswrapper[4702]: E1203 11:55:01.916217 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f54bc61-e744-4030-831e-5e156bf71885" containerName="extract-utilities" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.916236 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f54bc61-e744-4030-831e-5e156bf71885" containerName="extract-utilities" Dec 03 11:55:01 crc kubenswrapper[4702]: E1203 11:55:01.916259 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f54bc61-e744-4030-831e-5e156bf71885" containerName="registry-server" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.916266 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f54bc61-e744-4030-831e-5e156bf71885" containerName="registry-server" Dec 03 11:55:01 crc kubenswrapper[4702]: E1203 11:55:01.916280 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f54bc61-e744-4030-831e-5e156bf71885" containerName="extract-content" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.916287 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f54bc61-e744-4030-831e-5e156bf71885" containerName="extract-content" Dec 03 11:55:01 crc kubenswrapper[4702]: E1203 11:55:01.916303 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="145c8ea5-5aaf-4017-8416-e346f9c95523" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.916310 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="145c8ea5-5aaf-4017-8416-e346f9c95523" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.916553 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="145c8ea5-5aaf-4017-8416-e346f9c95523" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.916581 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f54bc61-e744-4030-831e-5e156bf71885" containerName="registry-server" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.917647 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.920259 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zmjkp" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.920281 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.921113 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.921343 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.921630 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.921958 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.922224 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.923442 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.923494 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.923550 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.923595 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.923659 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.923730 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-265fw\" (UniqueName: \"kubernetes.io/projected/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-kube-api-access-265fw\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.924290 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.924415 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.924448 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:01 crc kubenswrapper[4702]: I1203 11:55:01.928401 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr"] Dec 03 11:55:02 crc kubenswrapper[4702]: I1203 11:55:02.026593 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:02 crc kubenswrapper[4702]: I1203 11:55:02.026651 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:02 crc kubenswrapper[4702]: I1203 11:55:02.026676 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-265fw\" (UniqueName: \"kubernetes.io/projected/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-kube-api-access-265fw\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:02 crc kubenswrapper[4702]: I1203 11:55:02.026835 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:02 crc kubenswrapper[4702]: I1203 11:55:02.026867 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:02 crc kubenswrapper[4702]: I1203 11:55:02.026900 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:02 crc kubenswrapper[4702]: I1203 11:55:02.026971 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:02 crc kubenswrapper[4702]: I1203 11:55:02.027010 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:02 crc kubenswrapper[4702]: I1203 11:55:02.027061 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:02 crc kubenswrapper[4702]: I1203 11:55:02.029039 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:02 crc kubenswrapper[4702]: I1203 11:55:02.031368 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:02 crc kubenswrapper[4702]: I1203 11:55:02.031547 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-cell1-compute-config-1\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:02 crc kubenswrapper[4702]: I1203 11:55:02.031943 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:02 crc kubenswrapper[4702]: I1203 11:55:02.032474 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:02 crc kubenswrapper[4702]: I1203 11:55:02.032741 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:02 crc kubenswrapper[4702]: I1203 11:55:02.033101 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:02 crc kubenswrapper[4702]: I1203 11:55:02.033662 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:02 crc kubenswrapper[4702]: I1203 11:55:02.050566 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-265fw\" (UniqueName: \"kubernetes.io/projected/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-kube-api-access-265fw\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lftxr\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:02 crc kubenswrapper[4702]: I1203 11:55:02.242913 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:55:02 crc kubenswrapper[4702]: I1203 11:55:02.820341 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr"] Dec 03 11:55:02 crc kubenswrapper[4702]: I1203 11:55:02.827358 4702 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 11:55:02 crc kubenswrapper[4702]: I1203 11:55:02.855166 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" event={"ID":"4332bde3-4d31-498c-8fd3-d1bc3d9e3794","Type":"ContainerStarted","Data":"a794a3621b11d5db0b448a7ee9b756c0ddaa5c9e379aa139ff9f064708d1f6fd"} Dec 03 11:55:04 crc kubenswrapper[4702]: I1203 11:55:04.948143 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" event={"ID":"4332bde3-4d31-498c-8fd3-d1bc3d9e3794","Type":"ContainerStarted","Data":"166118efb40a7e199d71d54fe4622b69eb6aa799b2f359d3aaf8aaa5a557e54d"} Dec 03 11:55:04 crc kubenswrapper[4702]: I1203 11:55:04.973173 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" podStartSLOduration=2.640168914 podStartE2EDuration="3.973137197s" podCreationTimestamp="2025-12-03 11:55:01 +0000 UTC" firstStartedPulling="2025-12-03 11:55:02.827029058 +0000 UTC m=+3086.662957522" lastFinishedPulling="2025-12-03 11:55:04.159997341 +0000 UTC m=+3087.995925805" observedRunningTime="2025-12-03 11:55:04.966157068 +0000 UTC m=+3088.802085542" watchObservedRunningTime="2025-12-03 11:55:04.973137197 +0000 UTC m=+3088.809065661" Dec 03 11:55:25 crc kubenswrapper[4702]: I1203 11:55:25.908584 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:55:25 crc kubenswrapper[4702]: I1203 11:55:25.909211 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:55:55 crc kubenswrapper[4702]: I1203 11:55:55.908443 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 11:55:55 crc kubenswrapper[4702]: I1203 11:55:55.909068 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:56:25 crc kubenswrapper[4702]: I1203 11:56:25.908062 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Dec 03 11:56:25 crc kubenswrapper[4702]: I1203 11:56:25.908698 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 11:56:25 crc kubenswrapper[4702]: I1203 11:56:25.908814 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 11:56:25 crc kubenswrapper[4702]: I1203 11:56:25.910271 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4"} pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 11:56:25 crc kubenswrapper[4702]: I1203 11:56:25.910411 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" containerID="cri-o://c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" gracePeriod=600 Dec 03 11:56:26 crc kubenswrapper[4702]: I1203 11:56:26.933509 4702 generic.go:334] "Generic (PLEG): container finished" podID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" exitCode=0 Dec 03 11:56:26 crc kubenswrapper[4702]: I1203 11:56:26.943826 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerDied","Data":"c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4"} Dec 03 11:56:26 crc kubenswrapper[4702]: I1203 11:56:26.943907 4702 scope.go:117] "RemoveContainer" containerID="b72e22f3021fc7c15497f9bdd6c2f451d0898c94753e6384c425597e057fad0e" Dec 03 11:56:27 crc kubenswrapper[4702]: E1203 11:56:27.138358 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:56:27 crc kubenswrapper[4702]: I1203 11:56:27.966541 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" Dec 03 11:56:27 crc kubenswrapper[4702]: E1203 11:56:27.966969 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:56:33 crc kubenswrapper[4702]: I1203 11:56:33.426727 4702 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-jfftz"] Dec 03 11:56:33 crc kubenswrapper[4702]: I1203 11:56:33.431014 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jfftz" Dec 03 11:56:33 crc kubenswrapper[4702]: I1203 11:56:33.441557 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jfftz"] Dec 03 11:56:33 crc kubenswrapper[4702]: I1203 11:56:33.470418 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14dc5c8c-6e58-4d35-b24b-ba560c404bb5-utilities\") pod \"redhat-marketplace-jfftz\" (UID: \"14dc5c8c-6e58-4d35-b24b-ba560c404bb5\") " pod="openshift-marketplace/redhat-marketplace-jfftz" Dec 03 11:56:33 crc kubenswrapper[4702]: I1203 11:56:33.470557 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14dc5c8c-6e58-4d35-b24b-ba560c404bb5-catalog-content\") pod \"redhat-marketplace-jfftz\" (UID: \"14dc5c8c-6e58-4d35-b24b-ba560c404bb5\") " pod="openshift-marketplace/redhat-marketplace-jfftz" Dec 03 11:56:33 crc kubenswrapper[4702]: I1203 11:56:33.470685 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npcbk\" (UniqueName: \"kubernetes.io/projected/14dc5c8c-6e58-4d35-b24b-ba560c404bb5-kube-api-access-npcbk\") pod \"redhat-marketplace-jfftz\" (UID: \"14dc5c8c-6e58-4d35-b24b-ba560c404bb5\") " pod="openshift-marketplace/redhat-marketplace-jfftz" Dec 03 11:56:33 crc kubenswrapper[4702]: I1203 11:56:33.573624 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14dc5c8c-6e58-4d35-b24b-ba560c404bb5-utilities\") pod \"redhat-marketplace-jfftz\" (UID: \"14dc5c8c-6e58-4d35-b24b-ba560c404bb5\") " pod="openshift-marketplace/redhat-marketplace-jfftz" Dec 03 11:56:33 crc kubenswrapper[4702]: I1203 11:56:33.573748 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14dc5c8c-6e58-4d35-b24b-ba560c404bb5-catalog-content\") pod \"redhat-marketplace-jfftz\" (UID: \"14dc5c8c-6e58-4d35-b24b-ba560c404bb5\") " pod="openshift-marketplace/redhat-marketplace-jfftz" Dec 03 11:56:33 crc kubenswrapper[4702]: I1203 11:56:33.573884 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npcbk\" (UniqueName: \"kubernetes.io/projected/14dc5c8c-6e58-4d35-b24b-ba560c404bb5-kube-api-access-npcbk\") pod \"redhat-marketplace-jfftz\" (UID: \"14dc5c8c-6e58-4d35-b24b-ba560c404bb5\") " pod="openshift-marketplace/redhat-marketplace-jfftz" Dec 03 11:56:33 crc kubenswrapper[4702]: I1203 11:56:33.574317 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14dc5c8c-6e58-4d35-b24b-ba560c404bb5-utilities\") pod \"redhat-marketplace-jfftz\" (UID: \"14dc5c8c-6e58-4d35-b24b-ba560c404bb5\") " pod="openshift-marketplace/redhat-marketplace-jfftz" Dec 03 11:56:33 crc kubenswrapper[4702]: I1203 11:56:33.574466 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14dc5c8c-6e58-4d35-b24b-ba560c404bb5-catalog-content\") pod \"redhat-marketplace-jfftz\" (UID: \"14dc5c8c-6e58-4d35-b24b-ba560c404bb5\") 
" pod="openshift-marketplace/redhat-marketplace-jfftz" Dec 03 11:56:33 crc kubenswrapper[4702]: I1203 11:56:33.596237 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npcbk\" (UniqueName: \"kubernetes.io/projected/14dc5c8c-6e58-4d35-b24b-ba560c404bb5-kube-api-access-npcbk\") pod \"redhat-marketplace-jfftz\" (UID: \"14dc5c8c-6e58-4d35-b24b-ba560c404bb5\") " pod="openshift-marketplace/redhat-marketplace-jfftz" Dec 03 11:56:33 crc kubenswrapper[4702]: I1203 11:56:33.777318 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jfftz" Dec 03 11:56:34 crc kubenswrapper[4702]: I1203 11:56:34.349740 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jfftz"] Dec 03 11:56:35 crc kubenswrapper[4702]: I1203 11:56:35.059712 4702 generic.go:334] "Generic (PLEG): container finished" podID="14dc5c8c-6e58-4d35-b24b-ba560c404bb5" containerID="7b5f403430d2fe5fdcc5c951700a82b58633761a36db0ad2325e679df01d280d" exitCode=0 Dec 03 11:56:35 crc kubenswrapper[4702]: I1203 11:56:35.059789 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jfftz" event={"ID":"14dc5c8c-6e58-4d35-b24b-ba560c404bb5","Type":"ContainerDied","Data":"7b5f403430d2fe5fdcc5c951700a82b58633761a36db0ad2325e679df01d280d"} Dec 03 11:56:35 crc kubenswrapper[4702]: I1203 11:56:35.060075 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jfftz" event={"ID":"14dc5c8c-6e58-4d35-b24b-ba560c404bb5","Type":"ContainerStarted","Data":"c42ce006182ddca20deb7a3fca66d890ea1d79b762d715ea77def0b39ba4c7b9"} Dec 03 11:56:36 crc kubenswrapper[4702]: I1203 11:56:36.073792 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jfftz" event={"ID":"14dc5c8c-6e58-4d35-b24b-ba560c404bb5","Type":"ContainerStarted","Data":"21d0458476f8e5214b9debd3dc4a484ef438a00a646b2a346214a5a18d2f5907"} Dec 03 11:56:37 crc kubenswrapper[4702]: I1203 11:56:37.089598 4702 generic.go:334] "Generic (PLEG): container finished" podID="14dc5c8c-6e58-4d35-b24b-ba560c404bb5" containerID="21d0458476f8e5214b9debd3dc4a484ef438a00a646b2a346214a5a18d2f5907" exitCode=0 Dec 03 11:56:37 crc kubenswrapper[4702]: I1203 11:56:37.089678 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jfftz" event={"ID":"14dc5c8c-6e58-4d35-b24b-ba560c404bb5","Type":"ContainerDied","Data":"21d0458476f8e5214b9debd3dc4a484ef438a00a646b2a346214a5a18d2f5907"} Dec 03 11:56:39 crc kubenswrapper[4702]: I1203 11:56:39.929232 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" Dec 03 11:56:39 crc kubenswrapper[4702]: E1203 11:56:39.930170 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:56:40 crc kubenswrapper[4702]: I1203 11:56:40.125679 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jfftz" 
event={"ID":"14dc5c8c-6e58-4d35-b24b-ba560c404bb5","Type":"ContainerStarted","Data":"809c7cda81f2aa029b6a97e84a619f41e221d9830ec14fe460f68123cb27ce0a"} Dec 03 11:56:40 crc kubenswrapper[4702]: I1203 11:56:40.150625 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jfftz" podStartSLOduration=2.369966055 podStartE2EDuration="7.150592624s" podCreationTimestamp="2025-12-03 11:56:33 +0000 UTC" firstStartedPulling="2025-12-03 11:56:35.062184694 +0000 UTC m=+3178.898113158" lastFinishedPulling="2025-12-03 11:56:39.842811263 +0000 UTC m=+3183.678739727" observedRunningTime="2025-12-03 11:56:40.144920712 +0000 UTC m=+3183.980849206" watchObservedRunningTime="2025-12-03 11:56:40.150592624 +0000 UTC m=+3183.986521088" Dec 03 11:56:43 crc kubenswrapper[4702]: I1203 11:56:43.778376 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jfftz" Dec 03 11:56:43 crc kubenswrapper[4702]: I1203 11:56:43.779896 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jfftz" Dec 03 11:56:43 crc kubenswrapper[4702]: I1203 11:56:43.930472 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jfftz" Dec 03 11:56:45 crc kubenswrapper[4702]: I1203 11:56:45.248474 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jfftz" Dec 03 11:56:45 crc kubenswrapper[4702]: I1203 11:56:45.306678 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jfftz"] Dec 03 11:56:47 crc kubenswrapper[4702]: I1203 11:56:47.214454 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jfftz" podUID="14dc5c8c-6e58-4d35-b24b-ba560c404bb5" containerName="registry-server" containerID="cri-o://809c7cda81f2aa029b6a97e84a619f41e221d9830ec14fe460f68123cb27ce0a" gracePeriod=2 Dec 03 11:56:48 crc kubenswrapper[4702]: I1203 11:56:48.232162 4702 generic.go:334] "Generic (PLEG): container finished" podID="14dc5c8c-6e58-4d35-b24b-ba560c404bb5" containerID="809c7cda81f2aa029b6a97e84a619f41e221d9830ec14fe460f68123cb27ce0a" exitCode=0 Dec 03 11:56:48 crc kubenswrapper[4702]: I1203 11:56:48.232221 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jfftz" event={"ID":"14dc5c8c-6e58-4d35-b24b-ba560c404bb5","Type":"ContainerDied","Data":"809c7cda81f2aa029b6a97e84a619f41e221d9830ec14fe460f68123cb27ce0a"} Dec 03 11:56:48 crc kubenswrapper[4702]: I1203 11:56:48.428293 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jfftz" Dec 03 11:56:48 crc kubenswrapper[4702]: I1203 11:56:48.431468 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14dc5c8c-6e58-4d35-b24b-ba560c404bb5-catalog-content\") pod \"14dc5c8c-6e58-4d35-b24b-ba560c404bb5\" (UID: \"14dc5c8c-6e58-4d35-b24b-ba560c404bb5\") " Dec 03 11:56:48 crc kubenswrapper[4702]: I1203 11:56:48.431657 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14dc5c8c-6e58-4d35-b24b-ba560c404bb5-utilities\") pod \"14dc5c8c-6e58-4d35-b24b-ba560c404bb5\" (UID: \"14dc5c8c-6e58-4d35-b24b-ba560c404bb5\") " Dec 03 11:56:48 crc kubenswrapper[4702]: I1203 11:56:48.431696 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npcbk\" (UniqueName: \"kubernetes.io/projected/14dc5c8c-6e58-4d35-b24b-ba560c404bb5-kube-api-access-npcbk\") pod \"14dc5c8c-6e58-4d35-b24b-ba560c404bb5\" (UID: \"14dc5c8c-6e58-4d35-b24b-ba560c404bb5\") " Dec 03 11:56:48 crc kubenswrapper[4702]: I1203 11:56:48.432613 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14dc5c8c-6e58-4d35-b24b-ba560c404bb5-utilities" (OuterVolumeSpecName: "utilities") pod "14dc5c8c-6e58-4d35-b24b-ba560c404bb5" (UID: "14dc5c8c-6e58-4d35-b24b-ba560c404bb5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:56:48 crc kubenswrapper[4702]: I1203 11:56:48.434045 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14dc5c8c-6e58-4d35-b24b-ba560c404bb5-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:56:48 crc kubenswrapper[4702]: I1203 11:56:48.439786 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14dc5c8c-6e58-4d35-b24b-ba560c404bb5-kube-api-access-npcbk" (OuterVolumeSpecName: "kube-api-access-npcbk") pod "14dc5c8c-6e58-4d35-b24b-ba560c404bb5" (UID: "14dc5c8c-6e58-4d35-b24b-ba560c404bb5"). InnerVolumeSpecName "kube-api-access-npcbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:56:48 crc kubenswrapper[4702]: I1203 11:56:48.458398 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14dc5c8c-6e58-4d35-b24b-ba560c404bb5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "14dc5c8c-6e58-4d35-b24b-ba560c404bb5" (UID: "14dc5c8c-6e58-4d35-b24b-ba560c404bb5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:56:48 crc kubenswrapper[4702]: I1203 11:56:48.536663 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14dc5c8c-6e58-4d35-b24b-ba560c404bb5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:56:48 crc kubenswrapper[4702]: I1203 11:56:48.537261 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npcbk\" (UniqueName: \"kubernetes.io/projected/14dc5c8c-6e58-4d35-b24b-ba560c404bb5-kube-api-access-npcbk\") on node \"crc\" DevicePath \"\"" Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.041737 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tbm5x"] Dec 03 11:56:49 crc kubenswrapper[4702]: E1203 11:56:49.043039 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14dc5c8c-6e58-4d35-b24b-ba560c404bb5" containerName="extract-utilities" Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.043088 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="14dc5c8c-6e58-4d35-b24b-ba560c404bb5" containerName="extract-utilities" Dec 03 11:56:49 crc kubenswrapper[4702]: E1203 11:56:49.043107 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14dc5c8c-6e58-4d35-b24b-ba560c404bb5" containerName="registry-server" Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.043120 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="14dc5c8c-6e58-4d35-b24b-ba560c404bb5" containerName="registry-server" Dec 03 11:56:49 crc kubenswrapper[4702]: E1203 11:56:49.043191 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14dc5c8c-6e58-4d35-b24b-ba560c404bb5" containerName="extract-content" Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.043201 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="14dc5c8c-6e58-4d35-b24b-ba560c404bb5" containerName="extract-content" Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.043515 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="14dc5c8c-6e58-4d35-b24b-ba560c404bb5" containerName="registry-server" Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.051941 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tbm5x" Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.058289 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tbm5x"] Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.150544 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c3cdd23-5d62-494a-a075-21087c238711-utilities\") pod \"redhat-operators-tbm5x\" (UID: \"9c3cdd23-5d62-494a-a075-21087c238711\") " pod="openshift-marketplace/redhat-operators-tbm5x" Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.150618 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvjdj\" (UniqueName: \"kubernetes.io/projected/9c3cdd23-5d62-494a-a075-21087c238711-kube-api-access-rvjdj\") pod \"redhat-operators-tbm5x\" (UID: \"9c3cdd23-5d62-494a-a075-21087c238711\") " pod="openshift-marketplace/redhat-operators-tbm5x" Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.151211 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c3cdd23-5d62-494a-a075-21087c238711-catalog-content\") pod \"redhat-operators-tbm5x\" (UID: \"9c3cdd23-5d62-494a-a075-21087c238711\") " pod="openshift-marketplace/redhat-operators-tbm5x" Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.246576 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jfftz" event={"ID":"14dc5c8c-6e58-4d35-b24b-ba560c404bb5","Type":"ContainerDied","Data":"c42ce006182ddca20deb7a3fca66d890ea1d79b762d715ea77def0b39ba4c7b9"} Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.246630 4702 scope.go:117] "RemoveContainer" containerID="809c7cda81f2aa029b6a97e84a619f41e221d9830ec14fe460f68123cb27ce0a" Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.247730 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jfftz" Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.263007 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c3cdd23-5d62-494a-a075-21087c238711-utilities\") pod \"redhat-operators-tbm5x\" (UID: \"9c3cdd23-5d62-494a-a075-21087c238711\") " pod="openshift-marketplace/redhat-operators-tbm5x" Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.263215 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvjdj\" (UniqueName: \"kubernetes.io/projected/9c3cdd23-5d62-494a-a075-21087c238711-kube-api-access-rvjdj\") pod \"redhat-operators-tbm5x\" (UID: \"9c3cdd23-5d62-494a-a075-21087c238711\") " pod="openshift-marketplace/redhat-operators-tbm5x" Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.263696 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c3cdd23-5d62-494a-a075-21087c238711-utilities\") pod \"redhat-operators-tbm5x\" (UID: \"9c3cdd23-5d62-494a-a075-21087c238711\") " pod="openshift-marketplace/redhat-operators-tbm5x" Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.263717 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c3cdd23-5d62-494a-a075-21087c238711-catalog-content\") pod \"redhat-operators-tbm5x\" (UID: \"9c3cdd23-5d62-494a-a075-21087c238711\") " pod="openshift-marketplace/redhat-operators-tbm5x" Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.264074 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c3cdd23-5d62-494a-a075-21087c238711-catalog-content\") pod \"redhat-operators-tbm5x\" (UID: \"9c3cdd23-5d62-494a-a075-21087c238711\") " pod="openshift-marketplace/redhat-operators-tbm5x" Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.304768 4702 scope.go:117] "RemoveContainer" containerID="21d0458476f8e5214b9debd3dc4a484ef438a00a646b2a346214a5a18d2f5907" Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.304994 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jfftz"] Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.318171 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvjdj\" (UniqueName: \"kubernetes.io/projected/9c3cdd23-5d62-494a-a075-21087c238711-kube-api-access-rvjdj\") pod \"redhat-operators-tbm5x\" (UID: \"9c3cdd23-5d62-494a-a075-21087c238711\") " pod="openshift-marketplace/redhat-operators-tbm5x" Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.322311 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jfftz"] Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.368083 4702 scope.go:117] "RemoveContainer" containerID="7b5f403430d2fe5fdcc5c951700a82b58633761a36db0ad2325e679df01d280d" Dec 03 11:56:49 crc kubenswrapper[4702]: I1203 11:56:49.380200 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tbm5x" Dec 03 11:56:50 crc kubenswrapper[4702]: I1203 11:56:50.100396 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tbm5x"] Dec 03 11:56:50 crc kubenswrapper[4702]: I1203 11:56:50.263481 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbm5x" event={"ID":"9c3cdd23-5d62-494a-a075-21087c238711","Type":"ContainerStarted","Data":"4944ebb02b693e3b0b3fc5b9c09ae84a040110eb721c9fcc21be793802cdf393"} Dec 03 11:56:50 crc kubenswrapper[4702]: I1203 11:56:50.950292 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14dc5c8c-6e58-4d35-b24b-ba560c404bb5" path="/var/lib/kubelet/pods/14dc5c8c-6e58-4d35-b24b-ba560c404bb5/volumes" Dec 03 11:56:51 crc kubenswrapper[4702]: I1203 11:56:51.277013 4702 generic.go:334] "Generic (PLEG): container finished" podID="9c3cdd23-5d62-494a-a075-21087c238711" containerID="94f7372ce7055a1ecb5b46a6b6f47a7478fa109abc9005478081584b05b75f46" exitCode=0 Dec 03 11:56:51 crc kubenswrapper[4702]: I1203 11:56:51.277065 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbm5x" event={"ID":"9c3cdd23-5d62-494a-a075-21087c238711","Type":"ContainerDied","Data":"94f7372ce7055a1ecb5b46a6b6f47a7478fa109abc9005478081584b05b75f46"} Dec 03 11:56:51 crc kubenswrapper[4702]: I1203 11:56:51.928719 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" Dec 03 11:56:51 crc kubenswrapper[4702]: E1203 11:56:51.929079 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:56:53 crc kubenswrapper[4702]: I1203 11:56:53.308316 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbm5x" event={"ID":"9c3cdd23-5d62-494a-a075-21087c238711","Type":"ContainerStarted","Data":"a6d91d95953ac16d7f95bf5a58c5152b898d42e94d3f1497cda2cd616e5cbef9"} Dec 03 11:56:57 crc kubenswrapper[4702]: I1203 11:56:57.359226 4702 generic.go:334] "Generic (PLEG): container finished" podID="9c3cdd23-5d62-494a-a075-21087c238711" containerID="a6d91d95953ac16d7f95bf5a58c5152b898d42e94d3f1497cda2cd616e5cbef9" exitCode=0 Dec 03 11:56:57 crc kubenswrapper[4702]: I1203 11:56:57.359295 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbm5x" event={"ID":"9c3cdd23-5d62-494a-a075-21087c238711","Type":"ContainerDied","Data":"a6d91d95953ac16d7f95bf5a58c5152b898d42e94d3f1497cda2cd616e5cbef9"} Dec 03 11:56:58 crc kubenswrapper[4702]: I1203 11:56:58.373404 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbm5x" event={"ID":"9c3cdd23-5d62-494a-a075-21087c238711","Type":"ContainerStarted","Data":"5a45f90afc9437f82176d063c628100dd4f91637754d0ae27f67c50087ca7729"} Dec 03 11:56:58 crc kubenswrapper[4702]: I1203 11:56:58.403986 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tbm5x" podStartSLOduration=2.8451088220000003 podStartE2EDuration="9.403962541s" 
podCreationTimestamp="2025-12-03 11:56:49 +0000 UTC" firstStartedPulling="2025-12-03 11:56:51.27965945 +0000 UTC m=+3195.115587914" lastFinishedPulling="2025-12-03 11:56:57.838513179 +0000 UTC m=+3201.674441633" observedRunningTime="2025-12-03 11:56:58.392028991 +0000 UTC m=+3202.227957485" watchObservedRunningTime="2025-12-03 11:56:58.403962541 +0000 UTC m=+3202.239891005" Dec 03 11:56:59 crc kubenswrapper[4702]: I1203 11:56:59.381130 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tbm5x" Dec 03 11:56:59 crc kubenswrapper[4702]: I1203 11:56:59.381504 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tbm5x" Dec 03 11:57:00 crc kubenswrapper[4702]: I1203 11:57:00.439961 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tbm5x" podUID="9c3cdd23-5d62-494a-a075-21087c238711" containerName="registry-server" probeResult="failure" output=< Dec 03 11:57:00 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 11:57:00 crc kubenswrapper[4702]: > Dec 03 11:57:03 crc kubenswrapper[4702]: I1203 11:57:03.928196 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" Dec 03 11:57:03 crc kubenswrapper[4702]: E1203 11:57:03.929563 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:57:09 crc kubenswrapper[4702]: I1203 11:57:09.444015 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tbm5x" Dec 03 11:57:09 crc kubenswrapper[4702]: I1203 11:57:09.543254 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tbm5x" Dec 03 11:57:09 crc kubenswrapper[4702]: I1203 11:57:09.687495 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tbm5x"] Dec 03 11:57:10 crc kubenswrapper[4702]: I1203 11:57:10.540684 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tbm5x" podUID="9c3cdd23-5d62-494a-a075-21087c238711" containerName="registry-server" containerID="cri-o://5a45f90afc9437f82176d063c628100dd4f91637754d0ae27f67c50087ca7729" gracePeriod=2 Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.247526 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tbm5x" Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.395680 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvjdj\" (UniqueName: \"kubernetes.io/projected/9c3cdd23-5d62-494a-a075-21087c238711-kube-api-access-rvjdj\") pod \"9c3cdd23-5d62-494a-a075-21087c238711\" (UID: \"9c3cdd23-5d62-494a-a075-21087c238711\") " Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.398373 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c3cdd23-5d62-494a-a075-21087c238711-utilities\") pod \"9c3cdd23-5d62-494a-a075-21087c238711\" (UID: \"9c3cdd23-5d62-494a-a075-21087c238711\") " Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.399817 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c3cdd23-5d62-494a-a075-21087c238711-utilities" (OuterVolumeSpecName: "utilities") pod "9c3cdd23-5d62-494a-a075-21087c238711" (UID: "9c3cdd23-5d62-494a-a075-21087c238711"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.400140 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c3cdd23-5d62-494a-a075-21087c238711-catalog-content\") pod \"9c3cdd23-5d62-494a-a075-21087c238711\" (UID: \"9c3cdd23-5d62-494a-a075-21087c238711\") " Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.404101 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c3cdd23-5d62-494a-a075-21087c238711-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.405839 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c3cdd23-5d62-494a-a075-21087c238711-kube-api-access-rvjdj" (OuterVolumeSpecName: "kube-api-access-rvjdj") pod "9c3cdd23-5d62-494a-a075-21087c238711" (UID: "9c3cdd23-5d62-494a-a075-21087c238711"). InnerVolumeSpecName "kube-api-access-rvjdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.506807 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvjdj\" (UniqueName: \"kubernetes.io/projected/9c3cdd23-5d62-494a-a075-21087c238711-kube-api-access-rvjdj\") on node \"crc\" DevicePath \"\"" Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.532554 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c3cdd23-5d62-494a-a075-21087c238711-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c3cdd23-5d62-494a-a075-21087c238711" (UID: "9c3cdd23-5d62-494a-a075-21087c238711"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.555977 4702 generic.go:334] "Generic (PLEG): container finished" podID="9c3cdd23-5d62-494a-a075-21087c238711" containerID="5a45f90afc9437f82176d063c628100dd4f91637754d0ae27f67c50087ca7729" exitCode=0 Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.556066 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tbm5x" Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.556044 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbm5x" event={"ID":"9c3cdd23-5d62-494a-a075-21087c238711","Type":"ContainerDied","Data":"5a45f90afc9437f82176d063c628100dd4f91637754d0ae27f67c50087ca7729"} Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.556242 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tbm5x" event={"ID":"9c3cdd23-5d62-494a-a075-21087c238711","Type":"ContainerDied","Data":"4944ebb02b693e3b0b3fc5b9c09ae84a040110eb721c9fcc21be793802cdf393"} Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.556267 4702 scope.go:117] "RemoveContainer" containerID="5a45f90afc9437f82176d063c628100dd4f91637754d0ae27f67c50087ca7729" Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.599993 4702 scope.go:117] "RemoveContainer" containerID="a6d91d95953ac16d7f95bf5a58c5152b898d42e94d3f1497cda2cd616e5cbef9" Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.609327 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c3cdd23-5d62-494a-a075-21087c238711-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.610556 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tbm5x"] Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.622518 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tbm5x"] Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.634106 4702 scope.go:117] "RemoveContainer" containerID="94f7372ce7055a1ecb5b46a6b6f47a7478fa109abc9005478081584b05b75f46" Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.688814 4702 scope.go:117] "RemoveContainer" containerID="5a45f90afc9437f82176d063c628100dd4f91637754d0ae27f67c50087ca7729" Dec 03 11:57:11 crc kubenswrapper[4702]: E1203 11:57:11.689350 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a45f90afc9437f82176d063c628100dd4f91637754d0ae27f67c50087ca7729\": container with ID starting with 5a45f90afc9437f82176d063c628100dd4f91637754d0ae27f67c50087ca7729 not found: ID does not exist" containerID="5a45f90afc9437f82176d063c628100dd4f91637754d0ae27f67c50087ca7729" Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.689401 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a45f90afc9437f82176d063c628100dd4f91637754d0ae27f67c50087ca7729"} err="failed to get container status \"5a45f90afc9437f82176d063c628100dd4f91637754d0ae27f67c50087ca7729\": rpc error: code = NotFound desc = could not find container \"5a45f90afc9437f82176d063c628100dd4f91637754d0ae27f67c50087ca7729\": container with ID starting with 5a45f90afc9437f82176d063c628100dd4f91637754d0ae27f67c50087ca7729 not found: ID does not exist" Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.689446 4702 scope.go:117] "RemoveContainer" containerID="a6d91d95953ac16d7f95bf5a58c5152b898d42e94d3f1497cda2cd616e5cbef9" Dec 03 11:57:11 crc kubenswrapper[4702]: E1203 11:57:11.690051 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6d91d95953ac16d7f95bf5a58c5152b898d42e94d3f1497cda2cd616e5cbef9\": container with ID 
starting with a6d91d95953ac16d7f95bf5a58c5152b898d42e94d3f1497cda2cd616e5cbef9 not found: ID does not exist" containerID="a6d91d95953ac16d7f95bf5a58c5152b898d42e94d3f1497cda2cd616e5cbef9" Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.690105 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6d91d95953ac16d7f95bf5a58c5152b898d42e94d3f1497cda2cd616e5cbef9"} err="failed to get container status \"a6d91d95953ac16d7f95bf5a58c5152b898d42e94d3f1497cda2cd616e5cbef9\": rpc error: code = NotFound desc = could not find container \"a6d91d95953ac16d7f95bf5a58c5152b898d42e94d3f1497cda2cd616e5cbef9\": container with ID starting with a6d91d95953ac16d7f95bf5a58c5152b898d42e94d3f1497cda2cd616e5cbef9 not found: ID does not exist" Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.690140 4702 scope.go:117] "RemoveContainer" containerID="94f7372ce7055a1ecb5b46a6b6f47a7478fa109abc9005478081584b05b75f46" Dec 03 11:57:11 crc kubenswrapper[4702]: E1203 11:57:11.690503 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94f7372ce7055a1ecb5b46a6b6f47a7478fa109abc9005478081584b05b75f46\": container with ID starting with 94f7372ce7055a1ecb5b46a6b6f47a7478fa109abc9005478081584b05b75f46 not found: ID does not exist" containerID="94f7372ce7055a1ecb5b46a6b6f47a7478fa109abc9005478081584b05b75f46" Dec 03 11:57:11 crc kubenswrapper[4702]: I1203 11:57:11.690534 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94f7372ce7055a1ecb5b46a6b6f47a7478fa109abc9005478081584b05b75f46"} err="failed to get container status \"94f7372ce7055a1ecb5b46a6b6f47a7478fa109abc9005478081584b05b75f46\": rpc error: code = NotFound desc = could not find container \"94f7372ce7055a1ecb5b46a6b6f47a7478fa109abc9005478081584b05b75f46\": container with ID starting with 94f7372ce7055a1ecb5b46a6b6f47a7478fa109abc9005478081584b05b75f46 not found: ID does not exist" Dec 03 11:57:12 crc kubenswrapper[4702]: I1203 11:57:12.963219 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c3cdd23-5d62-494a-a075-21087c238711" path="/var/lib/kubelet/pods/9c3cdd23-5d62-494a-a075-21087c238711/volumes" Dec 03 11:57:16 crc kubenswrapper[4702]: I1203 11:57:16.929625 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" Dec 03 11:57:16 crc kubenswrapper[4702]: E1203 11:57:16.931289 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:57:30 crc kubenswrapper[4702]: I1203 11:57:30.929461 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" Dec 03 11:57:30 crc kubenswrapper[4702]: E1203 11:57:30.930478 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:57:44 crc kubenswrapper[4702]: I1203 11:57:44.937406 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" Dec 03 11:57:44 crc kubenswrapper[4702]: E1203 11:57:44.938554 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:57:57 crc kubenswrapper[4702]: I1203 11:57:57.928238 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" Dec 03 11:57:57 crc kubenswrapper[4702]: E1203 11:57:57.930252 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:58:07 crc kubenswrapper[4702]: I1203 11:58:07.459293 4702 generic.go:334] "Generic (PLEG): container finished" podID="4332bde3-4d31-498c-8fd3-d1bc3d9e3794" containerID="166118efb40a7e199d71d54fe4622b69eb6aa799b2f359d3aaf8aaa5a557e54d" exitCode=0 Dec 03 11:58:07 crc kubenswrapper[4702]: I1203 11:58:07.459891 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" event={"ID":"4332bde3-4d31-498c-8fd3-d1bc3d9e3794","Type":"ContainerDied","Data":"166118efb40a7e199d71d54fe4622b69eb6aa799b2f359d3aaf8aaa5a557e54d"} Dec 03 11:58:08 crc kubenswrapper[4702]: I1203 11:58:08.995481 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.021869 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-cell1-compute-config-0\") pod \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.022217 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-ssh-key\") pod \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.022378 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-inventory\") pod \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.022546 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-migration-ssh-key-0\") pod \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.023070 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-migration-ssh-key-1\") pod \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.023188 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-cell1-compute-config-1\") pod \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.023481 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-combined-ca-bundle\") pod \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.023643 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-extra-config-0\") pod \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.023811 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-265fw\" (UniqueName: \"kubernetes.io/projected/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-kube-api-access-265fw\") pod \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\" (UID: \"4332bde3-4d31-498c-8fd3-d1bc3d9e3794\") " Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.050264 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "4332bde3-4d31-498c-8fd3-d1bc3d9e3794" (UID: "4332bde3-4d31-498c-8fd3-d1bc3d9e3794"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.054877 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-kube-api-access-265fw" (OuterVolumeSpecName: "kube-api-access-265fw") pod "4332bde3-4d31-498c-8fd3-d1bc3d9e3794" (UID: "4332bde3-4d31-498c-8fd3-d1bc3d9e3794"). InnerVolumeSpecName "kube-api-access-265fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.063797 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-inventory" (OuterVolumeSpecName: "inventory") pod "4332bde3-4d31-498c-8fd3-d1bc3d9e3794" (UID: "4332bde3-4d31-498c-8fd3-d1bc3d9e3794"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.068729 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "4332bde3-4d31-498c-8fd3-d1bc3d9e3794" (UID: "4332bde3-4d31-498c-8fd3-d1bc3d9e3794"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.071655 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4332bde3-4d31-498c-8fd3-d1bc3d9e3794" (UID: "4332bde3-4d31-498c-8fd3-d1bc3d9e3794"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.080499 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "4332bde3-4d31-498c-8fd3-d1bc3d9e3794" (UID: "4332bde3-4d31-498c-8fd3-d1bc3d9e3794"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.087002 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "4332bde3-4d31-498c-8fd3-d1bc3d9e3794" (UID: "4332bde3-4d31-498c-8fd3-d1bc3d9e3794"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.087063 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "4332bde3-4d31-498c-8fd3-d1bc3d9e3794" (UID: "4332bde3-4d31-498c-8fd3-d1bc3d9e3794"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.095002 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "4332bde3-4d31-498c-8fd3-d1bc3d9e3794" (UID: "4332bde3-4d31-498c-8fd3-d1bc3d9e3794"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.128332 4702 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.128682 4702 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.128784 4702 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.128854 4702 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.128916 4702 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.128979 4702 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.129052 4702 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.129123 4702 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.129200 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-265fw\" (UniqueName: \"kubernetes.io/projected/4332bde3-4d31-498c-8fd3-d1bc3d9e3794-kube-api-access-265fw\") on node \"crc\" DevicePath \"\"" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.490050 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" event={"ID":"4332bde3-4d31-498c-8fd3-d1bc3d9e3794","Type":"ContainerDied","Data":"a794a3621b11d5db0b448a7ee9b756c0ddaa5c9e379aa139ff9f064708d1f6fd"} Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.490156 4702 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="a794a3621b11d5db0b448a7ee9b756c0ddaa5c9e379aa139ff9f064708d1f6fd" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.490193 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lftxr" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.609447 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj"] Dec 03 11:58:09 crc kubenswrapper[4702]: E1203 11:58:09.610250 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c3cdd23-5d62-494a-a075-21087c238711" containerName="registry-server" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.610336 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c3cdd23-5d62-494a-a075-21087c238711" containerName="registry-server" Dec 03 11:58:09 crc kubenswrapper[4702]: E1203 11:58:09.610425 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c3cdd23-5d62-494a-a075-21087c238711" containerName="extract-utilities" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.610482 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c3cdd23-5d62-494a-a075-21087c238711" containerName="extract-utilities" Dec 03 11:58:09 crc kubenswrapper[4702]: E1203 11:58:09.610546 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4332bde3-4d31-498c-8fd3-d1bc3d9e3794" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.610635 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="4332bde3-4d31-498c-8fd3-d1bc3d9e3794" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 03 11:58:09 crc kubenswrapper[4702]: E1203 11:58:09.610700 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c3cdd23-5d62-494a-a075-21087c238711" containerName="extract-content" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.610754 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c3cdd23-5d62-494a-a075-21087c238711" containerName="extract-content" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.611097 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="4332bde3-4d31-498c-8fd3-d1bc3d9e3794" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.611168 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c3cdd23-5d62-494a-a075-21087c238711" containerName="registry-server" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.612156 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.614804 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.615122 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.615306 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.615506 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zmjkp" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.626475 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.630355 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj"] Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.652508 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.652783 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.652898 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crcqs\" (UniqueName: \"kubernetes.io/projected/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-kube-api-access-crcqs\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.653076 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.653291 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 
crc kubenswrapper[4702]: I1203 11:58:09.653393 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.653495 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.755738 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.755840 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.755894 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.755978 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.755999 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.756039 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crcqs\" (UniqueName: \"kubernetes.io/projected/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-kube-api-access-crcqs\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj\" (UID: 
\"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.756142 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.761550 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.761749 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.762233 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.762408 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.762582 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.765654 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.775921 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crcqs\" (UniqueName: \"kubernetes.io/projected/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-kube-api-access-crcqs\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj\" (UID: 
\"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.929496 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 11:58:09 crc kubenswrapper[4702]: I1203 11:58:09.937823 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" Dec 03 11:58:09 crc kubenswrapper[4702]: E1203 11:58:09.938602 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:58:10 crc kubenswrapper[4702]: I1203 11:58:10.509792 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj"] Dec 03 11:58:11 crc kubenswrapper[4702]: I1203 11:58:11.517310 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" event={"ID":"660e4478-3f02-41ec-9e7d-4cd6067ec6cf","Type":"ContainerStarted","Data":"21769c4132dd0567a3dcd32d9eac4538d5250cfd939e984753d177e097bb43c5"} Dec 03 11:58:15 crc kubenswrapper[4702]: I1203 11:58:15.573706 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" event={"ID":"660e4478-3f02-41ec-9e7d-4cd6067ec6cf","Type":"ContainerStarted","Data":"9084ceebe252dd7aa4d427a2137ed7ee2350e6d3110784c20b720e411e584e6f"} Dec 03 11:58:15 crc kubenswrapper[4702]: I1203 11:58:15.605899 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" podStartSLOduration=2.642686389 podStartE2EDuration="6.605859155s" podCreationTimestamp="2025-12-03 11:58:09 +0000 UTC" firstStartedPulling="2025-12-03 11:58:10.525072824 +0000 UTC m=+3274.361001288" lastFinishedPulling="2025-12-03 11:58:14.48824559 +0000 UTC m=+3278.324174054" observedRunningTime="2025-12-03 11:58:15.600068569 +0000 UTC m=+3279.435997053" watchObservedRunningTime="2025-12-03 11:58:15.605859155 +0000 UTC m=+3279.441787619" Dec 03 11:58:22 crc kubenswrapper[4702]: I1203 11:58:22.938820 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" Dec 03 11:58:22 crc kubenswrapper[4702]: E1203 11:58:22.941031 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:58:36 crc kubenswrapper[4702]: I1203 11:58:36.939912 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" Dec 03 11:58:36 crc kubenswrapper[4702]: E1203 11:58:36.940791 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:58:48 crc kubenswrapper[4702]: I1203 11:58:48.929159 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" Dec 03 11:58:48 crc kubenswrapper[4702]: E1203 11:58:48.929951 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:59:00 crc kubenswrapper[4702]: I1203 11:59:00.928188 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" Dec 03 11:59:00 crc kubenswrapper[4702]: E1203 11:59:00.929014 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:59:14 crc kubenswrapper[4702]: I1203 11:59:14.929014 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" Dec 03 11:59:14 crc kubenswrapper[4702]: E1203 11:59:14.930156 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:59:30 crc kubenswrapper[4702]: I1203 11:59:30.021681 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" Dec 03 11:59:30 crc kubenswrapper[4702]: E1203 11:59:30.023235 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:59:43 crc kubenswrapper[4702]: I1203 11:59:43.928468 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" Dec 03 11:59:43 crc kubenswrapper[4702]: E1203 11:59:43.929374 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 11:59:57 crc kubenswrapper[4702]: I1203 11:59:57.929434 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" Dec 03 11:59:57 crc kubenswrapper[4702]: E1203 11:59:57.930494 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:00:00 crc kubenswrapper[4702]: I1203 12:00:00.154550 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412720-sbcgx"] Dec 03 12:00:00 crc kubenswrapper[4702]: I1203 12:00:00.157110 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412720-sbcgx" Dec 03 12:00:00 crc kubenswrapper[4702]: I1203 12:00:00.159268 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbngm\" (UniqueName: \"kubernetes.io/projected/df96ec35-4a73-475a-b3b6-7e08dbfddc4d-kube-api-access-gbngm\") pod \"collect-profiles-29412720-sbcgx\" (UID: \"df96ec35-4a73-475a-b3b6-7e08dbfddc4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412720-sbcgx" Dec 03 12:00:00 crc kubenswrapper[4702]: I1203 12:00:00.159325 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df96ec35-4a73-475a-b3b6-7e08dbfddc4d-config-volume\") pod \"collect-profiles-29412720-sbcgx\" (UID: \"df96ec35-4a73-475a-b3b6-7e08dbfddc4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412720-sbcgx" Dec 03 12:00:00 crc kubenswrapper[4702]: I1203 12:00:00.159424 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df96ec35-4a73-475a-b3b6-7e08dbfddc4d-secret-volume\") pod \"collect-profiles-29412720-sbcgx\" (UID: \"df96ec35-4a73-475a-b3b6-7e08dbfddc4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412720-sbcgx" Dec 03 12:00:00 crc kubenswrapper[4702]: I1203 12:00:00.159627 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 12:00:00 crc kubenswrapper[4702]: I1203 12:00:00.160062 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 12:00:00 crc kubenswrapper[4702]: I1203 12:00:00.168103 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412720-sbcgx"] Dec 03 12:00:00 crc kubenswrapper[4702]: I1203 12:00:00.262297 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbngm\" (UniqueName: \"kubernetes.io/projected/df96ec35-4a73-475a-b3b6-7e08dbfddc4d-kube-api-access-gbngm\") pod \"collect-profiles-29412720-sbcgx\" (UID: \"df96ec35-4a73-475a-b3b6-7e08dbfddc4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412720-sbcgx" 
Dec 03 12:00:00 crc kubenswrapper[4702]: I1203 12:00:00.262355 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df96ec35-4a73-475a-b3b6-7e08dbfddc4d-config-volume\") pod \"collect-profiles-29412720-sbcgx\" (UID: \"df96ec35-4a73-475a-b3b6-7e08dbfddc4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412720-sbcgx" Dec 03 12:00:00 crc kubenswrapper[4702]: I1203 12:00:00.262426 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df96ec35-4a73-475a-b3b6-7e08dbfddc4d-secret-volume\") pod \"collect-profiles-29412720-sbcgx\" (UID: \"df96ec35-4a73-475a-b3b6-7e08dbfddc4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412720-sbcgx" Dec 03 12:00:00 crc kubenswrapper[4702]: I1203 12:00:00.263633 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df96ec35-4a73-475a-b3b6-7e08dbfddc4d-config-volume\") pod \"collect-profiles-29412720-sbcgx\" (UID: \"df96ec35-4a73-475a-b3b6-7e08dbfddc4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412720-sbcgx" Dec 03 12:00:00 crc kubenswrapper[4702]: I1203 12:00:00.270393 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df96ec35-4a73-475a-b3b6-7e08dbfddc4d-secret-volume\") pod \"collect-profiles-29412720-sbcgx\" (UID: \"df96ec35-4a73-475a-b3b6-7e08dbfddc4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412720-sbcgx" Dec 03 12:00:00 crc kubenswrapper[4702]: I1203 12:00:00.281843 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbngm\" (UniqueName: \"kubernetes.io/projected/df96ec35-4a73-475a-b3b6-7e08dbfddc4d-kube-api-access-gbngm\") pod \"collect-profiles-29412720-sbcgx\" (UID: \"df96ec35-4a73-475a-b3b6-7e08dbfddc4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412720-sbcgx" Dec 03 12:00:00 crc kubenswrapper[4702]: I1203 12:00:00.495547 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412720-sbcgx" Dec 03 12:00:01 crc kubenswrapper[4702]: I1203 12:00:01.022667 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412720-sbcgx"] Dec 03 12:00:01 crc kubenswrapper[4702]: I1203 12:00:01.753066 4702 generic.go:334] "Generic (PLEG): container finished" podID="df96ec35-4a73-475a-b3b6-7e08dbfddc4d" containerID="7f02b036b8ef2abf40c6c4b52e1f52a4696891c937740d00ca0f2e23f7cfbb8a" exitCode=0 Dec 03 12:00:01 crc kubenswrapper[4702]: I1203 12:00:01.753223 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412720-sbcgx" event={"ID":"df96ec35-4a73-475a-b3b6-7e08dbfddc4d","Type":"ContainerDied","Data":"7f02b036b8ef2abf40c6c4b52e1f52a4696891c937740d00ca0f2e23f7cfbb8a"} Dec 03 12:00:01 crc kubenswrapper[4702]: I1203 12:00:01.753411 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412720-sbcgx" event={"ID":"df96ec35-4a73-475a-b3b6-7e08dbfddc4d","Type":"ContainerStarted","Data":"b20f12f1e305a71c9514cd7c54ad0b264208f6d8a3b875b4f99568830e77a921"} Dec 03 12:00:03 crc kubenswrapper[4702]: I1203 12:00:03.277210 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412720-sbcgx" Dec 03 12:00:03 crc kubenswrapper[4702]: I1203 12:00:03.368020 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df96ec35-4a73-475a-b3b6-7e08dbfddc4d-secret-volume\") pod \"df96ec35-4a73-475a-b3b6-7e08dbfddc4d\" (UID: \"df96ec35-4a73-475a-b3b6-7e08dbfddc4d\") " Dec 03 12:00:03 crc kubenswrapper[4702]: I1203 12:00:03.368187 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df96ec35-4a73-475a-b3b6-7e08dbfddc4d-config-volume\") pod \"df96ec35-4a73-475a-b3b6-7e08dbfddc4d\" (UID: \"df96ec35-4a73-475a-b3b6-7e08dbfddc4d\") " Dec 03 12:00:03 crc kubenswrapper[4702]: I1203 12:00:03.368302 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbngm\" (UniqueName: \"kubernetes.io/projected/df96ec35-4a73-475a-b3b6-7e08dbfddc4d-kube-api-access-gbngm\") pod \"df96ec35-4a73-475a-b3b6-7e08dbfddc4d\" (UID: \"df96ec35-4a73-475a-b3b6-7e08dbfddc4d\") " Dec 03 12:00:03 crc kubenswrapper[4702]: I1203 12:00:03.369171 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df96ec35-4a73-475a-b3b6-7e08dbfddc4d-config-volume" (OuterVolumeSpecName: "config-volume") pod "df96ec35-4a73-475a-b3b6-7e08dbfddc4d" (UID: "df96ec35-4a73-475a-b3b6-7e08dbfddc4d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:00:03 crc kubenswrapper[4702]: I1203 12:00:03.377640 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df96ec35-4a73-475a-b3b6-7e08dbfddc4d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "df96ec35-4a73-475a-b3b6-7e08dbfddc4d" (UID: "df96ec35-4a73-475a-b3b6-7e08dbfddc4d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:00:03 crc kubenswrapper[4702]: I1203 12:00:03.377938 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df96ec35-4a73-475a-b3b6-7e08dbfddc4d-kube-api-access-gbngm" (OuterVolumeSpecName: "kube-api-access-gbngm") pod "df96ec35-4a73-475a-b3b6-7e08dbfddc4d" (UID: "df96ec35-4a73-475a-b3b6-7e08dbfddc4d"). InnerVolumeSpecName "kube-api-access-gbngm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:00:03 crc kubenswrapper[4702]: I1203 12:00:03.471017 4702 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df96ec35-4a73-475a-b3b6-7e08dbfddc4d-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 12:00:03 crc kubenswrapper[4702]: I1203 12:00:03.471066 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbngm\" (UniqueName: \"kubernetes.io/projected/df96ec35-4a73-475a-b3b6-7e08dbfddc4d-kube-api-access-gbngm\") on node \"crc\" DevicePath \"\"" Dec 03 12:00:03 crc kubenswrapper[4702]: I1203 12:00:03.471083 4702 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df96ec35-4a73-475a-b3b6-7e08dbfddc4d-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 12:00:03 crc kubenswrapper[4702]: I1203 12:00:03.784088 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412720-sbcgx" event={"ID":"df96ec35-4a73-475a-b3b6-7e08dbfddc4d","Type":"ContainerDied","Data":"b20f12f1e305a71c9514cd7c54ad0b264208f6d8a3b875b4f99568830e77a921"} Dec 03 12:00:03 crc kubenswrapper[4702]: I1203 12:00:03.784160 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412720-sbcgx" Dec 03 12:00:03 crc kubenswrapper[4702]: I1203 12:00:03.784179 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b20f12f1e305a71c9514cd7c54ad0b264208f6d8a3b875b4f99568830e77a921" Dec 03 12:00:04 crc kubenswrapper[4702]: I1203 12:00:04.362606 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412675-hzxt6"] Dec 03 12:00:04 crc kubenswrapper[4702]: I1203 12:00:04.375937 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412675-hzxt6"] Dec 03 12:00:04 crc kubenswrapper[4702]: I1203 12:00:04.981095 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da6b4602-6381-43c2-bc7b-df1f2ade0083" path="/var/lib/kubelet/pods/da6b4602-6381-43c2-bc7b-df1f2ade0083/volumes" Dec 03 12:00:05 crc kubenswrapper[4702]: I1203 12:00:05.291198 4702 scope.go:117] "RemoveContainer" containerID="7d50aa7b81f8a3da6bb3a35c53f74ba623b7253fe88bdba60e529c3182452457" Dec 03 12:00:12 crc kubenswrapper[4702]: I1203 12:00:12.929602 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" Dec 03 12:00:12 crc kubenswrapper[4702]: E1203 12:00:12.930533 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:00:27 crc kubenswrapper[4702]: I1203 12:00:27.928470 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" Dec 03 12:00:27 crc kubenswrapper[4702]: E1203 12:00:27.929320 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:00:42 crc kubenswrapper[4702]: I1203 12:00:42.935610 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" Dec 03 12:00:42 crc kubenswrapper[4702]: E1203 12:00:42.938414 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:00:46 crc kubenswrapper[4702]: I1203 12:00:46.544784 4702 generic.go:334] "Generic (PLEG): container finished" podID="660e4478-3f02-41ec-9e7d-4cd6067ec6cf" containerID="9084ceebe252dd7aa4d427a2137ed7ee2350e6d3110784c20b720e411e584e6f" exitCode=0 Dec 03 12:00:46 crc kubenswrapper[4702]: I1203 12:00:46.544873 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" event={"ID":"660e4478-3f02-41ec-9e7d-4cd6067ec6cf","Type":"ContainerDied","Data":"9084ceebe252dd7aa4d427a2137ed7ee2350e6d3110784c20b720e411e584e6f"} Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.349727 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.390341 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-inventory\") pod \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.390403 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crcqs\" (UniqueName: \"kubernetes.io/projected/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-kube-api-access-crcqs\") pod \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.390480 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ssh-key\") pod \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.390526 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ceilometer-compute-config-data-0\") pod \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.390834 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ceilometer-compute-config-data-2\") pod \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.390898 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-telemetry-combined-ca-bundle\") pod \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.390997 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ceilometer-compute-config-data-1\") pod \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.399682 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "660e4478-3f02-41ec-9e7d-4cd6067ec6cf" (UID: "660e4478-3f02-41ec-9e7d-4cd6067ec6cf"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.400823 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-kube-api-access-crcqs" (OuterVolumeSpecName: "kube-api-access-crcqs") pod "660e4478-3f02-41ec-9e7d-4cd6067ec6cf" (UID: "660e4478-3f02-41ec-9e7d-4cd6067ec6cf"). InnerVolumeSpecName "kube-api-access-crcqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.440049 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "660e4478-3f02-41ec-9e7d-4cd6067ec6cf" (UID: "660e4478-3f02-41ec-9e7d-4cd6067ec6cf"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.440174 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "660e4478-3f02-41ec-9e7d-4cd6067ec6cf" (UID: "660e4478-3f02-41ec-9e7d-4cd6067ec6cf"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.444637 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "660e4478-3f02-41ec-9e7d-4cd6067ec6cf" (UID: "660e4478-3f02-41ec-9e7d-4cd6067ec6cf"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:00:48 crc kubenswrapper[4702]: E1203 12:00:48.459099 4702 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ssh-key podName:660e4478-3f02-41ec-9e7d-4cd6067ec6cf nodeName:}" failed. No retries permitted until 2025-12-03 12:00:48.959052888 +0000 UTC m=+3432.794981372 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key" (UniqueName: "kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ssh-key") pod "660e4478-3f02-41ec-9e7d-4cd6067ec6cf" (UID: "660e4478-3f02-41ec-9e7d-4cd6067ec6cf") : error deleting /var/lib/kubelet/pods/660e4478-3f02-41ec-9e7d-4cd6067ec6cf/volume-subpaths: remove /var/lib/kubelet/pods/660e4478-3f02-41ec-9e7d-4cd6067ec6cf/volume-subpaths: no such file or directory Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.461161 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-inventory" (OuterVolumeSpecName: "inventory") pod "660e4478-3f02-41ec-9e7d-4cd6067ec6cf" (UID: "660e4478-3f02-41ec-9e7d-4cd6067ec6cf"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.495387 4702 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.495441 4702 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.495471 4702 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.495496 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crcqs\" (UniqueName: \"kubernetes.io/projected/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-kube-api-access-crcqs\") on node \"crc\" DevicePath \"\"" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.495518 4702 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.495536 4702 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.578351 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" event={"ID":"660e4478-3f02-41ec-9e7d-4cd6067ec6cf","Type":"ContainerDied","Data":"21769c4132dd0567a3dcd32d9eac4538d5250cfd939e984753d177e097bb43c5"} Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.578422 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.578428 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21769c4132dd0567a3dcd32d9eac4538d5250cfd939e984753d177e097bb43c5" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.712431 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm"] Dec 03 12:00:48 crc kubenswrapper[4702]: E1203 12:00:48.713215 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="660e4478-3f02-41ec-9e7d-4cd6067ec6cf" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.713248 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="660e4478-3f02-41ec-9e7d-4cd6067ec6cf" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 03 12:00:48 crc kubenswrapper[4702]: E1203 12:00:48.713281 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df96ec35-4a73-475a-b3b6-7e08dbfddc4d" containerName="collect-profiles" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.713289 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="df96ec35-4a73-475a-b3b6-7e08dbfddc4d" containerName="collect-profiles" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.713668 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="df96ec35-4a73-475a-b3b6-7e08dbfddc4d" containerName="collect-profiles" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.713712 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="660e4478-3f02-41ec-9e7d-4cd6067ec6cf" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.714996 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.722718 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.745538 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm"] Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.807155 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.807312 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvj2s\" (UniqueName: \"kubernetes.io/projected/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-kube-api-access-tvj2s\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.807351 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.807508 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.807709 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.807969 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.808188 4702 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.910926 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvj2s\" (UniqueName: \"kubernetes.io/projected/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-kube-api-access-tvj2s\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.910986 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.911042 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.911124 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.911178 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.911301 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.911357 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ceilometer-ipmi-config-data-2\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.916283 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.918667 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.919264 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ssh-key\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.919428 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.920174 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.927262 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:48 crc kubenswrapper[4702]: I1203 12:00:48.932511 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvj2s\" (UniqueName: \"kubernetes.io/projected/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-kube-api-access-tvj2s\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:49 crc kubenswrapper[4702]: I1203 12:00:49.013171 4702 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ssh-key\") pod \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\" (UID: \"660e4478-3f02-41ec-9e7d-4cd6067ec6cf\") " Dec 03 12:00:49 crc kubenswrapper[4702]: I1203 12:00:49.019782 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "660e4478-3f02-41ec-9e7d-4cd6067ec6cf" (UID: "660e4478-3f02-41ec-9e7d-4cd6067ec6cf"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:00:49 crc kubenswrapper[4702]: I1203 12:00:49.049491 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:00:49 crc kubenswrapper[4702]: I1203 12:00:49.116577 4702 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/660e4478-3f02-41ec-9e7d-4cd6067ec6cf-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 12:00:49 crc kubenswrapper[4702]: I1203 12:00:49.700055 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm"] Dec 03 12:00:49 crc kubenswrapper[4702]: I1203 12:00:49.709157 4702 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 12:00:50 crc kubenswrapper[4702]: I1203 12:00:50.612487 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" event={"ID":"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386","Type":"ContainerStarted","Data":"19c366272cbcecb89df1d9cbabdf18cbde73206ee2b0731222632abfde3aa4e5"} Dec 03 12:00:53 crc kubenswrapper[4702]: I1203 12:00:53.928385 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" Dec 03 12:00:53 crc kubenswrapper[4702]: E1203 12:00:53.929199 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:00:54 crc kubenswrapper[4702]: I1203 12:00:54.657724 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" event={"ID":"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386","Type":"ContainerStarted","Data":"6ce6fa0bae740c4b3b466d63ac61092ed8b90f05a6b171cc2afc769908aafb7b"} Dec 03 12:00:54 crc kubenswrapper[4702]: I1203 12:00:54.691360 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" podStartSLOduration=2.961335956 podStartE2EDuration="6.69130353s" podCreationTimestamp="2025-12-03 12:00:48 +0000 UTC" firstStartedPulling="2025-12-03 12:00:49.708819862 +0000 UTC m=+3433.544748316" lastFinishedPulling="2025-12-03 12:00:53.438787426 +0000 UTC m=+3437.274715890" observedRunningTime="2025-12-03 12:00:54.683090026 +0000 UTC m=+3438.519018490" watchObservedRunningTime="2025-12-03 12:00:54.69130353 +0000 UTC m=+3438.527231994" Dec 03 
Dec 03 12:01:00 crc kubenswrapper[4702]: I1203 12:01:00.135220 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29412721-m9zdz"]
Dec 03 12:01:00 crc kubenswrapper[4702]: I1203 12:01:00.137647 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412721-m9zdz"
Dec 03 12:01:00 crc kubenswrapper[4702]: I1203 12:01:00.173981 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412721-m9zdz"]
Dec 03 12:01:00 crc kubenswrapper[4702]: I1203 12:01:00.199690 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fbcr\" (UniqueName: \"kubernetes.io/projected/f925247e-4f37-4a2d-9873-0b68308d6e3c-kube-api-access-9fbcr\") pod \"keystone-cron-29412721-m9zdz\" (UID: \"f925247e-4f37-4a2d-9873-0b68308d6e3c\") " pod="openstack/keystone-cron-29412721-m9zdz"
Dec 03 12:01:00 crc kubenswrapper[4702]: I1203 12:01:00.200002 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f925247e-4f37-4a2d-9873-0b68308d6e3c-fernet-keys\") pod \"keystone-cron-29412721-m9zdz\" (UID: \"f925247e-4f37-4a2d-9873-0b68308d6e3c\") " pod="openstack/keystone-cron-29412721-m9zdz"
Dec 03 12:01:00 crc kubenswrapper[4702]: I1203 12:01:00.200285 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f925247e-4f37-4a2d-9873-0b68308d6e3c-combined-ca-bundle\") pod \"keystone-cron-29412721-m9zdz\" (UID: \"f925247e-4f37-4a2d-9873-0b68308d6e3c\") " pod="openstack/keystone-cron-29412721-m9zdz"
Dec 03 12:01:00 crc kubenswrapper[4702]: I1203 12:01:00.200357 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f925247e-4f37-4a2d-9873-0b68308d6e3c-config-data\") pod \"keystone-cron-29412721-m9zdz\" (UID: \"f925247e-4f37-4a2d-9873-0b68308d6e3c\") " pod="openstack/keystone-cron-29412721-m9zdz"
Dec 03 12:01:00 crc kubenswrapper[4702]: I1203 12:01:00.301863 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fbcr\" (UniqueName: \"kubernetes.io/projected/f925247e-4f37-4a2d-9873-0b68308d6e3c-kube-api-access-9fbcr\") pod \"keystone-cron-29412721-m9zdz\" (UID: \"f925247e-4f37-4a2d-9873-0b68308d6e3c\") " pod="openstack/keystone-cron-29412721-m9zdz"
Dec 03 12:01:00 crc kubenswrapper[4702]: I1203 12:01:00.301925 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f925247e-4f37-4a2d-9873-0b68308d6e3c-fernet-keys\") pod \"keystone-cron-29412721-m9zdz\" (UID: \"f925247e-4f37-4a2d-9873-0b68308d6e3c\") " pod="openstack/keystone-cron-29412721-m9zdz"
Dec 03 12:01:00 crc kubenswrapper[4702]: I1203 12:01:00.302019 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f925247e-4f37-4a2d-9873-0b68308d6e3c-combined-ca-bundle\") pod \"keystone-cron-29412721-m9zdz\" (UID: \"f925247e-4f37-4a2d-9873-0b68308d6e3c\") " pod="openstack/keystone-cron-29412721-m9zdz"
Dec 03 12:01:00 crc kubenswrapper[4702]: I1203 12:01:00.302044 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f925247e-4f37-4a2d-9873-0b68308d6e3c-config-data\") pod \"keystone-cron-29412721-m9zdz\" (UID: \"f925247e-4f37-4a2d-9873-0b68308d6e3c\") " pod="openstack/keystone-cron-29412721-m9zdz"
Dec 03 12:01:00 crc kubenswrapper[4702]: I1203 12:01:00.312133 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f925247e-4f37-4a2d-9873-0b68308d6e3c-fernet-keys\") pod \"keystone-cron-29412721-m9zdz\" (UID: \"f925247e-4f37-4a2d-9873-0b68308d6e3c\") " pod="openstack/keystone-cron-29412721-m9zdz"
Dec 03 12:01:00 crc kubenswrapper[4702]: I1203 12:01:00.317422 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f925247e-4f37-4a2d-9873-0b68308d6e3c-config-data\") pod \"keystone-cron-29412721-m9zdz\" (UID: \"f925247e-4f37-4a2d-9873-0b68308d6e3c\") " pod="openstack/keystone-cron-29412721-m9zdz"
Dec 03 12:01:00 crc kubenswrapper[4702]: I1203 12:01:00.318523 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f925247e-4f37-4a2d-9873-0b68308d6e3c-combined-ca-bundle\") pod \"keystone-cron-29412721-m9zdz\" (UID: \"f925247e-4f37-4a2d-9873-0b68308d6e3c\") " pod="openstack/keystone-cron-29412721-m9zdz"
Dec 03 12:01:00 crc kubenswrapper[4702]: I1203 12:01:00.332461 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fbcr\" (UniqueName: \"kubernetes.io/projected/f925247e-4f37-4a2d-9873-0b68308d6e3c-kube-api-access-9fbcr\") pod \"keystone-cron-29412721-m9zdz\" (UID: \"f925247e-4f37-4a2d-9873-0b68308d6e3c\") " pod="openstack/keystone-cron-29412721-m9zdz"
Dec 03 12:01:00 crc kubenswrapper[4702]: I1203 12:01:00.476636 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412721-m9zdz"
Dec 03 12:01:01 crc kubenswrapper[4702]: I1203 12:01:01.077240 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412721-m9zdz"]
Dec 03 12:01:01 crc kubenswrapper[4702]: I1203 12:01:01.794054 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412721-m9zdz" event={"ID":"f925247e-4f37-4a2d-9873-0b68308d6e3c","Type":"ContainerStarted","Data":"777f9c3f32d5d53c02a2968c3685834d89a8ea5ea25ea6d9ea214ab3beaca53c"}
Dec 03 12:01:01 crc kubenswrapper[4702]: I1203 12:01:01.794627 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412721-m9zdz" event={"ID":"f925247e-4f37-4a2d-9873-0b68308d6e3c","Type":"ContainerStarted","Data":"39e814d79480bd317faccaee0cc2ecef66e5c12a73dc26a3e1c6bc7ecbe27236"}
Dec 03 12:01:01 crc kubenswrapper[4702]: I1203 12:01:01.816403 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29412721-m9zdz" podStartSLOduration=1.816380025 podStartE2EDuration="1.816380025s" podCreationTimestamp="2025-12-03 12:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:01:01.812577866 +0000 UTC m=+3445.648506330" watchObservedRunningTime="2025-12-03 12:01:01.816380025 +0000 UTC m=+3445.652308489"
Dec 03 12:01:06 crc kubenswrapper[4702]: I1203 12:01:06.873719 4702 generic.go:334] "Generic (PLEG): container finished" podID="f925247e-4f37-4a2d-9873-0b68308d6e3c" containerID="777f9c3f32d5d53c02a2968c3685834d89a8ea5ea25ea6d9ea214ab3beaca53c" exitCode=0
Dec 03 12:01:06 crc kubenswrapper[4702]: I1203 12:01:06.873822 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412721-m9zdz" event={"ID":"f925247e-4f37-4a2d-9873-0b68308d6e3c","Type":"ContainerDied","Data":"777f9c3f32d5d53c02a2968c3685834d89a8ea5ea25ea6d9ea214ab3beaca53c"}
Dec 03 12:01:06 crc kubenswrapper[4702]: I1203 12:01:06.928580 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4"
Dec 03 12:01:06 crc kubenswrapper[4702]: E1203 12:01:06.929071 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
Dec 03 12:01:08 crc kubenswrapper[4702]: I1203 12:01:08.322102 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412721-m9zdz"
Dec 03 12:01:08 crc kubenswrapper[4702]: I1203 12:01:08.481403 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f925247e-4f37-4a2d-9873-0b68308d6e3c-config-data\") pod \"f925247e-4f37-4a2d-9873-0b68308d6e3c\" (UID: \"f925247e-4f37-4a2d-9873-0b68308d6e3c\") "
Dec 03 12:01:08 crc kubenswrapper[4702]: I1203 12:01:08.481577 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f925247e-4f37-4a2d-9873-0b68308d6e3c-combined-ca-bundle\") pod \"f925247e-4f37-4a2d-9873-0b68308d6e3c\" (UID: \"f925247e-4f37-4a2d-9873-0b68308d6e3c\") "
Dec 03 12:01:08 crc kubenswrapper[4702]: I1203 12:01:08.482461 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fbcr\" (UniqueName: \"kubernetes.io/projected/f925247e-4f37-4a2d-9873-0b68308d6e3c-kube-api-access-9fbcr\") pod \"f925247e-4f37-4a2d-9873-0b68308d6e3c\" (UID: \"f925247e-4f37-4a2d-9873-0b68308d6e3c\") "
Dec 03 12:01:08 crc kubenswrapper[4702]: I1203 12:01:08.482651 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f925247e-4f37-4a2d-9873-0b68308d6e3c-fernet-keys\") pod \"f925247e-4f37-4a2d-9873-0b68308d6e3c\" (UID: \"f925247e-4f37-4a2d-9873-0b68308d6e3c\") "
Dec 03 12:01:08 crc kubenswrapper[4702]: I1203 12:01:08.488106 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f925247e-4f37-4a2d-9873-0b68308d6e3c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f925247e-4f37-4a2d-9873-0b68308d6e3c" (UID: "f925247e-4f37-4a2d-9873-0b68308d6e3c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:01:08 crc kubenswrapper[4702]: I1203 12:01:08.488650 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f925247e-4f37-4a2d-9873-0b68308d6e3c-kube-api-access-9fbcr" (OuterVolumeSpecName: "kube-api-access-9fbcr") pod "f925247e-4f37-4a2d-9873-0b68308d6e3c" (UID: "f925247e-4f37-4a2d-9873-0b68308d6e3c"). InnerVolumeSpecName "kube-api-access-9fbcr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:01:08 crc kubenswrapper[4702]: I1203 12:01:08.526018 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f925247e-4f37-4a2d-9873-0b68308d6e3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f925247e-4f37-4a2d-9873-0b68308d6e3c" (UID: "f925247e-4f37-4a2d-9873-0b68308d6e3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:01:08 crc kubenswrapper[4702]: I1203 12:01:08.686735 4702 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f925247e-4f37-4a2d-9873-0b68308d6e3c-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 03 12:01:08 crc kubenswrapper[4702]: I1203 12:01:08.686803 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f925247e-4f37-4a2d-9873-0b68308d6e3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 12:01:08 crc kubenswrapper[4702]: I1203 12:01:08.686823 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fbcr\" (UniqueName: \"kubernetes.io/projected/f925247e-4f37-4a2d-9873-0b68308d6e3c-kube-api-access-9fbcr\") on node \"crc\" DevicePath \"\""
Dec 03 12:01:08 crc kubenswrapper[4702]: I1203 12:01:08.705090 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f925247e-4f37-4a2d-9873-0b68308d6e3c-config-data" (OuterVolumeSpecName: "config-data") pod "f925247e-4f37-4a2d-9873-0b68308d6e3c" (UID: "f925247e-4f37-4a2d-9873-0b68308d6e3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:01:08 crc kubenswrapper[4702]: I1203 12:01:08.788966 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f925247e-4f37-4a2d-9873-0b68308d6e3c-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 12:01:08 crc kubenswrapper[4702]: I1203 12:01:08.910116 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412721-m9zdz" event={"ID":"f925247e-4f37-4a2d-9873-0b68308d6e3c","Type":"ContainerDied","Data":"39e814d79480bd317faccaee0cc2ecef66e5c12a73dc26a3e1c6bc7ecbe27236"}
Dec 03 12:01:08 crc kubenswrapper[4702]: I1203 12:01:08.910213 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39e814d79480bd317faccaee0cc2ecef66e5c12a73dc26a3e1c6bc7ecbe27236"
Dec 03 12:01:08 crc kubenswrapper[4702]: I1203 12:01:08.910343 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412721-m9zdz"
Dec 03 12:01:18 crc kubenswrapper[4702]: I1203 12:01:18.928324 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4"
Dec 03 12:01:18 crc kubenswrapper[4702]: E1203 12:01:18.929143 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
Dec 03 12:01:31 crc kubenswrapper[4702]: I1203 12:01:31.007673 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4"
Dec 03 12:01:32 crc kubenswrapper[4702]: I1203 12:01:32.286520 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"9e338b8fada393470fcaf18444666db86bba061698bf1ca05450134fe66b5a3f"}
Dec 03 12:01:42 crc kubenswrapper[4702]: I1203 12:01:42.395482 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lznqq"]
Dec 03 12:01:42 crc kubenswrapper[4702]: E1203 12:01:42.396742 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f925247e-4f37-4a2d-9873-0b68308d6e3c" containerName="keystone-cron"
Dec 03 12:01:42 crc kubenswrapper[4702]: I1203 12:01:42.396789 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="f925247e-4f37-4a2d-9873-0b68308d6e3c" containerName="keystone-cron"
Dec 03 12:01:42 crc kubenswrapper[4702]: I1203 12:01:42.397152 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="f925247e-4f37-4a2d-9873-0b68308d6e3c" containerName="keystone-cron"
Dec 03 12:01:42 crc kubenswrapper[4702]: I1203 12:01:42.399656 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lznqq"
Need to start a new one" pod="openshift-marketplace/community-operators-lznqq" Dec 03 12:01:42 crc kubenswrapper[4702]: I1203 12:01:42.416023 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lznqq"] Dec 03 12:01:42 crc kubenswrapper[4702]: I1203 12:01:42.438228 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71df7a62-6ca2-4593-a958-74e48a83b47c-utilities\") pod \"community-operators-lznqq\" (UID: \"71df7a62-6ca2-4593-a958-74e48a83b47c\") " pod="openshift-marketplace/community-operators-lznqq" Dec 03 12:01:42 crc kubenswrapper[4702]: I1203 12:01:42.438473 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71df7a62-6ca2-4593-a958-74e48a83b47c-catalog-content\") pod \"community-operators-lznqq\" (UID: \"71df7a62-6ca2-4593-a958-74e48a83b47c\") " pod="openshift-marketplace/community-operators-lznqq" Dec 03 12:01:42 crc kubenswrapper[4702]: I1203 12:01:42.438641 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf6vb\" (UniqueName: \"kubernetes.io/projected/71df7a62-6ca2-4593-a958-74e48a83b47c-kube-api-access-jf6vb\") pod \"community-operators-lznqq\" (UID: \"71df7a62-6ca2-4593-a958-74e48a83b47c\") " pod="openshift-marketplace/community-operators-lznqq" Dec 03 12:01:42 crc kubenswrapper[4702]: I1203 12:01:42.540074 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf6vb\" (UniqueName: \"kubernetes.io/projected/71df7a62-6ca2-4593-a958-74e48a83b47c-kube-api-access-jf6vb\") pod \"community-operators-lznqq\" (UID: \"71df7a62-6ca2-4593-a958-74e48a83b47c\") " pod="openshift-marketplace/community-operators-lznqq" Dec 03 12:01:42 crc kubenswrapper[4702]: I1203 12:01:42.540263 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71df7a62-6ca2-4593-a958-74e48a83b47c-utilities\") pod \"community-operators-lznqq\" (UID: \"71df7a62-6ca2-4593-a958-74e48a83b47c\") " pod="openshift-marketplace/community-operators-lznqq" Dec 03 12:01:42 crc kubenswrapper[4702]: I1203 12:01:42.540356 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71df7a62-6ca2-4593-a958-74e48a83b47c-catalog-content\") pod \"community-operators-lznqq\" (UID: \"71df7a62-6ca2-4593-a958-74e48a83b47c\") " pod="openshift-marketplace/community-operators-lznqq" Dec 03 12:01:42 crc kubenswrapper[4702]: I1203 12:01:42.541069 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71df7a62-6ca2-4593-a958-74e48a83b47c-utilities\") pod \"community-operators-lznqq\" (UID: \"71df7a62-6ca2-4593-a958-74e48a83b47c\") " pod="openshift-marketplace/community-operators-lznqq" Dec 03 12:01:42 crc kubenswrapper[4702]: I1203 12:01:42.541089 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71df7a62-6ca2-4593-a958-74e48a83b47c-catalog-content\") pod \"community-operators-lznqq\" (UID: \"71df7a62-6ca2-4593-a958-74e48a83b47c\") " pod="openshift-marketplace/community-operators-lznqq" Dec 03 12:01:42 crc kubenswrapper[4702]: I1203 12:01:42.571909 4702 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jf6vb\" (UniqueName: \"kubernetes.io/projected/71df7a62-6ca2-4593-a958-74e48a83b47c-kube-api-access-jf6vb\") pod \"community-operators-lznqq\" (UID: \"71df7a62-6ca2-4593-a958-74e48a83b47c\") " pod="openshift-marketplace/community-operators-lznqq" Dec 03 12:01:42 crc kubenswrapper[4702]: I1203 12:01:42.740853 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lznqq" Dec 03 12:01:43 crc kubenswrapper[4702]: I1203 12:01:43.603373 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lznqq"] Dec 03 12:01:44 crc kubenswrapper[4702]: I1203 12:01:44.485869 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lznqq" event={"ID":"71df7a62-6ca2-4593-a958-74e48a83b47c","Type":"ContainerStarted","Data":"c08a1aad5c42b1f07c512541b3d02d03dcbc6f96f2fef26cd983bf7ac4d6c1f9"} Dec 03 12:01:45 crc kubenswrapper[4702]: I1203 12:01:45.499593 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lznqq" event={"ID":"71df7a62-6ca2-4593-a958-74e48a83b47c","Type":"ContainerStarted","Data":"5dd42eb6047867d2daa1c5dcbd1542690b5247d7911e866fd1caecf2e5e203c3"} Dec 03 12:01:46 crc kubenswrapper[4702]: I1203 12:01:46.656792 4702 generic.go:334] "Generic (PLEG): container finished" podID="71df7a62-6ca2-4593-a958-74e48a83b47c" containerID="5dd42eb6047867d2daa1c5dcbd1542690b5247d7911e866fd1caecf2e5e203c3" exitCode=0 Dec 03 12:01:46 crc kubenswrapper[4702]: I1203 12:01:46.656851 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lznqq" event={"ID":"71df7a62-6ca2-4593-a958-74e48a83b47c","Type":"ContainerDied","Data":"5dd42eb6047867d2daa1c5dcbd1542690b5247d7911e866fd1caecf2e5e203c3"} Dec 03 12:01:53 crc kubenswrapper[4702]: I1203 12:01:53.998986 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lznqq" event={"ID":"71df7a62-6ca2-4593-a958-74e48a83b47c","Type":"ContainerStarted","Data":"e41cace7e5007b00f7a5daac67b18510a2521d457ee1bfc9edb3218963f5dbc3"} Dec 03 12:02:03 crc kubenswrapper[4702]: E1203 12:02:03.123541 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71df7a62_6ca2_4593_a958_74e48a83b47c.slice/crio-conmon-e41cace7e5007b00f7a5daac67b18510a2521d457ee1bfc9edb3218963f5dbc3.scope\": RecentStats: unable to find data in memory cache]" Dec 03 12:02:03 crc kubenswrapper[4702]: I1203 12:02:03.144461 4702 generic.go:334] "Generic (PLEG): container finished" podID="71df7a62-6ca2-4593-a958-74e48a83b47c" containerID="e41cace7e5007b00f7a5daac67b18510a2521d457ee1bfc9edb3218963f5dbc3" exitCode=0 Dec 03 12:02:03 crc kubenswrapper[4702]: I1203 12:02:03.144525 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lznqq" event={"ID":"71df7a62-6ca2-4593-a958-74e48a83b47c","Type":"ContainerDied","Data":"e41cace7e5007b00f7a5daac67b18510a2521d457ee1bfc9edb3218963f5dbc3"} Dec 03 12:02:06 crc kubenswrapper[4702]: I1203 12:02:06.233299 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lznqq" event={"ID":"71df7a62-6ca2-4593-a958-74e48a83b47c","Type":"ContainerStarted","Data":"f2edffd547ad62045aa743468741d71fe084429169d4d21d40e90f26447eb80f"} Dec 03 
12:02:06 crc kubenswrapper[4702]: I1203 12:02:06.272184 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lznqq" podStartSLOduration=6.692818088 podStartE2EDuration="24.272144685s" podCreationTimestamp="2025-12-03 12:01:42 +0000 UTC" firstStartedPulling="2025-12-03 12:01:47.673045854 +0000 UTC m=+3491.508974318" lastFinishedPulling="2025-12-03 12:02:05.252372451 +0000 UTC m=+3509.088300915" observedRunningTime="2025-12-03 12:02:06.258496306 +0000 UTC m=+3510.094424790" watchObservedRunningTime="2025-12-03 12:02:06.272144685 +0000 UTC m=+3510.108073159" Dec 03 12:02:12 crc kubenswrapper[4702]: I1203 12:02:12.742370 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lznqq" Dec 03 12:02:12 crc kubenswrapper[4702]: I1203 12:02:12.743029 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lznqq" Dec 03 12:02:12 crc kubenswrapper[4702]: I1203 12:02:12.814132 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lznqq" Dec 03 12:02:13 crc kubenswrapper[4702]: I1203 12:02:13.655217 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lznqq" Dec 03 12:02:13 crc kubenswrapper[4702]: I1203 12:02:13.786368 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lznqq"] Dec 03 12:02:15 crc kubenswrapper[4702]: I1203 12:02:15.621950 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lznqq" podUID="71df7a62-6ca2-4593-a958-74e48a83b47c" containerName="registry-server" containerID="cri-o://f2edffd547ad62045aa743468741d71fe084429169d4d21d40e90f26447eb80f" gracePeriod=2 Dec 03 12:02:16 crc kubenswrapper[4702]: I1203 12:02:16.636360 4702 generic.go:334] "Generic (PLEG): container finished" podID="71df7a62-6ca2-4593-a958-74e48a83b47c" containerID="f2edffd547ad62045aa743468741d71fe084429169d4d21d40e90f26447eb80f" exitCode=0 Dec 03 12:02:16 crc kubenswrapper[4702]: I1203 12:02:16.636454 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lznqq" event={"ID":"71df7a62-6ca2-4593-a958-74e48a83b47c","Type":"ContainerDied","Data":"f2edffd547ad62045aa743468741d71fe084429169d4d21d40e90f26447eb80f"} Dec 03 12:02:16 crc kubenswrapper[4702]: I1203 12:02:16.636677 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lznqq" event={"ID":"71df7a62-6ca2-4593-a958-74e48a83b47c","Type":"ContainerDied","Data":"c08a1aad5c42b1f07c512541b3d02d03dcbc6f96f2fef26cd983bf7ac4d6c1f9"} Dec 03 12:02:16 crc kubenswrapper[4702]: I1203 12:02:16.636693 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c08a1aad5c42b1f07c512541b3d02d03dcbc6f96f2fef26cd983bf7ac4d6c1f9" Dec 03 12:02:16 crc kubenswrapper[4702]: I1203 12:02:16.745067 4702 util.go:48] "No ready sandbox for pod can be found. 
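Timestamps in these entries carry two clocks: the wall-clock reading and, for Go time.Time values, a monotonic offset printed as the m=+… suffix (e.g. m=+3510.108073159, seconds since the kubelet process started), which keeps before/after comparisons reliable across wall-clock adjustments. A small sketch for extracting both from fields like watchObservedRunningTime above (the regex is illustrative, not a format guarantee):

    import re

    FIELD = ('watchObservedRunningTime="2025-12-03 12:02:06.272144685 '
             '+0000 UTC m=+3510.108073159"')

    # Wall-clock part plus the Go monotonic-clock offset (the m=+ suffix).
    PAT = re.compile(
        r'(?P<wall>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+) '
        r'\+0000 UTC m=\+(?P<mono>[\d.]+)')

    m = PAT.search(FIELD)
    if m:
        print(m.group("wall"), float(m.group("mono")))
    # -> 2025-12-03 12:02:06.272144685 3510.108073159
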
Need to start a new one" pod="openshift-marketplace/community-operators-lznqq" Dec 03 12:02:16 crc kubenswrapper[4702]: I1203 12:02:16.784792 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71df7a62-6ca2-4593-a958-74e48a83b47c-utilities\") pod \"71df7a62-6ca2-4593-a958-74e48a83b47c\" (UID: \"71df7a62-6ca2-4593-a958-74e48a83b47c\") " Dec 03 12:02:16 crc kubenswrapper[4702]: I1203 12:02:16.784892 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71df7a62-6ca2-4593-a958-74e48a83b47c-catalog-content\") pod \"71df7a62-6ca2-4593-a958-74e48a83b47c\" (UID: \"71df7a62-6ca2-4593-a958-74e48a83b47c\") " Dec 03 12:02:16 crc kubenswrapper[4702]: I1203 12:02:16.785053 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf6vb\" (UniqueName: \"kubernetes.io/projected/71df7a62-6ca2-4593-a958-74e48a83b47c-kube-api-access-jf6vb\") pod \"71df7a62-6ca2-4593-a958-74e48a83b47c\" (UID: \"71df7a62-6ca2-4593-a958-74e48a83b47c\") " Dec 03 12:02:16 crc kubenswrapper[4702]: I1203 12:02:16.785831 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71df7a62-6ca2-4593-a958-74e48a83b47c-utilities" (OuterVolumeSpecName: "utilities") pod "71df7a62-6ca2-4593-a958-74e48a83b47c" (UID: "71df7a62-6ca2-4593-a958-74e48a83b47c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:02:16 crc kubenswrapper[4702]: I1203 12:02:16.794375 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71df7a62-6ca2-4593-a958-74e48a83b47c-kube-api-access-jf6vb" (OuterVolumeSpecName: "kube-api-access-jf6vb") pod "71df7a62-6ca2-4593-a958-74e48a83b47c" (UID: "71df7a62-6ca2-4593-a958-74e48a83b47c"). InnerVolumeSpecName "kube-api-access-jf6vb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:02:16 crc kubenswrapper[4702]: I1203 12:02:16.857064 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71df7a62-6ca2-4593-a958-74e48a83b47c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71df7a62-6ca2-4593-a958-74e48a83b47c" (UID: "71df7a62-6ca2-4593-a958-74e48a83b47c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:02:16 crc kubenswrapper[4702]: I1203 12:02:16.888327 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf6vb\" (UniqueName: \"kubernetes.io/projected/71df7a62-6ca2-4593-a958-74e48a83b47c-kube-api-access-jf6vb\") on node \"crc\" DevicePath \"\"" Dec 03 12:02:16 crc kubenswrapper[4702]: I1203 12:02:16.888380 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71df7a62-6ca2-4593-a958-74e48a83b47c-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:02:16 crc kubenswrapper[4702]: I1203 12:02:16.888395 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71df7a62-6ca2-4593-a958-74e48a83b47c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:02:17 crc kubenswrapper[4702]: I1203 12:02:17.648111 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lznqq" Dec 03 12:02:17 crc kubenswrapper[4702]: I1203 12:02:17.681270 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lznqq"] Dec 03 12:02:17 crc kubenswrapper[4702]: I1203 12:02:17.701219 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lznqq"] Dec 03 12:02:18 crc kubenswrapper[4702]: I1203 12:02:18.947282 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71df7a62-6ca2-4593-a958-74e48a83b47c" path="/var/lib/kubelet/pods/71df7a62-6ca2-4593-a958-74e48a83b47c/volumes" Dec 03 12:02:59 crc kubenswrapper[4702]: I1203 12:02:59.293331 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t4r76"] Dec 03 12:02:59 crc kubenswrapper[4702]: E1203 12:02:59.294697 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71df7a62-6ca2-4593-a958-74e48a83b47c" containerName="extract-utilities" Dec 03 12:02:59 crc kubenswrapper[4702]: I1203 12:02:59.294719 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="71df7a62-6ca2-4593-a958-74e48a83b47c" containerName="extract-utilities" Dec 03 12:02:59 crc kubenswrapper[4702]: E1203 12:02:59.294749 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71df7a62-6ca2-4593-a958-74e48a83b47c" containerName="extract-content" Dec 03 12:02:59 crc kubenswrapper[4702]: I1203 12:02:59.294778 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="71df7a62-6ca2-4593-a958-74e48a83b47c" containerName="extract-content" Dec 03 12:02:59 crc kubenswrapper[4702]: E1203 12:02:59.294797 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71df7a62-6ca2-4593-a958-74e48a83b47c" containerName="registry-server" Dec 03 12:02:59 crc kubenswrapper[4702]: I1203 12:02:59.294804 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="71df7a62-6ca2-4593-a958-74e48a83b47c" containerName="registry-server" Dec 03 12:02:59 crc kubenswrapper[4702]: I1203 12:02:59.295134 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="71df7a62-6ca2-4593-a958-74e48a83b47c" containerName="registry-server" Dec 03 12:02:59 crc kubenswrapper[4702]: I1203 12:02:59.297692 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t4r76" Dec 03 12:02:59 crc kubenswrapper[4702]: I1203 12:02:59.309778 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t4r76"] Dec 03 12:02:59 crc kubenswrapper[4702]: I1203 12:02:59.402278 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96d8b34a-3ab5-424d-8836-f4ad4290d38b-utilities\") pod \"certified-operators-t4r76\" (UID: \"96d8b34a-3ab5-424d-8836-f4ad4290d38b\") " pod="openshift-marketplace/certified-operators-t4r76" Dec 03 12:02:59 crc kubenswrapper[4702]: I1203 12:02:59.402881 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh5qm\" (UniqueName: \"kubernetes.io/projected/96d8b34a-3ab5-424d-8836-f4ad4290d38b-kube-api-access-vh5qm\") pod \"certified-operators-t4r76\" (UID: \"96d8b34a-3ab5-424d-8836-f4ad4290d38b\") " pod="openshift-marketplace/certified-operators-t4r76" Dec 03 12:02:59 crc kubenswrapper[4702]: I1203 12:02:59.403266 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96d8b34a-3ab5-424d-8836-f4ad4290d38b-catalog-content\") pod \"certified-operators-t4r76\" (UID: \"96d8b34a-3ab5-424d-8836-f4ad4290d38b\") " pod="openshift-marketplace/certified-operators-t4r76" Dec 03 12:02:59 crc kubenswrapper[4702]: I1203 12:02:59.506241 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96d8b34a-3ab5-424d-8836-f4ad4290d38b-utilities\") pod \"certified-operators-t4r76\" (UID: \"96d8b34a-3ab5-424d-8836-f4ad4290d38b\") " pod="openshift-marketplace/certified-operators-t4r76" Dec 03 12:02:59 crc kubenswrapper[4702]: I1203 12:02:59.506379 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh5qm\" (UniqueName: \"kubernetes.io/projected/96d8b34a-3ab5-424d-8836-f4ad4290d38b-kube-api-access-vh5qm\") pod \"certified-operators-t4r76\" (UID: \"96d8b34a-3ab5-424d-8836-f4ad4290d38b\") " pod="openshift-marketplace/certified-operators-t4r76" Dec 03 12:02:59 crc kubenswrapper[4702]: I1203 12:02:59.506482 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96d8b34a-3ab5-424d-8836-f4ad4290d38b-catalog-content\") pod \"certified-operators-t4r76\" (UID: \"96d8b34a-3ab5-424d-8836-f4ad4290d38b\") " pod="openshift-marketplace/certified-operators-t4r76" Dec 03 12:02:59 crc kubenswrapper[4702]: I1203 12:02:59.508049 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96d8b34a-3ab5-424d-8836-f4ad4290d38b-catalog-content\") pod \"certified-operators-t4r76\" (UID: \"96d8b34a-3ab5-424d-8836-f4ad4290d38b\") " pod="openshift-marketplace/certified-operators-t4r76" Dec 03 12:02:59 crc kubenswrapper[4702]: I1203 12:02:59.509270 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96d8b34a-3ab5-424d-8836-f4ad4290d38b-utilities\") pod \"certified-operators-t4r76\" (UID: \"96d8b34a-3ab5-424d-8836-f4ad4290d38b\") " pod="openshift-marketplace/certified-operators-t4r76" Dec 03 12:02:59 crc kubenswrapper[4702]: I1203 12:02:59.539699 4702 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vh5qm\" (UniqueName: \"kubernetes.io/projected/96d8b34a-3ab5-424d-8836-f4ad4290d38b-kube-api-access-vh5qm\") pod \"certified-operators-t4r76\" (UID: \"96d8b34a-3ab5-424d-8836-f4ad4290d38b\") " pod="openshift-marketplace/certified-operators-t4r76" Dec 03 12:02:59 crc kubenswrapper[4702]: I1203 12:02:59.629081 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4r76" Dec 03 12:03:00 crc kubenswrapper[4702]: I1203 12:03:00.220655 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t4r76"] Dec 03 12:03:00 crc kubenswrapper[4702]: W1203 12:03:00.225647 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96d8b34a_3ab5_424d_8836_f4ad4290d38b.slice/crio-798e8b129a2b1824ffc2258d5179fc49e3cbb7a981dc88384fde9090d94058ac WatchSource:0}: Error finding container 798e8b129a2b1824ffc2258d5179fc49e3cbb7a981dc88384fde9090d94058ac: Status 404 returned error can't find the container with id 798e8b129a2b1824ffc2258d5179fc49e3cbb7a981dc88384fde9090d94058ac Dec 03 12:03:00 crc kubenswrapper[4702]: I1203 12:03:00.553734 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4r76" event={"ID":"96d8b34a-3ab5-424d-8836-f4ad4290d38b","Type":"ContainerStarted","Data":"798e8b129a2b1824ffc2258d5179fc49e3cbb7a981dc88384fde9090d94058ac"} Dec 03 12:03:01 crc kubenswrapper[4702]: I1203 12:03:01.570061 4702 generic.go:334] "Generic (PLEG): container finished" podID="96d8b34a-3ab5-424d-8836-f4ad4290d38b" containerID="b15694a26224934029556d1742771bf064589fe153145d7005377afe6fe56f91" exitCode=0 Dec 03 12:03:01 crc kubenswrapper[4702]: I1203 12:03:01.570333 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4r76" event={"ID":"96d8b34a-3ab5-424d-8836-f4ad4290d38b","Type":"ContainerDied","Data":"b15694a26224934029556d1742771bf064589fe153145d7005377afe6fe56f91"} Dec 03 12:03:08 crc kubenswrapper[4702]: I1203 12:03:08.648099 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4r76" event={"ID":"96d8b34a-3ab5-424d-8836-f4ad4290d38b","Type":"ContainerStarted","Data":"5feb03eabf3edcb70e90fc4e48863823d055b893cd3bffd1e9ee5b181acc131b"} Dec 03 12:03:08 crc kubenswrapper[4702]: I1203 12:03:08.650418 4702 generic.go:334] "Generic (PLEG): container finished" podID="87bf58e6-ad4c-4d7a-94e1-0c1e2715a386" containerID="6ce6fa0bae740c4b3b466d63ac61092ed8b90f05a6b171cc2afc769908aafb7b" exitCode=0 Dec 03 12:03:08 crc kubenswrapper[4702]: I1203 12:03:08.650458 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" event={"ID":"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386","Type":"ContainerDied","Data":"6ce6fa0bae740c4b3b466d63ac61092ed8b90f05a6b171cc2afc769908aafb7b"} Dec 03 12:03:09 crc kubenswrapper[4702]: I1203 12:03:09.664991 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4r76" event={"ID":"96d8b34a-3ab5-424d-8836-f4ad4290d38b","Type":"ContainerDied","Data":"5feb03eabf3edcb70e90fc4e48863823d055b893cd3bffd1e9ee5b181acc131b"} Dec 03 12:03:09 crc kubenswrapper[4702]: I1203 12:03:09.664838 4702 generic.go:334] "Generic (PLEG): container finished" podID="96d8b34a-3ab5-424d-8836-f4ad4290d38b" 
containerID="5feb03eabf3edcb70e90fc4e48863823d055b893cd3bffd1e9ee5b181acc131b" exitCode=0 Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.178795 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.310555 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-inventory\") pod \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.310652 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ceilometer-ipmi-config-data-0\") pod \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.310725 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ssh-key\") pod \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.310811 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ceilometer-ipmi-config-data-2\") pod \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.310955 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-telemetry-power-monitoring-combined-ca-bundle\") pod \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.311082 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvj2s\" (UniqueName: \"kubernetes.io/projected/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-kube-api-access-tvj2s\") pod \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.311246 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ceilometer-ipmi-config-data-1\") pod \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\" (UID: \"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386\") " Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.318436 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-kube-api-access-tvj2s" (OuterVolumeSpecName: "kube-api-access-tvj2s") pod "87bf58e6-ad4c-4d7a-94e1-0c1e2715a386" (UID: "87bf58e6-ad4c-4d7a-94e1-0c1e2715a386"). InnerVolumeSpecName "kube-api-access-tvj2s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.319482 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "87bf58e6-ad4c-4d7a-94e1-0c1e2715a386" (UID: "87bf58e6-ad4c-4d7a-94e1-0c1e2715a386"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.344007 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "87bf58e6-ad4c-4d7a-94e1-0c1e2715a386" (UID: "87bf58e6-ad4c-4d7a-94e1-0c1e2715a386"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.351079 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "87bf58e6-ad4c-4d7a-94e1-0c1e2715a386" (UID: "87bf58e6-ad4c-4d7a-94e1-0c1e2715a386"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.351374 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-inventory" (OuterVolumeSpecName: "inventory") pod "87bf58e6-ad4c-4d7a-94e1-0c1e2715a386" (UID: "87bf58e6-ad4c-4d7a-94e1-0c1e2715a386"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.356401 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "87bf58e6-ad4c-4d7a-94e1-0c1e2715a386" (UID: "87bf58e6-ad4c-4d7a-94e1-0c1e2715a386"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.364876 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "87bf58e6-ad4c-4d7a-94e1-0c1e2715a386" (UID: "87bf58e6-ad4c-4d7a-94e1-0c1e2715a386"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.414847 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvj2s\" (UniqueName: \"kubernetes.io/projected/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-kube-api-access-tvj2s\") on node \"crc\" DevicePath \"\"" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.414896 4702 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.414908 4702 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.414920 4702 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.414929 4702 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.414939 4702 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.414949 4702 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87bf58e6-ad4c-4d7a-94e1-0c1e2715a386-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.680075 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" event={"ID":"87bf58e6-ad4c-4d7a-94e1-0c1e2715a386","Type":"ContainerDied","Data":"19c366272cbcecb89df1d9cbabdf18cbde73206ee2b0731222632abfde3aa4e5"} Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.680121 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19c366272cbcecb89df1d9cbabdf18cbde73206ee2b0731222632abfde3aa4e5" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.680137 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.846489 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf"] Dec 03 12:03:10 crc kubenswrapper[4702]: E1203 12:03:10.847582 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87bf58e6-ad4c-4d7a-94e1-0c1e2715a386" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.847604 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="87bf58e6-ad4c-4d7a-94e1-0c1e2715a386" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.847901 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="87bf58e6-ad4c-4d7a-94e1-0c1e2715a386" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.849399 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.852897 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.853175 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.853250 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.853382 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.854702 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-zmjkp" Dec 03 12:03:10 crc kubenswrapper[4702]: I1203 12:03:10.865032 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf"] Dec 03 12:03:11 crc kubenswrapper[4702]: I1203 12:03:11.045712 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-llqpf\" (UID: \"240da10f-8cde-4000-a815-93bdeeb2af78\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" Dec 03 12:03:11 crc kubenswrapper[4702]: I1203 12:03:11.046398 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlzrs\" (UniqueName: \"kubernetes.io/projected/240da10f-8cde-4000-a815-93bdeeb2af78-kube-api-access-mlzrs\") pod \"logging-edpm-deployment-openstack-edpm-ipam-llqpf\" (UID: \"240da10f-8cde-4000-a815-93bdeeb2af78\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" Dec 03 12:03:11 crc kubenswrapper[4702]: I1203 12:03:11.046582 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-inventory\") pod 
\"logging-edpm-deployment-openstack-edpm-ipam-llqpf\" (UID: \"240da10f-8cde-4000-a815-93bdeeb2af78\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" Dec 03 12:03:11 crc kubenswrapper[4702]: I1203 12:03:11.046703 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-llqpf\" (UID: \"240da10f-8cde-4000-a815-93bdeeb2af78\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" Dec 03 12:03:11 crc kubenswrapper[4702]: I1203 12:03:11.047057 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-llqpf\" (UID: \"240da10f-8cde-4000-a815-93bdeeb2af78\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" Dec 03 12:03:11 crc kubenswrapper[4702]: I1203 12:03:11.150517 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-llqpf\" (UID: \"240da10f-8cde-4000-a815-93bdeeb2af78\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" Dec 03 12:03:11 crc kubenswrapper[4702]: I1203 12:03:11.150652 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-llqpf\" (UID: \"240da10f-8cde-4000-a815-93bdeeb2af78\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" Dec 03 12:03:11 crc kubenswrapper[4702]: I1203 12:03:11.150795 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlzrs\" (UniqueName: \"kubernetes.io/projected/240da10f-8cde-4000-a815-93bdeeb2af78-kube-api-access-mlzrs\") pod \"logging-edpm-deployment-openstack-edpm-ipam-llqpf\" (UID: \"240da10f-8cde-4000-a815-93bdeeb2af78\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" Dec 03 12:03:11 crc kubenswrapper[4702]: I1203 12:03:11.150854 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-llqpf\" (UID: \"240da10f-8cde-4000-a815-93bdeeb2af78\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" Dec 03 12:03:11 crc kubenswrapper[4702]: I1203 12:03:11.150887 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-ssh-key\") pod \"logging-edpm-deployment-openstack-edpm-ipam-llqpf\" (UID: \"240da10f-8cde-4000-a815-93bdeeb2af78\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" Dec 03 12:03:11 crc kubenswrapper[4702]: I1203 12:03:11.156788 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-ssh-key\") pod 
\"logging-edpm-deployment-openstack-edpm-ipam-llqpf\" (UID: \"240da10f-8cde-4000-a815-93bdeeb2af78\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" Dec 03 12:03:11 crc kubenswrapper[4702]: I1203 12:03:11.156789 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-llqpf\" (UID: \"240da10f-8cde-4000-a815-93bdeeb2af78\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" Dec 03 12:03:11 crc kubenswrapper[4702]: I1203 12:03:11.159060 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-llqpf\" (UID: \"240da10f-8cde-4000-a815-93bdeeb2af78\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" Dec 03 12:03:11 crc kubenswrapper[4702]: I1203 12:03:11.160009 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-llqpf\" (UID: \"240da10f-8cde-4000-a815-93bdeeb2af78\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" Dec 03 12:03:11 crc kubenswrapper[4702]: I1203 12:03:11.170925 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlzrs\" (UniqueName: \"kubernetes.io/projected/240da10f-8cde-4000-a815-93bdeeb2af78-kube-api-access-mlzrs\") pod \"logging-edpm-deployment-openstack-edpm-ipam-llqpf\" (UID: \"240da10f-8cde-4000-a815-93bdeeb2af78\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" Dec 03 12:03:11 crc kubenswrapper[4702]: I1203 12:03:11.181886 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" Dec 03 12:03:11 crc kubenswrapper[4702]: I1203 12:03:11.885705 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf"] Dec 03 12:03:11 crc kubenswrapper[4702]: W1203 12:03:11.892991 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod240da10f_8cde_4000_a815_93bdeeb2af78.slice/crio-9e25f003232fd5aea5601d0312d3e909c3057d5f451d8846327edd84d5dabec0 WatchSource:0}: Error finding container 9e25f003232fd5aea5601d0312d3e909c3057d5f451d8846327edd84d5dabec0: Status 404 returned error can't find the container with id 9e25f003232fd5aea5601d0312d3e909c3057d5f451d8846327edd84d5dabec0 Dec 03 12:03:12 crc kubenswrapper[4702]: I1203 12:03:12.711674 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" event={"ID":"240da10f-8cde-4000-a815-93bdeeb2af78","Type":"ContainerStarted","Data":"9e25f003232fd5aea5601d0312d3e909c3057d5f451d8846327edd84d5dabec0"} Dec 03 12:03:15 crc kubenswrapper[4702]: I1203 12:03:15.756406 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4r76" event={"ID":"96d8b34a-3ab5-424d-8836-f4ad4290d38b","Type":"ContainerStarted","Data":"e607e52aba0d05f04ea8e901f10d9d56738495d002949cd9fdc61648e165bf79"} Dec 03 12:03:15 crc kubenswrapper[4702]: I1203 12:03:15.759363 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" event={"ID":"240da10f-8cde-4000-a815-93bdeeb2af78","Type":"ContainerStarted","Data":"319b68d7334d03ac1a57af972c93b4fa749ff36ab5b63db6f47555c163536edd"} Dec 03 12:03:15 crc kubenswrapper[4702]: I1203 12:03:15.786742 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t4r76" podStartSLOduration=5.643830405 podStartE2EDuration="16.786708026s" podCreationTimestamp="2025-12-03 12:02:59 +0000 UTC" firstStartedPulling="2025-12-03 12:03:01.572990015 +0000 UTC m=+3565.408918479" lastFinishedPulling="2025-12-03 12:03:12.715867636 +0000 UTC m=+3576.551796100" observedRunningTime="2025-12-03 12:03:15.77948615 +0000 UTC m=+3579.615414624" watchObservedRunningTime="2025-12-03 12:03:15.786708026 +0000 UTC m=+3579.622636500" Dec 03 12:03:15 crc kubenswrapper[4702]: I1203 12:03:15.811500 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" podStartSLOduration=2.6114707040000003 podStartE2EDuration="5.811471292s" podCreationTimestamp="2025-12-03 12:03:10 +0000 UTC" firstStartedPulling="2025-12-03 12:03:11.895351315 +0000 UTC m=+3575.731279779" lastFinishedPulling="2025-12-03 12:03:15.095351903 +0000 UTC m=+3578.931280367" observedRunningTime="2025-12-03 12:03:15.802257579 +0000 UTC m=+3579.638186053" watchObservedRunningTime="2025-12-03 12:03:15.811471292 +0000 UTC m=+3579.647399756" Dec 03 12:03:19 crc kubenswrapper[4702]: I1203 12:03:19.629649 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t4r76" Dec 03 12:03:19 crc kubenswrapper[4702]: I1203 12:03:19.630259 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t4r76" Dec 03 12:03:19 crc kubenswrapper[4702]: I1203 12:03:19.682127 4702 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t4r76" Dec 03 12:03:19 crc kubenswrapper[4702]: I1203 12:03:19.861236 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t4r76" Dec 03 12:03:19 crc kubenswrapper[4702]: I1203 12:03:19.931515 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t4r76"] Dec 03 12:03:21 crc kubenswrapper[4702]: I1203 12:03:21.838808 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t4r76" podUID="96d8b34a-3ab5-424d-8836-f4ad4290d38b" containerName="registry-server" containerID="cri-o://e607e52aba0d05f04ea8e901f10d9d56738495d002949cd9fdc61648e165bf79" gracePeriod=2 Dec 03 12:03:23 crc kubenswrapper[4702]: I1203 12:03:23.865022 4702 generic.go:334] "Generic (PLEG): container finished" podID="96d8b34a-3ab5-424d-8836-f4ad4290d38b" containerID="e607e52aba0d05f04ea8e901f10d9d56738495d002949cd9fdc61648e165bf79" exitCode=0 Dec 03 12:03:23 crc kubenswrapper[4702]: I1203 12:03:23.865123 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4r76" event={"ID":"96d8b34a-3ab5-424d-8836-f4ad4290d38b","Type":"ContainerDied","Data":"e607e52aba0d05f04ea8e901f10d9d56738495d002949cd9fdc61648e165bf79"} Dec 03 12:03:25 crc kubenswrapper[4702]: I1203 12:03:24.613234 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4r76" Dec 03 12:03:25 crc kubenswrapper[4702]: I1203 12:03:24.699078 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96d8b34a-3ab5-424d-8836-f4ad4290d38b-catalog-content\") pod \"96d8b34a-3ab5-424d-8836-f4ad4290d38b\" (UID: \"96d8b34a-3ab5-424d-8836-f4ad4290d38b\") " Dec 03 12:03:25 crc kubenswrapper[4702]: I1203 12:03:24.699152 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96d8b34a-3ab5-424d-8836-f4ad4290d38b-utilities\") pod \"96d8b34a-3ab5-424d-8836-f4ad4290d38b\" (UID: \"96d8b34a-3ab5-424d-8836-f4ad4290d38b\") " Dec 03 12:03:25 crc kubenswrapper[4702]: I1203 12:03:24.699232 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh5qm\" (UniqueName: \"kubernetes.io/projected/96d8b34a-3ab5-424d-8836-f4ad4290d38b-kube-api-access-vh5qm\") pod \"96d8b34a-3ab5-424d-8836-f4ad4290d38b\" (UID: \"96d8b34a-3ab5-424d-8836-f4ad4290d38b\") " Dec 03 12:03:25 crc kubenswrapper[4702]: I1203 12:03:24.700644 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96d8b34a-3ab5-424d-8836-f4ad4290d38b-utilities" (OuterVolumeSpecName: "utilities") pod "96d8b34a-3ab5-424d-8836-f4ad4290d38b" (UID: "96d8b34a-3ab5-424d-8836-f4ad4290d38b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:03:25 crc kubenswrapper[4702]: I1203 12:03:24.708049 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96d8b34a-3ab5-424d-8836-f4ad4290d38b-kube-api-access-vh5qm" (OuterVolumeSpecName: "kube-api-access-vh5qm") pod "96d8b34a-3ab5-424d-8836-f4ad4290d38b" (UID: "96d8b34a-3ab5-424d-8836-f4ad4290d38b"). InnerVolumeSpecName "kube-api-access-vh5qm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:03:25 crc kubenswrapper[4702]: I1203 12:03:24.790021 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96d8b34a-3ab5-424d-8836-f4ad4290d38b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96d8b34a-3ab5-424d-8836-f4ad4290d38b" (UID: "96d8b34a-3ab5-424d-8836-f4ad4290d38b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:03:25 crc kubenswrapper[4702]: I1203 12:03:24.801849 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96d8b34a-3ab5-424d-8836-f4ad4290d38b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:03:25 crc kubenswrapper[4702]: I1203 12:03:24.801887 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96d8b34a-3ab5-424d-8836-f4ad4290d38b-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:03:25 crc kubenswrapper[4702]: I1203 12:03:24.801898 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh5qm\" (UniqueName: \"kubernetes.io/projected/96d8b34a-3ab5-424d-8836-f4ad4290d38b-kube-api-access-vh5qm\") on node \"crc\" DevicePath \"\"" Dec 03 12:03:25 crc kubenswrapper[4702]: I1203 12:03:24.880541 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t4r76" event={"ID":"96d8b34a-3ab5-424d-8836-f4ad4290d38b","Type":"ContainerDied","Data":"798e8b129a2b1824ffc2258d5179fc49e3cbb7a981dc88384fde9090d94058ac"} Dec 03 12:03:25 crc kubenswrapper[4702]: I1203 12:03:24.880602 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t4r76" Dec 03 12:03:25 crc kubenswrapper[4702]: I1203 12:03:24.880614 4702 scope.go:117] "RemoveContainer" containerID="e607e52aba0d05f04ea8e901f10d9d56738495d002949cd9fdc61648e165bf79" Dec 03 12:03:25 crc kubenswrapper[4702]: I1203 12:03:24.903619 4702 scope.go:117] "RemoveContainer" containerID="5feb03eabf3edcb70e90fc4e48863823d055b893cd3bffd1e9ee5b181acc131b" Dec 03 12:03:25 crc kubenswrapper[4702]: I1203 12:03:24.927816 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t4r76"] Dec 03 12:03:25 crc kubenswrapper[4702]: I1203 12:03:24.944374 4702 scope.go:117] "RemoveContainer" containerID="b15694a26224934029556d1742771bf064589fe153145d7005377afe6fe56f91" Dec 03 12:03:25 crc kubenswrapper[4702]: I1203 12:03:24.956063 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t4r76"] Dec 03 12:03:26 crc kubenswrapper[4702]: I1203 12:03:26.945409 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96d8b34a-3ab5-424d-8836-f4ad4290d38b" path="/var/lib/kubelet/pods/96d8b34a-3ab5-424d-8836-f4ad4290d38b/volumes" Dec 03 12:03:31 crc kubenswrapper[4702]: I1203 12:03:31.988350 4702 generic.go:334] "Generic (PLEG): container finished" podID="240da10f-8cde-4000-a815-93bdeeb2af78" containerID="319b68d7334d03ac1a57af972c93b4fa749ff36ab5b63db6f47555c163536edd" exitCode=0 Dec 03 12:03:31 crc kubenswrapper[4702]: I1203 12:03:31.988461 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" event={"ID":"240da10f-8cde-4000-a815-93bdeeb2af78","Type":"ContainerDied","Data":"319b68d7334d03ac1a57af972c93b4fa749ff36ab5b63db6f47555c163536edd"} 
Dec 03 12:03:33 crc kubenswrapper[4702]: I1203 12:03:33.565616 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" Dec 03 12:03:33 crc kubenswrapper[4702]: I1203 12:03:33.762314 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-ssh-key\") pod \"240da10f-8cde-4000-a815-93bdeeb2af78\" (UID: \"240da10f-8cde-4000-a815-93bdeeb2af78\") " Dec 03 12:03:33 crc kubenswrapper[4702]: I1203 12:03:33.762736 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-inventory\") pod \"240da10f-8cde-4000-a815-93bdeeb2af78\" (UID: \"240da10f-8cde-4000-a815-93bdeeb2af78\") " Dec 03 12:03:33 crc kubenswrapper[4702]: I1203 12:03:33.762909 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-logging-compute-config-data-1\") pod \"240da10f-8cde-4000-a815-93bdeeb2af78\" (UID: \"240da10f-8cde-4000-a815-93bdeeb2af78\") " Dec 03 12:03:33 crc kubenswrapper[4702]: I1203 12:03:33.763055 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlzrs\" (UniqueName: \"kubernetes.io/projected/240da10f-8cde-4000-a815-93bdeeb2af78-kube-api-access-mlzrs\") pod \"240da10f-8cde-4000-a815-93bdeeb2af78\" (UID: \"240da10f-8cde-4000-a815-93bdeeb2af78\") " Dec 03 12:03:33 crc kubenswrapper[4702]: I1203 12:03:33.763260 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-logging-compute-config-data-0\") pod \"240da10f-8cde-4000-a815-93bdeeb2af78\" (UID: \"240da10f-8cde-4000-a815-93bdeeb2af78\") " Dec 03 12:03:33 crc kubenswrapper[4702]: I1203 12:03:33.771089 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/240da10f-8cde-4000-a815-93bdeeb2af78-kube-api-access-mlzrs" (OuterVolumeSpecName: "kube-api-access-mlzrs") pod "240da10f-8cde-4000-a815-93bdeeb2af78" (UID: "240da10f-8cde-4000-a815-93bdeeb2af78"). InnerVolumeSpecName "kube-api-access-mlzrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:03:33 crc kubenswrapper[4702]: I1203 12:03:33.794911 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "240da10f-8cde-4000-a815-93bdeeb2af78" (UID: "240da10f-8cde-4000-a815-93bdeeb2af78"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:03:33 crc kubenswrapper[4702]: I1203 12:03:33.795145 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-inventory" (OuterVolumeSpecName: "inventory") pod "240da10f-8cde-4000-a815-93bdeeb2af78" (UID: "240da10f-8cde-4000-a815-93bdeeb2af78"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:03:33 crc kubenswrapper[4702]: I1203 12:03:33.809398 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "240da10f-8cde-4000-a815-93bdeeb2af78" (UID: "240da10f-8cde-4000-a815-93bdeeb2af78"). InnerVolumeSpecName "logging-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:03:33 crc kubenswrapper[4702]: I1203 12:03:33.810965 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "240da10f-8cde-4000-a815-93bdeeb2af78" (UID: "240da10f-8cde-4000-a815-93bdeeb2af78"). InnerVolumeSpecName "logging-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:03:33 crc kubenswrapper[4702]: I1203 12:03:33.868110 4702 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 12:03:33 crc kubenswrapper[4702]: I1203 12:03:33.868180 4702 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 12:03:33 crc kubenswrapper[4702]: I1203 12:03:33.868197 4702 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 03 12:03:33 crc kubenswrapper[4702]: I1203 12:03:33.868239 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlzrs\" (UniqueName: \"kubernetes.io/projected/240da10f-8cde-4000-a815-93bdeeb2af78-kube-api-access-mlzrs\") on node \"crc\" DevicePath \"\"" Dec 03 12:03:33 crc kubenswrapper[4702]: I1203 12:03:33.868261 4702 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/240da10f-8cde-4000-a815-93bdeeb2af78-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 03 12:03:34 crc kubenswrapper[4702]: I1203 12:03:34.013735 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" event={"ID":"240da10f-8cde-4000-a815-93bdeeb2af78","Type":"ContainerDied","Data":"9e25f003232fd5aea5601d0312d3e909c3057d5f451d8846327edd84d5dabec0"} Dec 03 12:03:34 crc kubenswrapper[4702]: I1203 12:03:34.013825 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-llqpf" Dec 03 12:03:34 crc kubenswrapper[4702]: I1203 12:03:34.013839 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e25f003232fd5aea5601d0312d3e909c3057d5f451d8846327edd84d5dabec0" Dec 03 12:03:55 crc kubenswrapper[4702]: I1203 12:03:55.908220 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:03:55 crc kubenswrapper[4702]: I1203 12:03:55.908906 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:04:25 crc kubenswrapper[4702]: I1203 12:04:25.907828 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:04:25 crc kubenswrapper[4702]: I1203 12:04:25.908697 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:04:55 crc kubenswrapper[4702]: I1203 12:04:55.907886 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:04:55 crc kubenswrapper[4702]: I1203 12:04:55.908490 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:04:55 crc kubenswrapper[4702]: I1203 12:04:55.908563 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 12:04:55 crc kubenswrapper[4702]: I1203 12:04:55.909664 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9e338b8fada393470fcaf18444666db86bba061698bf1ca05450134fe66b5a3f"} pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:04:55 crc kubenswrapper[4702]: I1203 12:04:55.909747 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" 
containerID="cri-o://9e338b8fada393470fcaf18444666db86bba061698bf1ca05450134fe66b5a3f" gracePeriod=600 Dec 03 12:04:57 crc kubenswrapper[4702]: I1203 12:04:57.255476 4702 generic.go:334] "Generic (PLEG): container finished" podID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerID="9e338b8fada393470fcaf18444666db86bba061698bf1ca05450134fe66b5a3f" exitCode=0 Dec 03 12:04:57 crc kubenswrapper[4702]: I1203 12:04:57.255542 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerDied","Data":"9e338b8fada393470fcaf18444666db86bba061698bf1ca05450134fe66b5a3f"} Dec 03 12:04:57 crc kubenswrapper[4702]: I1203 12:04:57.255949 4702 scope.go:117] "RemoveContainer" containerID="c2ae79e68398299afbfd6d80e5a6a5625abc42c94f9cb2b8cb189063df0292b4" Dec 03 12:04:58 crc kubenswrapper[4702]: I1203 12:04:58.279400 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee"} Dec 03 12:05:12 crc kubenswrapper[4702]: E1203 12:05:12.866725 4702 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.176:37578->38.102.83.176:40897: write tcp 38.102.83.176:37578->38.102.83.176:40897: write: connection reset by peer Dec 03 12:06:41 crc kubenswrapper[4702]: I1203 12:06:41.125024 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn" podUID="afc37ae6-c944-4cb1-81b6-c810ea1c3b31" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:07:25 crc kubenswrapper[4702]: I1203 12:07:25.908493 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:07:25 crc kubenswrapper[4702]: I1203 12:07:25.909349 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:07:49 crc kubenswrapper[4702]: I1203 12:07:49.723666 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fsqbq"] Dec 03 12:07:49 crc kubenswrapper[4702]: E1203 12:07:49.725188 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96d8b34a-3ab5-424d-8836-f4ad4290d38b" containerName="extract-content" Dec 03 12:07:49 crc kubenswrapper[4702]: I1203 12:07:49.725211 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="96d8b34a-3ab5-424d-8836-f4ad4290d38b" containerName="extract-content" Dec 03 12:07:49 crc kubenswrapper[4702]: E1203 12:07:49.725275 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96d8b34a-3ab5-424d-8836-f4ad4290d38b" containerName="extract-utilities" Dec 03 12:07:49 crc kubenswrapper[4702]: I1203 12:07:49.725286 4702 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="96d8b34a-3ab5-424d-8836-f4ad4290d38b" containerName="extract-utilities" Dec 03 12:07:49 crc kubenswrapper[4702]: E1203 12:07:49.725311 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="240da10f-8cde-4000-a815-93bdeeb2af78" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 03 12:07:49 crc kubenswrapper[4702]: I1203 12:07:49.725324 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="240da10f-8cde-4000-a815-93bdeeb2af78" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 03 12:07:49 crc kubenswrapper[4702]: E1203 12:07:49.725367 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96d8b34a-3ab5-424d-8836-f4ad4290d38b" containerName="registry-server" Dec 03 12:07:49 crc kubenswrapper[4702]: I1203 12:07:49.725377 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="96d8b34a-3ab5-424d-8836-f4ad4290d38b" containerName="registry-server" Dec 03 12:07:49 crc kubenswrapper[4702]: I1203 12:07:49.725739 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="240da10f-8cde-4000-a815-93bdeeb2af78" containerName="logging-edpm-deployment-openstack-edpm-ipam" Dec 03 12:07:49 crc kubenswrapper[4702]: I1203 12:07:49.725819 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="96d8b34a-3ab5-424d-8836-f4ad4290d38b" containerName="registry-server" Dec 03 12:07:49 crc kubenswrapper[4702]: I1203 12:07:49.728213 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fsqbq" Dec 03 12:07:49 crc kubenswrapper[4702]: I1203 12:07:49.787163 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fsqbq"] Dec 03 12:07:49 crc kubenswrapper[4702]: I1203 12:07:49.857583 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b228632-7c76-434a-827b-55c568b762d4-utilities\") pod \"redhat-operators-fsqbq\" (UID: \"2b228632-7c76-434a-827b-55c568b762d4\") " pod="openshift-marketplace/redhat-operators-fsqbq" Dec 03 12:07:49 crc kubenswrapper[4702]: I1203 12:07:49.858913 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d542g\" (UniqueName: \"kubernetes.io/projected/2b228632-7c76-434a-827b-55c568b762d4-kube-api-access-d542g\") pod \"redhat-operators-fsqbq\" (UID: \"2b228632-7c76-434a-827b-55c568b762d4\") " pod="openshift-marketplace/redhat-operators-fsqbq" Dec 03 12:07:49 crc kubenswrapper[4702]: I1203 12:07:49.859124 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b228632-7c76-434a-827b-55c568b762d4-catalog-content\") pod \"redhat-operators-fsqbq\" (UID: \"2b228632-7c76-434a-827b-55c568b762d4\") " pod="openshift-marketplace/redhat-operators-fsqbq" Dec 03 12:07:49 crc kubenswrapper[4702]: I1203 12:07:49.961591 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d542g\" (UniqueName: \"kubernetes.io/projected/2b228632-7c76-434a-827b-55c568b762d4-kube-api-access-d542g\") pod \"redhat-operators-fsqbq\" (UID: \"2b228632-7c76-434a-827b-55c568b762d4\") " pod="openshift-marketplace/redhat-operators-fsqbq" Dec 03 12:07:49 crc kubenswrapper[4702]: I1203 12:07:49.961657 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2b228632-7c76-434a-827b-55c568b762d4-catalog-content\") pod \"redhat-operators-fsqbq\" (UID: \"2b228632-7c76-434a-827b-55c568b762d4\") " pod="openshift-marketplace/redhat-operators-fsqbq" Dec 03 12:07:49 crc kubenswrapper[4702]: I1203 12:07:49.961770 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b228632-7c76-434a-827b-55c568b762d4-utilities\") pod \"redhat-operators-fsqbq\" (UID: \"2b228632-7c76-434a-827b-55c568b762d4\") " pod="openshift-marketplace/redhat-operators-fsqbq" Dec 03 12:07:49 crc kubenswrapper[4702]: I1203 12:07:49.962515 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b228632-7c76-434a-827b-55c568b762d4-utilities\") pod \"redhat-operators-fsqbq\" (UID: \"2b228632-7c76-434a-827b-55c568b762d4\") " pod="openshift-marketplace/redhat-operators-fsqbq" Dec 03 12:07:49 crc kubenswrapper[4702]: I1203 12:07:49.962702 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b228632-7c76-434a-827b-55c568b762d4-catalog-content\") pod \"redhat-operators-fsqbq\" (UID: \"2b228632-7c76-434a-827b-55c568b762d4\") " pod="openshift-marketplace/redhat-operators-fsqbq" Dec 03 12:07:49 crc kubenswrapper[4702]: I1203 12:07:49.984969 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d542g\" (UniqueName: \"kubernetes.io/projected/2b228632-7c76-434a-827b-55c568b762d4-kube-api-access-d542g\") pod \"redhat-operators-fsqbq\" (UID: \"2b228632-7c76-434a-827b-55c568b762d4\") " pod="openshift-marketplace/redhat-operators-fsqbq" Dec 03 12:07:50 crc kubenswrapper[4702]: I1203 12:07:50.091541 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fsqbq" Dec 03 12:07:52 crc kubenswrapper[4702]: I1203 12:07:52.111449 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d5nhp"] Dec 03 12:07:52 crc kubenswrapper[4702]: I1203 12:07:52.115089 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5nhp" Dec 03 12:07:52 crc kubenswrapper[4702]: I1203 12:07:52.147047 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5nhp"] Dec 03 12:07:52 crc kubenswrapper[4702]: I1203 12:07:52.216133 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqrt5\" (UniqueName: \"kubernetes.io/projected/a0f169f2-b89d-4c4a-b141-060d9e13386f-kube-api-access-dqrt5\") pod \"redhat-marketplace-d5nhp\" (UID: \"a0f169f2-b89d-4c4a-b141-060d9e13386f\") " pod="openshift-marketplace/redhat-marketplace-d5nhp" Dec 03 12:07:52 crc kubenswrapper[4702]: I1203 12:07:52.216333 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f169f2-b89d-4c4a-b141-060d9e13386f-utilities\") pod \"redhat-marketplace-d5nhp\" (UID: \"a0f169f2-b89d-4c4a-b141-060d9e13386f\") " pod="openshift-marketplace/redhat-marketplace-d5nhp" Dec 03 12:07:52 crc kubenswrapper[4702]: I1203 12:07:52.216369 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f169f2-b89d-4c4a-b141-060d9e13386f-catalog-content\") pod \"redhat-marketplace-d5nhp\" (UID: \"a0f169f2-b89d-4c4a-b141-060d9e13386f\") " pod="openshift-marketplace/redhat-marketplace-d5nhp" Dec 03 12:07:52 crc kubenswrapper[4702]: I1203 12:07:52.319260 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqrt5\" (UniqueName: \"kubernetes.io/projected/a0f169f2-b89d-4c4a-b141-060d9e13386f-kube-api-access-dqrt5\") pod \"redhat-marketplace-d5nhp\" (UID: \"a0f169f2-b89d-4c4a-b141-060d9e13386f\") " pod="openshift-marketplace/redhat-marketplace-d5nhp" Dec 03 12:07:52 crc kubenswrapper[4702]: I1203 12:07:52.319432 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f169f2-b89d-4c4a-b141-060d9e13386f-utilities\") pod \"redhat-marketplace-d5nhp\" (UID: \"a0f169f2-b89d-4c4a-b141-060d9e13386f\") " pod="openshift-marketplace/redhat-marketplace-d5nhp" Dec 03 12:07:52 crc kubenswrapper[4702]: I1203 12:07:52.319472 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f169f2-b89d-4c4a-b141-060d9e13386f-catalog-content\") pod \"redhat-marketplace-d5nhp\" (UID: \"a0f169f2-b89d-4c4a-b141-060d9e13386f\") " pod="openshift-marketplace/redhat-marketplace-d5nhp" Dec 03 12:07:52 crc kubenswrapper[4702]: I1203 12:07:52.320011 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f169f2-b89d-4c4a-b141-060d9e13386f-utilities\") pod \"redhat-marketplace-d5nhp\" (UID: \"a0f169f2-b89d-4c4a-b141-060d9e13386f\") " pod="openshift-marketplace/redhat-marketplace-d5nhp" Dec 03 12:07:52 crc kubenswrapper[4702]: I1203 12:07:52.320117 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f169f2-b89d-4c4a-b141-060d9e13386f-catalog-content\") pod \"redhat-marketplace-d5nhp\" (UID: \"a0f169f2-b89d-4c4a-b141-060d9e13386f\") " pod="openshift-marketplace/redhat-marketplace-d5nhp" Dec 03 12:07:52 crc kubenswrapper[4702]: I1203 12:07:52.348731 4702 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dqrt5\" (UniqueName: \"kubernetes.io/projected/a0f169f2-b89d-4c4a-b141-060d9e13386f-kube-api-access-dqrt5\") pod \"redhat-marketplace-d5nhp\" (UID: \"a0f169f2-b89d-4c4a-b141-060d9e13386f\") " pod="openshift-marketplace/redhat-marketplace-d5nhp" Dec 03 12:07:52 crc kubenswrapper[4702]: I1203 12:07:52.454842 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5nhp" Dec 03 12:07:53 crc kubenswrapper[4702]: I1203 12:07:53.472233 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fsqbq" event={"ID":"2b228632-7c76-434a-827b-55c568b762d4","Type":"ContainerStarted","Data":"c5e851041a25138df383c07c4c53a0d688e0ac89226244a3ad528b6f1864e2f0"} Dec 03 12:07:54 crc kubenswrapper[4702]: I1203 12:07:54.512376 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fsqbq"] Dec 03 12:07:54 crc kubenswrapper[4702]: I1203 12:07:54.619620 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5nhp"] Dec 03 12:07:54 crc kubenswrapper[4702]: W1203 12:07:54.619952 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0f169f2_b89d_4c4a_b141_060d9e13386f.slice/crio-f36b989d7c1b9e2ab6c32db99fe13635d42f94f74ae6524effa31589568bd9e6 WatchSource:0}: Error finding container f36b989d7c1b9e2ab6c32db99fe13635d42f94f74ae6524effa31589568bd9e6: Status 404 returned error can't find the container with id f36b989d7c1b9e2ab6c32db99fe13635d42f94f74ae6524effa31589568bd9e6 Dec 03 12:07:55 crc kubenswrapper[4702]: I1203 12:07:55.523084 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5nhp" event={"ID":"a0f169f2-b89d-4c4a-b141-060d9e13386f","Type":"ContainerStarted","Data":"f36b989d7c1b9e2ab6c32db99fe13635d42f94f74ae6524effa31589568bd9e6"} Dec 03 12:07:55 crc kubenswrapper[4702]: I1203 12:07:55.908864 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:07:55 crc kubenswrapper[4702]: I1203 12:07:55.908930 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:07:56 crc kubenswrapper[4702]: I1203 12:07:56.404032 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="58148f49-2721-4a0a-a5e0-38a2aa23522b" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.213:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:07:56 crc kubenswrapper[4702]: I1203 12:07:56.537032 4702 generic.go:334] "Generic (PLEG): container finished" podID="2b228632-7c76-434a-827b-55c568b762d4" containerID="e30f6fa6c5260af9ba34cec61389a67ce9aaa72fe3c899909e602dc357b4cb27" exitCode=0 Dec 03 12:07:56 crc kubenswrapper[4702]: I1203 12:07:56.537111 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fsqbq" 
event={"ID":"2b228632-7c76-434a-827b-55c568b762d4","Type":"ContainerDied","Data":"e30f6fa6c5260af9ba34cec61389a67ce9aaa72fe3c899909e602dc357b4cb27"} Dec 03 12:07:57 crc kubenswrapper[4702]: I1203 12:07:57.550062 4702 generic.go:334] "Generic (PLEG): container finished" podID="a0f169f2-b89d-4c4a-b141-060d9e13386f" containerID="179e8e899e74fb08c060aa88185f71cdb0d836e5dcc2830694f132cd81403573" exitCode=0 Dec 03 12:07:57 crc kubenswrapper[4702]: I1203 12:07:57.550202 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5nhp" event={"ID":"a0f169f2-b89d-4c4a-b141-060d9e13386f","Type":"ContainerDied","Data":"179e8e899e74fb08c060aa88185f71cdb0d836e5dcc2830694f132cd81403573"} Dec 03 12:07:57 crc kubenswrapper[4702]: I1203 12:07:57.553517 4702 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 12:08:03 crc kubenswrapper[4702]: I1203 12:08:03.631142 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fsqbq" event={"ID":"2b228632-7c76-434a-827b-55c568b762d4","Type":"ContainerStarted","Data":"8b0fb8b9d8f78b18c46ddd5ee839b81a59ffb2644008ebe0063980aa5053e304"} Dec 03 12:08:05 crc kubenswrapper[4702]: I1203 12:08:05.786698 4702 scope.go:117] "RemoveContainer" containerID="5dd42eb6047867d2daa1c5dcbd1542690b5247d7911e866fd1caecf2e5e203c3" Dec 03 12:08:09 crc kubenswrapper[4702]: I1203 12:08:09.016536 4702 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:08:09 crc kubenswrapper[4702]: I1203 12:08:09.017224 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:08:09 crc kubenswrapper[4702]: I1203 12:08:09.503042 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c" podUID="62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.103:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:08:11 crc kubenswrapper[4702]: I1203 12:08:11.403985 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="58148f49-2721-4a0a-a5e0-38a2aa23522b" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.213:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:08:17 crc kubenswrapper[4702]: I1203 12:08:17.530021 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Dec 03 12:08:18 crc kubenswrapper[4702]: I1203 12:08:18.689699 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-central-agent" probeResult="failure" output=< Dec 03 12:08:18 crc kubenswrapper[4702]: Unkown 
error: Expecting value: line 1 column 1 (char 0) Dec 03 12:08:18 crc kubenswrapper[4702]: > Dec 03 12:08:20 crc kubenswrapper[4702]: I1203 12:08:20.863321 4702 generic.go:334] "Generic (PLEG): container finished" podID="2b228632-7c76-434a-827b-55c568b762d4" containerID="8b0fb8b9d8f78b18c46ddd5ee839b81a59ffb2644008ebe0063980aa5053e304" exitCode=0 Dec 03 12:08:20 crc kubenswrapper[4702]: I1203 12:08:20.863495 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fsqbq" event={"ID":"2b228632-7c76-434a-827b-55c568b762d4","Type":"ContainerDied","Data":"8b0fb8b9d8f78b18c46ddd5ee839b81a59ffb2644008ebe0063980aa5053e304"} Dec 03 12:08:21 crc kubenswrapper[4702]: I1203 12:08:21.913127 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-central-agent" probeResult="failure" output=< Dec 03 12:08:21 crc kubenswrapper[4702]: Unkown error: Expecting value: line 1 column 1 (char 0) Dec 03 12:08:21 crc kubenswrapper[4702]: > Dec 03 12:08:21 crc kubenswrapper[4702]: I1203 12:08:21.913480 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0" Dec 03 12:08:21 crc kubenswrapper[4702]: I1203 12:08:21.914664 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"159ee3d241a321aed89bcfa644d4fed939819784a2a494203acf7b31c0a66347"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted" Dec 03 12:08:21 crc kubenswrapper[4702]: I1203 12:08:21.914822 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-central-agent" containerID="cri-o://159ee3d241a321aed89bcfa644d4fed939819784a2a494203acf7b31c0a66347" gracePeriod=30 Dec 03 12:08:23 crc kubenswrapper[4702]: I1203 12:08:23.404090 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="58148f49-2721-4a0a-a5e0-38a2aa23522b" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.213:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:08:28 crc kubenswrapper[4702]: I1203 12:08:25.908424 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:08:28 crc kubenswrapper[4702]: I1203 12:08:25.908919 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:08:28 crc kubenswrapper[4702]: I1203 12:08:25.908971 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 12:08:28 crc kubenswrapper[4702]: I1203 12:08:25.910070 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee"} pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:08:28 crc kubenswrapper[4702]: I1203 12:08:25.910142 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" containerID="cri-o://e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee" gracePeriod=600 Dec 03 12:08:28 crc kubenswrapper[4702]: I1203 12:08:26.215154 4702 scope.go:117] "RemoveContainer" containerID="e41cace7e5007b00f7a5daac67b18510a2521d457ee1bfc9edb3218963f5dbc3" Dec 03 12:08:28 crc kubenswrapper[4702]: I1203 12:08:27.949954 4702 generic.go:334] "Generic (PLEG): container finished" podID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerID="159ee3d241a321aed89bcfa644d4fed939819784a2a494203acf7b31c0a66347" exitCode=0 Dec 03 12:08:28 crc kubenswrapper[4702]: I1203 12:08:27.950037 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dd6e551-e7d9-4f55-a878-bd36db9707e8","Type":"ContainerDied","Data":"159ee3d241a321aed89bcfa644d4fed939819784a2a494203acf7b31c0a66347"} Dec 03 12:08:28 crc kubenswrapper[4702]: I1203 12:08:28.669922 4702 scope.go:117] "RemoveContainer" containerID="f2edffd547ad62045aa743468741d71fe084429169d4d21d40e90f26447eb80f" Dec 03 12:08:28 crc kubenswrapper[4702]: I1203 12:08:28.976568 4702 generic.go:334] "Generic (PLEG): container finished" podID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee" exitCode=0 Dec 03 12:08:28 crc kubenswrapper[4702]: I1203 12:08:28.976869 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerDied","Data":"e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee"} Dec 03 12:08:28 crc kubenswrapper[4702]: I1203 12:08:28.977097 4702 scope.go:117] "RemoveContainer" containerID="9e338b8fada393470fcaf18444666db86bba061698bf1ca05450134fe66b5a3f" Dec 03 12:08:29 crc kubenswrapper[4702]: E1203 12:08:29.170592 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:08:30 crc kubenswrapper[4702]: I1203 12:08:30.120548 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee" Dec 03 12:08:30 crc kubenswrapper[4702]: E1203 12:08:30.121592 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:08:30 crc kubenswrapper[4702]: I1203 
12:08:30.122787 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5nhp" event={"ID":"a0f169f2-b89d-4c4a-b141-060d9e13386f","Type":"ContainerStarted","Data":"cdad30530c434124eb4cef4dd0cb48baf69e25450e39483bb4ebdbe3d79f1007"} Dec 03 12:08:31 crc kubenswrapper[4702]: I1203 12:08:31.138099 4702 generic.go:334] "Generic (PLEG): container finished" podID="a0f169f2-b89d-4c4a-b141-060d9e13386f" containerID="cdad30530c434124eb4cef4dd0cb48baf69e25450e39483bb4ebdbe3d79f1007" exitCode=0 Dec 03 12:08:31 crc kubenswrapper[4702]: I1203 12:08:31.138178 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5nhp" event={"ID":"a0f169f2-b89d-4c4a-b141-060d9e13386f","Type":"ContainerDied","Data":"cdad30530c434124eb4cef4dd0cb48baf69e25450e39483bb4ebdbe3d79f1007"} Dec 03 12:08:38 crc kubenswrapper[4702]: I1203 12:08:38.254942 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fsqbq" event={"ID":"2b228632-7c76-434a-827b-55c568b762d4","Type":"ContainerStarted","Data":"41be310109ace09c60b3084a70102863cf9772447abf009cfb2f53b9dea3c58f"} Dec 03 12:08:39 crc kubenswrapper[4702]: I1203 12:08:39.289447 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fsqbq" podStartSLOduration=13.800996559 podStartE2EDuration="50.289413203s" podCreationTimestamp="2025-12-03 12:07:49 +0000 UTC" firstStartedPulling="2025-12-03 12:07:57.553225679 +0000 UTC m=+3861.389154143" lastFinishedPulling="2025-12-03 12:08:34.041642323 +0000 UTC m=+3897.877570787" observedRunningTime="2025-12-03 12:08:39.282874487 +0000 UTC m=+3903.118802971" watchObservedRunningTime="2025-12-03 12:08:39.289413203 +0000 UTC m=+3903.125341667" Dec 03 12:08:40 crc kubenswrapper[4702]: I1203 12:08:40.092256 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fsqbq" Dec 03 12:08:40 crc kubenswrapper[4702]: I1203 12:08:40.092631 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fsqbq" Dec 03 12:08:40 crc kubenswrapper[4702]: I1203 12:08:40.928786 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee" Dec 03 12:08:40 crc kubenswrapper[4702]: E1203 12:08:40.929333 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:08:41 crc kubenswrapper[4702]: I1203 12:08:41.144339 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fsqbq" podUID="2b228632-7c76-434a-827b-55c568b762d4" containerName="registry-server" probeResult="failure" output=< Dec 03 12:08:41 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:08:41 crc kubenswrapper[4702]: > Dec 03 12:08:51 crc kubenswrapper[4702]: I1203 12:08:51.553086 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fsqbq" podUID="2b228632-7c76-434a-827b-55c568b762d4" containerName="registry-server" probeResult="failure" output=< Dec 03 
12:08:51 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:08:51 crc kubenswrapper[4702]: > Dec 03 12:08:51 crc kubenswrapper[4702]: I1203 12:08:51.928724 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee" Dec 03 12:08:51 crc kubenswrapper[4702]: E1203 12:08:51.929047 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:08:53 crc kubenswrapper[4702]: I1203 12:08:53.446877 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="58148f49-2721-4a0a-a5e0-38a2aa23522b" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.213:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:08:58 crc kubenswrapper[4702]: I1203 12:08:58.645791 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dd6e551-e7d9-4f55-a878-bd36db9707e8","Type":"ContainerStarted","Data":"a3ebf64628d67a49aab3d7c8c37af7c6fbdf0e61098265182dbf34743899b6e5"} Dec 03 12:08:58 crc kubenswrapper[4702]: I1203 12:08:58.649166 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5nhp" event={"ID":"a0f169f2-b89d-4c4a-b141-060d9e13386f","Type":"ContainerStarted","Data":"52c2a2eed48a7f27015e31d1b79ab17eef835b57ab5741e3cf24d665feb68918"} Dec 03 12:08:59 crc kubenswrapper[4702]: I1203 12:08:59.707400 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d5nhp" podStartSLOduration=9.192995327 podStartE2EDuration="1m7.707374352s" podCreationTimestamp="2025-12-03 12:07:52 +0000 UTC" firstStartedPulling="2025-12-03 12:07:58.565957658 +0000 UTC m=+3862.401886132" lastFinishedPulling="2025-12-03 12:08:57.080336693 +0000 UTC m=+3920.916265157" observedRunningTime="2025-12-03 12:08:59.700297121 +0000 UTC m=+3923.536225595" watchObservedRunningTime="2025-12-03 12:08:59.707374352 +0000 UTC m=+3923.543302816" Dec 03 12:09:00 crc kubenswrapper[4702]: I1203 12:09:00.147056 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fsqbq" Dec 03 12:09:00 crc kubenswrapper[4702]: I1203 12:09:00.201138 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fsqbq" Dec 03 12:09:00 crc kubenswrapper[4702]: I1203 12:09:00.917214 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fsqbq"] Dec 03 12:09:01 crc kubenswrapper[4702]: I1203 12:09:01.699731 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fsqbq" podUID="2b228632-7c76-434a-827b-55c568b762d4" containerName="registry-server" containerID="cri-o://41be310109ace09c60b3084a70102863cf9772447abf009cfb2f53b9dea3c58f" gracePeriod=2 Dec 03 12:09:02 crc kubenswrapper[4702]: I1203 12:09:02.455729 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d5nhp" Dec 03 12:09:02 crc 
kubenswrapper[4702]: I1203 12:09:02.455809 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d5nhp" Dec 03 12:09:02 crc kubenswrapper[4702]: I1203 12:09:02.557081 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d5nhp" Dec 03 12:09:02 crc kubenswrapper[4702]: I1203 12:09:02.756440 4702 generic.go:334] "Generic (PLEG): container finished" podID="2b228632-7c76-434a-827b-55c568b762d4" containerID="41be310109ace09c60b3084a70102863cf9772447abf009cfb2f53b9dea3c58f" exitCode=0 Dec 03 12:09:02 crc kubenswrapper[4702]: I1203 12:09:02.756548 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fsqbq" event={"ID":"2b228632-7c76-434a-827b-55c568b762d4","Type":"ContainerDied","Data":"41be310109ace09c60b3084a70102863cf9772447abf009cfb2f53b9dea3c58f"} Dec 03 12:09:03 crc kubenswrapper[4702]: I1203 12:09:03.197519 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fsqbq" Dec 03 12:09:03 crc kubenswrapper[4702]: I1203 12:09:03.326279 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b228632-7c76-434a-827b-55c568b762d4-catalog-content\") pod \"2b228632-7c76-434a-827b-55c568b762d4\" (UID: \"2b228632-7c76-434a-827b-55c568b762d4\") " Dec 03 12:09:03 crc kubenswrapper[4702]: I1203 12:09:03.326911 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b228632-7c76-434a-827b-55c568b762d4-utilities\") pod \"2b228632-7c76-434a-827b-55c568b762d4\" (UID: \"2b228632-7c76-434a-827b-55c568b762d4\") " Dec 03 12:09:03 crc kubenswrapper[4702]: I1203 12:09:03.327159 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d542g\" (UniqueName: \"kubernetes.io/projected/2b228632-7c76-434a-827b-55c568b762d4-kube-api-access-d542g\") pod \"2b228632-7c76-434a-827b-55c568b762d4\" (UID: \"2b228632-7c76-434a-827b-55c568b762d4\") " Dec 03 12:09:03 crc kubenswrapper[4702]: I1203 12:09:03.328344 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b228632-7c76-434a-827b-55c568b762d4-utilities" (OuterVolumeSpecName: "utilities") pod "2b228632-7c76-434a-827b-55c568b762d4" (UID: "2b228632-7c76-434a-827b-55c568b762d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:09:03 crc kubenswrapper[4702]: I1203 12:09:03.346018 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b228632-7c76-434a-827b-55c568b762d4-kube-api-access-d542g" (OuterVolumeSpecName: "kube-api-access-d542g") pod "2b228632-7c76-434a-827b-55c568b762d4" (UID: "2b228632-7c76-434a-827b-55c568b762d4"). InnerVolumeSpecName "kube-api-access-d542g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:09:03 crc kubenswrapper[4702]: I1203 12:09:03.430376 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b228632-7c76-434a-827b-55c568b762d4-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:09:03 crc kubenswrapper[4702]: I1203 12:09:03.430422 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d542g\" (UniqueName: \"kubernetes.io/projected/2b228632-7c76-434a-827b-55c568b762d4-kube-api-access-d542g\") on node \"crc\" DevicePath \"\"" Dec 03 12:09:03 crc kubenswrapper[4702]: I1203 12:09:03.449105 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b228632-7c76-434a-827b-55c568b762d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b228632-7c76-434a-827b-55c568b762d4" (UID: "2b228632-7c76-434a-827b-55c568b762d4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:09:03 crc kubenswrapper[4702]: I1203 12:09:03.533134 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b228632-7c76-434a-827b-55c568b762d4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:09:03 crc kubenswrapper[4702]: I1203 12:09:03.785235 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fsqbq" event={"ID":"2b228632-7c76-434a-827b-55c568b762d4","Type":"ContainerDied","Data":"c5e851041a25138df383c07c4c53a0d688e0ac89226244a3ad528b6f1864e2f0"} Dec 03 12:09:03 crc kubenswrapper[4702]: I1203 12:09:03.785315 4702 scope.go:117] "RemoveContainer" containerID="41be310109ace09c60b3084a70102863cf9772447abf009cfb2f53b9dea3c58f" Dec 03 12:09:03 crc kubenswrapper[4702]: I1203 12:09:03.785490 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fsqbq" Dec 03 12:09:03 crc kubenswrapper[4702]: I1203 12:09:03.813433 4702 scope.go:117] "RemoveContainer" containerID="8b0fb8b9d8f78b18c46ddd5ee839b81a59ffb2644008ebe0063980aa5053e304" Dec 03 12:09:03 crc kubenswrapper[4702]: I1203 12:09:03.831062 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fsqbq"] Dec 03 12:09:03 crc kubenswrapper[4702]: I1203 12:09:03.840957 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fsqbq"] Dec 03 12:09:03 crc kubenswrapper[4702]: I1203 12:09:03.845469 4702 scope.go:117] "RemoveContainer" containerID="e30f6fa6c5260af9ba34cec61389a67ce9aaa72fe3c899909e602dc357b4cb27" Dec 03 12:09:04 crc kubenswrapper[4702]: I1203 12:09:04.941990 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b228632-7c76-434a-827b-55c568b762d4" path="/var/lib/kubelet/pods/2b228632-7c76-434a-827b-55c568b762d4/volumes" Dec 03 12:09:05 crc kubenswrapper[4702]: I1203 12:09:05.928255 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee" Dec 03 12:09:05 crc kubenswrapper[4702]: E1203 12:09:05.928984 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:09:12 crc kubenswrapper[4702]: I1203 12:09:12.771801 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d5nhp" Dec 03 12:09:12 crc kubenswrapper[4702]: I1203 12:09:12.829982 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5nhp"] Dec 03 12:09:12 crc kubenswrapper[4702]: I1203 12:09:12.905437 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d5nhp" podUID="a0f169f2-b89d-4c4a-b141-060d9e13386f" containerName="registry-server" containerID="cri-o://52c2a2eed48a7f27015e31d1b79ab17eef835b57ab5741e3cf24d665feb68918" gracePeriod=2 Dec 03 12:09:13 crc kubenswrapper[4702]: I1203 12:09:13.929461 4702 generic.go:334] "Generic (PLEG): container finished" podID="a0f169f2-b89d-4c4a-b141-060d9e13386f" containerID="52c2a2eed48a7f27015e31d1b79ab17eef835b57ab5741e3cf24d665feb68918" exitCode=0 Dec 03 12:09:13 crc kubenswrapper[4702]: I1203 12:09:13.929539 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5nhp" event={"ID":"a0f169f2-b89d-4c4a-b141-060d9e13386f","Type":"ContainerDied","Data":"52c2a2eed48a7f27015e31d1b79ab17eef835b57ab5741e3cf24d665feb68918"} Dec 03 12:09:15 crc kubenswrapper[4702]: I1203 12:09:15.372511 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5nhp" Dec 03 12:09:15 crc kubenswrapper[4702]: I1203 12:09:15.443137 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f169f2-b89d-4c4a-b141-060d9e13386f-catalog-content\") pod \"a0f169f2-b89d-4c4a-b141-060d9e13386f\" (UID: \"a0f169f2-b89d-4c4a-b141-060d9e13386f\") " Dec 03 12:09:15 crc kubenswrapper[4702]: I1203 12:09:15.443657 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqrt5\" (UniqueName: \"kubernetes.io/projected/a0f169f2-b89d-4c4a-b141-060d9e13386f-kube-api-access-dqrt5\") pod \"a0f169f2-b89d-4c4a-b141-060d9e13386f\" (UID: \"a0f169f2-b89d-4c4a-b141-060d9e13386f\") " Dec 03 12:09:15 crc kubenswrapper[4702]: I1203 12:09:15.443882 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f169f2-b89d-4c4a-b141-060d9e13386f-utilities\") pod \"a0f169f2-b89d-4c4a-b141-060d9e13386f\" (UID: \"a0f169f2-b89d-4c4a-b141-060d9e13386f\") " Dec 03 12:09:15 crc kubenswrapper[4702]: I1203 12:09:15.444690 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f169f2-b89d-4c4a-b141-060d9e13386f-utilities" (OuterVolumeSpecName: "utilities") pod "a0f169f2-b89d-4c4a-b141-060d9e13386f" (UID: "a0f169f2-b89d-4c4a-b141-060d9e13386f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:09:15 crc kubenswrapper[4702]: I1203 12:09:15.446356 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0f169f2-b89d-4c4a-b141-060d9e13386f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:09:15 crc kubenswrapper[4702]: I1203 12:09:15.449538 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f169f2-b89d-4c4a-b141-060d9e13386f-kube-api-access-dqrt5" (OuterVolumeSpecName: "kube-api-access-dqrt5") pod "a0f169f2-b89d-4c4a-b141-060d9e13386f" (UID: "a0f169f2-b89d-4c4a-b141-060d9e13386f"). InnerVolumeSpecName "kube-api-access-dqrt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:09:15 crc kubenswrapper[4702]: I1203 12:09:15.466121 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f169f2-b89d-4c4a-b141-060d9e13386f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0f169f2-b89d-4c4a-b141-060d9e13386f" (UID: "a0f169f2-b89d-4c4a-b141-060d9e13386f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:09:15 crc kubenswrapper[4702]: I1203 12:09:15.548876 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqrt5\" (UniqueName: \"kubernetes.io/projected/a0f169f2-b89d-4c4a-b141-060d9e13386f-kube-api-access-dqrt5\") on node \"crc\" DevicePath \"\"" Dec 03 12:09:15 crc kubenswrapper[4702]: I1203 12:09:15.549177 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0f169f2-b89d-4c4a-b141-060d9e13386f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:09:15 crc kubenswrapper[4702]: I1203 12:09:15.959944 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d5nhp" Dec 03 12:09:15 crc kubenswrapper[4702]: I1203 12:09:15.961701 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d5nhp" event={"ID":"a0f169f2-b89d-4c4a-b141-060d9e13386f","Type":"ContainerDied","Data":"f36b989d7c1b9e2ab6c32db99fe13635d42f94f74ae6524effa31589568bd9e6"} Dec 03 12:09:15 crc kubenswrapper[4702]: I1203 12:09:15.969918 4702 scope.go:117] "RemoveContainer" containerID="52c2a2eed48a7f27015e31d1b79ab17eef835b57ab5741e3cf24d665feb68918" Dec 03 12:09:16 crc kubenswrapper[4702]: I1203 12:09:16.016802 4702 scope.go:117] "RemoveContainer" containerID="cdad30530c434124eb4cef4dd0cb48baf69e25450e39483bb4ebdbe3d79f1007" Dec 03 12:09:16 crc kubenswrapper[4702]: I1203 12:09:16.030915 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5nhp"] Dec 03 12:09:16 crc kubenswrapper[4702]: I1203 12:09:16.039462 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d5nhp"] Dec 03 12:09:16 crc kubenswrapper[4702]: I1203 12:09:16.045848 4702 scope.go:117] "RemoveContainer" containerID="179e8e899e74fb08c060aa88185f71cdb0d836e5dcc2830694f132cd81403573" Dec 03 12:09:16 crc kubenswrapper[4702]: I1203 12:09:16.945317 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0f169f2-b89d-4c4a-b141-060d9e13386f" path="/var/lib/kubelet/pods/a0f169f2-b89d-4c4a-b141-060d9e13386f/volumes" Dec 03 12:09:20 crc kubenswrapper[4702]: I1203 12:09:20.928163 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee" Dec 03 12:09:20 crc kubenswrapper[4702]: E1203 12:09:20.928893 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:09:31 crc kubenswrapper[4702]: I1203 12:09:31.930670 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee" Dec 03 12:09:31 crc kubenswrapper[4702]: E1203 12:09:31.932071 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:09:43 crc kubenswrapper[4702]: I1203 12:09:43.928899 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee" Dec 03 12:09:43 crc kubenswrapper[4702]: E1203 12:09:43.929599 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 
12:09:55 crc kubenswrapper[4702]: I1203 12:09:55.928972 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee" Dec 03 12:09:55 crc kubenswrapper[4702]: E1203 12:09:55.929992 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:10:10 crc kubenswrapper[4702]: I1203 12:10:10.928778 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee" Dec 03 12:10:10 crc kubenswrapper[4702]: E1203 12:10:10.929654 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:10:21 crc kubenswrapper[4702]: I1203 12:10:21.929214 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee" Dec 03 12:10:21 crc kubenswrapper[4702]: E1203 12:10:21.930150 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:10:32 crc kubenswrapper[4702]: I1203 12:10:32.937021 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee" Dec 03 12:10:32 crc kubenswrapper[4702]: E1203 12:10:32.938098 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:10:43 crc kubenswrapper[4702]: I1203 12:10:43.928712 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee" Dec 03 12:10:43 crc kubenswrapper[4702]: E1203 12:10:43.929596 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:10:54 crc kubenswrapper[4702]: I1203 12:10:54.929845 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee" Dec 03 12:10:54 crc 
kubenswrapper[4702]: E1203 12:10:54.930773 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:11:06 crc kubenswrapper[4702]: I1203 12:11:06.940127 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee" Dec 03 12:11:06 crc kubenswrapper[4702]: E1203 12:11:06.941164 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:11:21 crc kubenswrapper[4702]: I1203 12:11:21.928813 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee" Dec 03 12:11:21 crc kubenswrapper[4702]: E1203 12:11:21.929703 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:11:31 crc kubenswrapper[4702]: I1203 12:11:31.576563 4702 patch_prober.go:28] interesting pod/logging-loki-querier-5895d59bb8-bhqrp container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:11:31 crc kubenswrapper[4702]: I1203 12:11:31.577248 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" podUID="042cc406-7960-493a-a19a-cb5590f8ff1f" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:11:35 crc kubenswrapper[4702]: I1203 12:11:35.928267 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee" Dec 03 12:11:35 crc kubenswrapper[4702]: E1203 12:11:35.928854 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:11:49 crc kubenswrapper[4702]: I1203 12:11:49.930015 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee" Dec 03 12:11:49 crc kubenswrapper[4702]: E1203 12:11:49.930705 4702 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:11:56 crc kubenswrapper[4702]: I1203 12:11:56.901331 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4rp94"] Dec 03 12:11:56 crc kubenswrapper[4702]: E1203 12:11:56.902747 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f169f2-b89d-4c4a-b141-060d9e13386f" containerName="extract-utilities" Dec 03 12:11:56 crc kubenswrapper[4702]: I1203 12:11:56.902792 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f169f2-b89d-4c4a-b141-060d9e13386f" containerName="extract-utilities" Dec 03 12:11:56 crc kubenswrapper[4702]: E1203 12:11:56.902828 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b228632-7c76-434a-827b-55c568b762d4" containerName="extract-utilities" Dec 03 12:11:56 crc kubenswrapper[4702]: I1203 12:11:56.902841 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b228632-7c76-434a-827b-55c568b762d4" containerName="extract-utilities" Dec 03 12:11:56 crc kubenswrapper[4702]: E1203 12:11:56.902880 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b228632-7c76-434a-827b-55c568b762d4" containerName="registry-server" Dec 03 12:11:56 crc kubenswrapper[4702]: I1203 12:11:56.902896 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b228632-7c76-434a-827b-55c568b762d4" containerName="registry-server" Dec 03 12:11:56 crc kubenswrapper[4702]: E1203 12:11:56.902933 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f169f2-b89d-4c4a-b141-060d9e13386f" containerName="extract-content" Dec 03 12:11:56 crc kubenswrapper[4702]: I1203 12:11:56.902948 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f169f2-b89d-4c4a-b141-060d9e13386f" containerName="extract-content" Dec 03 12:11:56 crc kubenswrapper[4702]: E1203 12:11:56.902997 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0f169f2-b89d-4c4a-b141-060d9e13386f" containerName="registry-server" Dec 03 12:11:56 crc kubenswrapper[4702]: I1203 12:11:56.903009 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f169f2-b89d-4c4a-b141-060d9e13386f" containerName="registry-server" Dec 03 12:11:56 crc kubenswrapper[4702]: E1203 12:11:56.903043 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b228632-7c76-434a-827b-55c568b762d4" containerName="extract-content" Dec 03 12:11:56 crc kubenswrapper[4702]: I1203 12:11:56.903092 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b228632-7c76-434a-827b-55c568b762d4" containerName="extract-content" Dec 03 12:11:56 crc kubenswrapper[4702]: I1203 12:11:56.903691 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b228632-7c76-434a-827b-55c568b762d4" containerName="registry-server" Dec 03 12:11:56 crc kubenswrapper[4702]: I1203 12:11:56.903839 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0f169f2-b89d-4c4a-b141-060d9e13386f" containerName="registry-server" Dec 03 12:11:56 crc kubenswrapper[4702]: I1203 12:11:56.907337 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4rp94" Dec 03 12:11:57 crc kubenswrapper[4702]: I1203 12:11:57.019280 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5390395-7dc6-4062-b463-76b126ed747c-utilities\") pod \"community-operators-4rp94\" (UID: \"e5390395-7dc6-4062-b463-76b126ed747c\") " pod="openshift-marketplace/community-operators-4rp94" Dec 03 12:11:57 crc kubenswrapper[4702]: I1203 12:11:57.019509 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5390395-7dc6-4062-b463-76b126ed747c-catalog-content\") pod \"community-operators-4rp94\" (UID: \"e5390395-7dc6-4062-b463-76b126ed747c\") " pod="openshift-marketplace/community-operators-4rp94" Dec 03 12:11:57 crc kubenswrapper[4702]: I1203 12:11:57.019655 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb57b\" (UniqueName: \"kubernetes.io/projected/e5390395-7dc6-4062-b463-76b126ed747c-kube-api-access-rb57b\") pod \"community-operators-4rp94\" (UID: \"e5390395-7dc6-4062-b463-76b126ed747c\") " pod="openshift-marketplace/community-operators-4rp94" Dec 03 12:11:57 crc kubenswrapper[4702]: I1203 12:11:57.054628 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4rp94"] Dec 03 12:11:57 crc kubenswrapper[4702]: I1203 12:11:57.122236 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5390395-7dc6-4062-b463-76b126ed747c-catalog-content\") pod \"community-operators-4rp94\" (UID: \"e5390395-7dc6-4062-b463-76b126ed747c\") " pod="openshift-marketplace/community-operators-4rp94" Dec 03 12:11:57 crc kubenswrapper[4702]: I1203 12:11:57.122650 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb57b\" (UniqueName: \"kubernetes.io/projected/e5390395-7dc6-4062-b463-76b126ed747c-kube-api-access-rb57b\") pod \"community-operators-4rp94\" (UID: \"e5390395-7dc6-4062-b463-76b126ed747c\") " pod="openshift-marketplace/community-operators-4rp94" Dec 03 12:11:57 crc kubenswrapper[4702]: I1203 12:11:57.122846 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5390395-7dc6-4062-b463-76b126ed747c-utilities\") pod \"community-operators-4rp94\" (UID: \"e5390395-7dc6-4062-b463-76b126ed747c\") " pod="openshift-marketplace/community-operators-4rp94" Dec 03 12:11:57 crc kubenswrapper[4702]: I1203 12:11:57.122848 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5390395-7dc6-4062-b463-76b126ed747c-catalog-content\") pod \"community-operators-4rp94\" (UID: \"e5390395-7dc6-4062-b463-76b126ed747c\") " pod="openshift-marketplace/community-operators-4rp94" Dec 03 12:11:57 crc kubenswrapper[4702]: I1203 12:11:57.123232 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5390395-7dc6-4062-b463-76b126ed747c-utilities\") pod \"community-operators-4rp94\" (UID: \"e5390395-7dc6-4062-b463-76b126ed747c\") " pod="openshift-marketplace/community-operators-4rp94" Dec 03 12:11:57 crc kubenswrapper[4702]: I1203 12:11:57.150179 4702 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rb57b\" (UniqueName: \"kubernetes.io/projected/e5390395-7dc6-4062-b463-76b126ed747c-kube-api-access-rb57b\") pod \"community-operators-4rp94\" (UID: \"e5390395-7dc6-4062-b463-76b126ed747c\") " pod="openshift-marketplace/community-operators-4rp94" Dec 03 12:11:57 crc kubenswrapper[4702]: I1203 12:11:57.238129 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4rp94" Dec 03 12:11:57 crc kubenswrapper[4702]: I1203 12:11:57.824411 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4rp94"] Dec 03 12:11:58 crc kubenswrapper[4702]: I1203 12:11:58.824344 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rp94" event={"ID":"e5390395-7dc6-4062-b463-76b126ed747c","Type":"ContainerStarted","Data":"9bb53ad657b43750db2ed617d2aebec3ca084fa85ccf9568967ad051e100932f"} Dec 03 12:12:01 crc kubenswrapper[4702]: I1203 12:12:01.866023 4702 generic.go:334] "Generic (PLEG): container finished" podID="e5390395-7dc6-4062-b463-76b126ed747c" containerID="b6df9832c690e08abf6da47af861d8e7636673aea4d388ac7d42e6f6caf1b1cd" exitCode=0 Dec 03 12:12:01 crc kubenswrapper[4702]: I1203 12:12:01.866081 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rp94" event={"ID":"e5390395-7dc6-4062-b463-76b126ed747c","Type":"ContainerDied","Data":"b6df9832c690e08abf6da47af861d8e7636673aea4d388ac7d42e6f6caf1b1cd"} Dec 03 12:12:02 crc kubenswrapper[4702]: I1203 12:12:02.934283 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee" Dec 03 12:12:02 crc kubenswrapper[4702]: E1203 12:12:02.935170 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:12:07 crc kubenswrapper[4702]: I1203 12:12:07.946370 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rp94" event={"ID":"e5390395-7dc6-4062-b463-76b126ed747c","Type":"ContainerStarted","Data":"5e0b014c266bf14be60897f6910d8cf15dc7142d33c256dd79be2d21650e0221"} Dec 03 12:12:14 crc kubenswrapper[4702]: I1203 12:12:14.404224 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="58148f49-2721-4a0a-a5e0-38a2aa23522b" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.213:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:12:16 crc kubenswrapper[4702]: I1203 12:12:16.091160 4702 generic.go:334] "Generic (PLEG): container finished" podID="e5390395-7dc6-4062-b463-76b126ed747c" containerID="5e0b014c266bf14be60897f6910d8cf15dc7142d33c256dd79be2d21650e0221" exitCode=0 Dec 03 12:12:16 crc kubenswrapper[4702]: I1203 12:12:16.092337 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rp94" event={"ID":"e5390395-7dc6-4062-b463-76b126ed747c","Type":"ContainerDied","Data":"5e0b014c266bf14be60897f6910d8cf15dc7142d33c256dd79be2d21650e0221"} Dec 03 12:12:17 crc 
kubenswrapper[4702]: I1203 12:12:17.928995 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee" Dec 03 12:12:17 crc kubenswrapper[4702]: E1203 12:12:17.930038 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:12:20 crc kubenswrapper[4702]: I1203 12:12:20.150118 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rp94" event={"ID":"e5390395-7dc6-4062-b463-76b126ed747c","Type":"ContainerStarted","Data":"69d066fa7107fb1fd9dfee388b272948cb105eaf710b0982b944b293375dfaf8"} Dec 03 12:12:20 crc kubenswrapper[4702]: I1203 12:12:20.197528 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4rp94" podStartSLOduration=7.004319023 podStartE2EDuration="24.197465311s" podCreationTimestamp="2025-12-03 12:11:56 +0000 UTC" firstStartedPulling="2025-12-03 12:12:01.871989723 +0000 UTC m=+4105.707918237" lastFinishedPulling="2025-12-03 12:12:19.065136061 +0000 UTC m=+4122.901064525" observedRunningTime="2025-12-03 12:12:20.181371932 +0000 UTC m=+4124.017300396" watchObservedRunningTime="2025-12-03 12:12:20.197465311 +0000 UTC m=+4124.033393785" Dec 03 12:12:27 crc kubenswrapper[4702]: I1203 12:12:27.239168 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4rp94" Dec 03 12:12:27 crc kubenswrapper[4702]: I1203 12:12:27.239747 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4rp94" Dec 03 12:12:27 crc kubenswrapper[4702]: I1203 12:12:27.306184 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4rp94" Dec 03 12:12:27 crc kubenswrapper[4702]: I1203 12:12:27.364552 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4rp94" Dec 03 12:12:28 crc kubenswrapper[4702]: I1203 12:12:28.097723 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4rp94"] Dec 03 12:12:28 crc kubenswrapper[4702]: I1203 12:12:28.929696 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee" Dec 03 12:12:28 crc kubenswrapper[4702]: E1203 12:12:28.931073 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:12:29 crc kubenswrapper[4702]: I1203 12:12:29.287087 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4rp94" podUID="e5390395-7dc6-4062-b463-76b126ed747c" containerName="registry-server" 
containerID="cri-o://69d066fa7107fb1fd9dfee388b272948cb105eaf710b0982b944b293375dfaf8" gracePeriod=2 Dec 03 12:12:29 crc kubenswrapper[4702]: I1203 12:12:29.913266 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4rp94" Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.105907 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5390395-7dc6-4062-b463-76b126ed747c-utilities\") pod \"e5390395-7dc6-4062-b463-76b126ed747c\" (UID: \"e5390395-7dc6-4062-b463-76b126ed747c\") " Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.106335 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rb57b\" (UniqueName: \"kubernetes.io/projected/e5390395-7dc6-4062-b463-76b126ed747c-kube-api-access-rb57b\") pod \"e5390395-7dc6-4062-b463-76b126ed747c\" (UID: \"e5390395-7dc6-4062-b463-76b126ed747c\") " Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.106411 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5390395-7dc6-4062-b463-76b126ed747c-catalog-content\") pod \"e5390395-7dc6-4062-b463-76b126ed747c\" (UID: \"e5390395-7dc6-4062-b463-76b126ed747c\") " Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.119417 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5390395-7dc6-4062-b463-76b126ed747c-utilities" (OuterVolumeSpecName: "utilities") pod "e5390395-7dc6-4062-b463-76b126ed747c" (UID: "e5390395-7dc6-4062-b463-76b126ed747c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.120686 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5390395-7dc6-4062-b463-76b126ed747c-kube-api-access-rb57b" (OuterVolumeSpecName: "kube-api-access-rb57b") pod "e5390395-7dc6-4062-b463-76b126ed747c" (UID: "e5390395-7dc6-4062-b463-76b126ed747c"). InnerVolumeSpecName "kube-api-access-rb57b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.173525 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5390395-7dc6-4062-b463-76b126ed747c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5390395-7dc6-4062-b463-76b126ed747c" (UID: "e5390395-7dc6-4062-b463-76b126ed747c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.210579 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5390395-7dc6-4062-b463-76b126ed747c-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.210630 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rb57b\" (UniqueName: \"kubernetes.io/projected/e5390395-7dc6-4062-b463-76b126ed747c-kube-api-access-rb57b\") on node \"crc\" DevicePath \"\"" Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.210646 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5390395-7dc6-4062-b463-76b126ed747c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.307452 4702 generic.go:334] "Generic (PLEG): container finished" podID="e5390395-7dc6-4062-b463-76b126ed747c" containerID="69d066fa7107fb1fd9dfee388b272948cb105eaf710b0982b944b293375dfaf8" exitCode=0 Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.307527 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rp94" event={"ID":"e5390395-7dc6-4062-b463-76b126ed747c","Type":"ContainerDied","Data":"69d066fa7107fb1fd9dfee388b272948cb105eaf710b0982b944b293375dfaf8"} Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.307537 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4rp94" Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.307580 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4rp94" event={"ID":"e5390395-7dc6-4062-b463-76b126ed747c","Type":"ContainerDied","Data":"9bb53ad657b43750db2ed617d2aebec3ca084fa85ccf9568967ad051e100932f"} Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.307609 4702 scope.go:117] "RemoveContainer" containerID="69d066fa7107fb1fd9dfee388b272948cb105eaf710b0982b944b293375dfaf8" Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.379271 4702 scope.go:117] "RemoveContainer" containerID="5e0b014c266bf14be60897f6910d8cf15dc7142d33c256dd79be2d21650e0221" Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.394575 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4rp94"] Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.411913 4702 scope.go:117] "RemoveContainer" containerID="b6df9832c690e08abf6da47af861d8e7636673aea4d388ac7d42e6f6caf1b1cd" Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.412277 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4rp94"] Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.479685 4702 scope.go:117] "RemoveContainer" containerID="69d066fa7107fb1fd9dfee388b272948cb105eaf710b0982b944b293375dfaf8" Dec 03 12:12:30 crc kubenswrapper[4702]: E1203 12:12:30.480382 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69d066fa7107fb1fd9dfee388b272948cb105eaf710b0982b944b293375dfaf8\": container with ID starting with 69d066fa7107fb1fd9dfee388b272948cb105eaf710b0982b944b293375dfaf8 not found: ID does not exist" containerID="69d066fa7107fb1fd9dfee388b272948cb105eaf710b0982b944b293375dfaf8" Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.480467 
4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69d066fa7107fb1fd9dfee388b272948cb105eaf710b0982b944b293375dfaf8"} err="failed to get container status \"69d066fa7107fb1fd9dfee388b272948cb105eaf710b0982b944b293375dfaf8\": rpc error: code = NotFound desc = could not find container \"69d066fa7107fb1fd9dfee388b272948cb105eaf710b0982b944b293375dfaf8\": container with ID starting with 69d066fa7107fb1fd9dfee388b272948cb105eaf710b0982b944b293375dfaf8 not found: ID does not exist"
Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.480509 4702 scope.go:117] "RemoveContainer" containerID="5e0b014c266bf14be60897f6910d8cf15dc7142d33c256dd79be2d21650e0221"
Dec 03 12:12:30 crc kubenswrapper[4702]: E1203 12:12:30.481495 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e0b014c266bf14be60897f6910d8cf15dc7142d33c256dd79be2d21650e0221\": container with ID starting with 5e0b014c266bf14be60897f6910d8cf15dc7142d33c256dd79be2d21650e0221 not found: ID does not exist" containerID="5e0b014c266bf14be60897f6910d8cf15dc7142d33c256dd79be2d21650e0221"
Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.481557 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e0b014c266bf14be60897f6910d8cf15dc7142d33c256dd79be2d21650e0221"} err="failed to get container status \"5e0b014c266bf14be60897f6910d8cf15dc7142d33c256dd79be2d21650e0221\": rpc error: code = NotFound desc = could not find container \"5e0b014c266bf14be60897f6910d8cf15dc7142d33c256dd79be2d21650e0221\": container with ID starting with 5e0b014c266bf14be60897f6910d8cf15dc7142d33c256dd79be2d21650e0221 not found: ID does not exist"
Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.481592 4702 scope.go:117] "RemoveContainer" containerID="b6df9832c690e08abf6da47af861d8e7636673aea4d388ac7d42e6f6caf1b1cd"
Dec 03 12:12:30 crc kubenswrapper[4702]: E1203 12:12:30.482437 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6df9832c690e08abf6da47af861d8e7636673aea4d388ac7d42e6f6caf1b1cd\": container with ID starting with b6df9832c690e08abf6da47af861d8e7636673aea4d388ac7d42e6f6caf1b1cd not found: ID does not exist" containerID="b6df9832c690e08abf6da47af861d8e7636673aea4d388ac7d42e6f6caf1b1cd"
Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.482476 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6df9832c690e08abf6da47af861d8e7636673aea4d388ac7d42e6f6caf1b1cd"} err="failed to get container status \"b6df9832c690e08abf6da47af861d8e7636673aea4d388ac7d42e6f6caf1b1cd\": rpc error: code = NotFound desc = could not find container \"b6df9832c690e08abf6da47af861d8e7636673aea4d388ac7d42e6f6caf1b1cd\": container with ID starting with b6df9832c690e08abf6da47af861d8e7636673aea4d388ac7d42e6f6caf1b1cd not found: ID does not exist"
Dec 03 12:12:30 crc kubenswrapper[4702]: I1203 12:12:30.947109 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5390395-7dc6-4062-b463-76b126ed747c" path="/var/lib/kubelet/pods/e5390395-7dc6-4062-b463-76b126ed747c/volumes"
Dec 03 12:12:41 crc kubenswrapper[4702]: I1203 12:12:41.266054 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee"
Dec 03 12:12:41 crc kubenswrapper[4702]: E1203 12:12:41.267014 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
Dec 03 12:12:53 crc kubenswrapper[4702]: I1203 12:12:53.928657 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee"
Dec 03 12:12:53 crc kubenswrapper[4702]: E1203 12:12:53.929690 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
Dec 03 12:13:08 crc kubenswrapper[4702]: I1203 12:13:08.928716 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee"
Dec 03 12:13:08 crc kubenswrapper[4702]: E1203 12:13:08.929830 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
Dec 03 12:13:22 crc kubenswrapper[4702]: I1203 12:13:22.928100 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee"
Dec 03 12:13:22 crc kubenswrapper[4702]: E1203 12:13:22.929069 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
Dec 03 12:13:34 crc kubenswrapper[4702]: I1203 12:13:34.928629 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee"
Dec 03 12:13:36 crc kubenswrapper[4702]: I1203 12:13:36.249915 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"39f744a96b947ad71dac3cd2fa7cfeca49ebfdcba6909ec102d28a09fbaa619a"}
Dec 03 12:14:17 crc kubenswrapper[4702]: I1203 12:14:17.472145 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-74nlg"]
Dec 03 12:14:17 crc kubenswrapper[4702]: E1203 12:14:17.475877 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5390395-7dc6-4062-b463-76b126ed747c" containerName="extract-content"
Dec 03 12:14:17 crc kubenswrapper[4702]: I1203 12:14:17.475915 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5390395-7dc6-4062-b463-76b126ed747c" containerName="extract-content"
Dec 03 12:14:17 crc kubenswrapper[4702]: E1203 12:14:17.475955 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5390395-7dc6-4062-b463-76b126ed747c" containerName="registry-server"
Dec 03 12:14:17 crc kubenswrapper[4702]: I1203 12:14:17.475966 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5390395-7dc6-4062-b463-76b126ed747c" containerName="registry-server"
Dec 03 12:14:17 crc kubenswrapper[4702]: E1203 12:14:17.475998 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5390395-7dc6-4062-b463-76b126ed747c" containerName="extract-utilities"
Dec 03 12:14:17 crc kubenswrapper[4702]: I1203 12:14:17.476007 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5390395-7dc6-4062-b463-76b126ed747c" containerName="extract-utilities"
Dec 03 12:14:17 crc kubenswrapper[4702]: I1203 12:14:17.476435 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5390395-7dc6-4062-b463-76b126ed747c" containerName="registry-server"
Dec 03 12:14:17 crc kubenswrapper[4702]: I1203 12:14:17.480068 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-74nlg"
Dec 03 12:14:17 crc kubenswrapper[4702]: I1203 12:14:17.496181 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-74nlg"]
Dec 03 12:14:17 crc kubenswrapper[4702]: I1203 12:14:17.606433 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cb07724-8509-4fe4-a29c-0f69a29e66d5-utilities\") pod \"certified-operators-74nlg\" (UID: \"8cb07724-8509-4fe4-a29c-0f69a29e66d5\") " pod="openshift-marketplace/certified-operators-74nlg"
Dec 03 12:14:17 crc kubenswrapper[4702]: I1203 12:14:17.607450 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cb07724-8509-4fe4-a29c-0f69a29e66d5-catalog-content\") pod \"certified-operators-74nlg\" (UID: \"8cb07724-8509-4fe4-a29c-0f69a29e66d5\") " pod="openshift-marketplace/certified-operators-74nlg"
Dec 03 12:14:17 crc kubenswrapper[4702]: I1203 12:14:17.608317 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78tt5\" (UniqueName: \"kubernetes.io/projected/8cb07724-8509-4fe4-a29c-0f69a29e66d5-kube-api-access-78tt5\") pod \"certified-operators-74nlg\" (UID: \"8cb07724-8509-4fe4-a29c-0f69a29e66d5\") " pod="openshift-marketplace/certified-operators-74nlg"
Dec 03 12:14:17 crc kubenswrapper[4702]: I1203 12:14:17.748326 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cb07724-8509-4fe4-a29c-0f69a29e66d5-catalog-content\") pod \"certified-operators-74nlg\" (UID: \"8cb07724-8509-4fe4-a29c-0f69a29e66d5\") " pod="openshift-marketplace/certified-operators-74nlg"
Dec 03 12:14:17 crc kubenswrapper[4702]: I1203 12:14:17.748730 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78tt5\" (UniqueName: \"kubernetes.io/projected/8cb07724-8509-4fe4-a29c-0f69a29e66d5-kube-api-access-78tt5\") pod \"certified-operators-74nlg\" (UID: \"8cb07724-8509-4fe4-a29c-0f69a29e66d5\") " pod="openshift-marketplace/certified-operators-74nlg"
Dec 03 12:14:17 crc kubenswrapper[4702]: I1203 12:14:17.748791 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cb07724-8509-4fe4-a29c-0f69a29e66d5-utilities\") pod \"certified-operators-74nlg\" (UID: \"8cb07724-8509-4fe4-a29c-0f69a29e66d5\") " pod="openshift-marketplace/certified-operators-74nlg"
Dec 03 12:14:17 crc kubenswrapper[4702]: I1203 12:14:17.749110 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cb07724-8509-4fe4-a29c-0f69a29e66d5-catalog-content\") pod \"certified-operators-74nlg\" (UID: \"8cb07724-8509-4fe4-a29c-0f69a29e66d5\") " pod="openshift-marketplace/certified-operators-74nlg"
Dec 03 12:14:17 crc kubenswrapper[4702]: I1203 12:14:17.749353 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cb07724-8509-4fe4-a29c-0f69a29e66d5-utilities\") pod \"certified-operators-74nlg\" (UID: \"8cb07724-8509-4fe4-a29c-0f69a29e66d5\") " pod="openshift-marketplace/certified-operators-74nlg"
Dec 03 12:14:17 crc kubenswrapper[4702]: I1203 12:14:17.787138 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78tt5\" (UniqueName: \"kubernetes.io/projected/8cb07724-8509-4fe4-a29c-0f69a29e66d5-kube-api-access-78tt5\") pod \"certified-operators-74nlg\" (UID: \"8cb07724-8509-4fe4-a29c-0f69a29e66d5\") " pod="openshift-marketplace/certified-operators-74nlg"
Dec 03 12:14:17 crc kubenswrapper[4702]: I1203 12:14:17.823303 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-74nlg"
Dec 03 12:14:18 crc kubenswrapper[4702]: I1203 12:14:18.445923 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-74nlg"]
Dec 03 12:14:19 crc kubenswrapper[4702]: I1203 12:14:19.299048 4702 generic.go:334] "Generic (PLEG): container finished" podID="8cb07724-8509-4fe4-a29c-0f69a29e66d5" containerID="3cd01bb369bbab318c44d3a9dae5ac3078443cd3b7122c2a4cddf2789105060b" exitCode=0
Dec 03 12:14:19 crc kubenswrapper[4702]: I1203 12:14:19.299154 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74nlg" event={"ID":"8cb07724-8509-4fe4-a29c-0f69a29e66d5","Type":"ContainerDied","Data":"3cd01bb369bbab318c44d3a9dae5ac3078443cd3b7122c2a4cddf2789105060b"}
Dec 03 12:14:19 crc kubenswrapper[4702]: I1203 12:14:19.299618 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74nlg" event={"ID":"8cb07724-8509-4fe4-a29c-0f69a29e66d5","Type":"ContainerStarted","Data":"f7834de58a57969b745d5fe6ff192e07c7c26fb47aded6936ecdf6ed5a3fd1ad"}
Dec 03 12:14:19 crc kubenswrapper[4702]: I1203 12:14:19.302156 4702 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 03 12:14:21 crc kubenswrapper[4702]: I1203 12:14:21.326558 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74nlg" event={"ID":"8cb07724-8509-4fe4-a29c-0f69a29e66d5","Type":"ContainerStarted","Data":"ace8d3acaae40219cce669ac5dc5493b0648bc4491ce6eb0c69941be3d45cc6a"}
Dec 03 12:14:22 crc kubenswrapper[4702]: I1203 12:14:22.527754 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="92266ac3-f0a6-4e68-9e88-9aa2900e1fe3" containerName="galera" probeResult="failure" output="command timed out"
Dec 03 12:14:22 crc kubenswrapper[4702]: I1203 12:14:22.527789 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="92266ac3-f0a6-4e68-9e88-9aa2900e1fe3" containerName="galera" probeResult="failure" output="command timed out"
Dec 03 12:14:23 crc kubenswrapper[4702]: I1203 12:14:23.164053 4702 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.295910431s: [/var/lib/containers/storage/overlay/59c6e53ae61f6f66409200f88bf3d934482c7c7769b14ea1046aa3475d997b3d/diff /var/log/pods/openstack_nova-api-0_7455976c-e312-4b2a-963f-6e75d428c41c/nova-api-api/0.log]; will not log again for this container unless duration exceeds 2s
Dec 03 12:14:25 crc kubenswrapper[4702]: I1203 12:14:25.399693 4702 generic.go:334] "Generic (PLEG): container finished" podID="8cb07724-8509-4fe4-a29c-0f69a29e66d5" containerID="ace8d3acaae40219cce669ac5dc5493b0648bc4491ce6eb0c69941be3d45cc6a" exitCode=0
Dec 03 12:14:25 crc kubenswrapper[4702]: I1203 12:14:25.399750 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74nlg" event={"ID":"8cb07724-8509-4fe4-a29c-0f69a29e66d5","Type":"ContainerDied","Data":"ace8d3acaae40219cce669ac5dc5493b0648bc4491ce6eb0c69941be3d45cc6a"}
Dec 03 12:14:27 crc kubenswrapper[4702]: I1203 12:14:27.427358 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74nlg" event={"ID":"8cb07724-8509-4fe4-a29c-0f69a29e66d5","Type":"ContainerStarted","Data":"170a31f1112a698330646123bee8045493cd5f3831af5b03c11ef710c510ec9c"}
Dec 03 12:14:27 crc kubenswrapper[4702]: I1203 12:14:27.465288 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-74nlg" podStartSLOduration=3.289089768 podStartE2EDuration="10.465262572s" podCreationTimestamp="2025-12-03 12:14:17 +0000 UTC" firstStartedPulling="2025-12-03 12:14:19.301682661 +0000 UTC m=+4243.137611125" lastFinishedPulling="2025-12-03 12:14:26.477855465 +0000 UTC m=+4250.313783929" observedRunningTime="2025-12-03 12:14:27.452512978 +0000 UTC m=+4251.288441442" watchObservedRunningTime="2025-12-03 12:14:27.465262572 +0000 UTC m=+4251.301191056"
Dec 03 12:14:27 crc kubenswrapper[4702]: I1203 12:14:27.823651 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-74nlg"
Dec 03 12:14:27 crc kubenswrapper[4702]: I1203 12:14:27.823713 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-74nlg"
Dec 03 12:14:28 crc kubenswrapper[4702]: I1203 12:14:28.874845 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-74nlg" podUID="8cb07724-8509-4fe4-a29c-0f69a29e66d5" containerName="registry-server" probeResult="failure" output=<
Dec 03 12:14:28 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s
Dec 03 12:14:28 crc kubenswrapper[4702]: >
Dec 03 12:14:37 crc kubenswrapper[4702]: I1203 12:14:37.882594 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-74nlg"
Dec 03 12:14:37 crc kubenswrapper[4702]: I1203 12:14:37.945578 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-74nlg"
Dec 03 12:14:38 crc kubenswrapper[4702]: I1203 12:14:38.125533 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-74nlg"]
Dec 03 12:14:39 crc kubenswrapper[4702]: I1203 12:14:39.566905 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-74nlg" podUID="8cb07724-8509-4fe4-a29c-0f69a29e66d5" containerName="registry-server" containerID="cri-o://170a31f1112a698330646123bee8045493cd5f3831af5b03c11ef710c510ec9c" gracePeriod=2
Dec 03 12:14:39 crc kubenswrapper[4702]: E1203 12:14:39.727664 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cb07724_8509_4fe4_a29c_0f69a29e66d5.slice/crio-170a31f1112a698330646123bee8045493cd5f3831af5b03c11ef710c510ec9c.scope\": RecentStats: unable to find data in memory cache]"
Dec 03 12:14:40 crc kubenswrapper[4702]: I1203 12:14:40.615416 4702 generic.go:334] "Generic (PLEG): container finished" podID="8cb07724-8509-4fe4-a29c-0f69a29e66d5" containerID="170a31f1112a698330646123bee8045493cd5f3831af5b03c11ef710c510ec9c" exitCode=0
Dec 03 12:14:40 crc kubenswrapper[4702]: I1203 12:14:40.615789 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74nlg" event={"ID":"8cb07724-8509-4fe4-a29c-0f69a29e66d5","Type":"ContainerDied","Data":"170a31f1112a698330646123bee8045493cd5f3831af5b03c11ef710c510ec9c"}
Dec 03 12:14:40 crc kubenswrapper[4702]: I1203 12:14:40.756225 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-74nlg"
Dec 03 12:14:40 crc kubenswrapper[4702]: I1203 12:14:40.906422 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cb07724-8509-4fe4-a29c-0f69a29e66d5-utilities\") pod \"8cb07724-8509-4fe4-a29c-0f69a29e66d5\" (UID: \"8cb07724-8509-4fe4-a29c-0f69a29e66d5\") "
Dec 03 12:14:40 crc kubenswrapper[4702]: I1203 12:14:40.907072 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cb07724-8509-4fe4-a29c-0f69a29e66d5-catalog-content\") pod \"8cb07724-8509-4fe4-a29c-0f69a29e66d5\" (UID: \"8cb07724-8509-4fe4-a29c-0f69a29e66d5\") "
Dec 03 12:14:40 crc kubenswrapper[4702]: I1203 12:14:40.907122 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78tt5\" (UniqueName: \"kubernetes.io/projected/8cb07724-8509-4fe4-a29c-0f69a29e66d5-kube-api-access-78tt5\") pod \"8cb07724-8509-4fe4-a29c-0f69a29e66d5\" (UID: \"8cb07724-8509-4fe4-a29c-0f69a29e66d5\") "
Dec 03 12:14:40 crc kubenswrapper[4702]: I1203 12:14:40.907538 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cb07724-8509-4fe4-a29c-0f69a29e66d5-utilities" (OuterVolumeSpecName: "utilities") pod "8cb07724-8509-4fe4-a29c-0f69a29e66d5" (UID: "8cb07724-8509-4fe4-a29c-0f69a29e66d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:14:40 crc kubenswrapper[4702]: I1203 12:14:40.908151 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8cb07724-8509-4fe4-a29c-0f69a29e66d5-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 12:14:40 crc kubenswrapper[4702]: I1203 12:14:40.915173 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cb07724-8509-4fe4-a29c-0f69a29e66d5-kube-api-access-78tt5" (OuterVolumeSpecName: "kube-api-access-78tt5") pod "8cb07724-8509-4fe4-a29c-0f69a29e66d5" (UID: "8cb07724-8509-4fe4-a29c-0f69a29e66d5"). InnerVolumeSpecName "kube-api-access-78tt5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:14:40 crc kubenswrapper[4702]: I1203 12:14:40.967657 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cb07724-8509-4fe4-a29c-0f69a29e66d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8cb07724-8509-4fe4-a29c-0f69a29e66d5" (UID: "8cb07724-8509-4fe4-a29c-0f69a29e66d5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:14:41 crc kubenswrapper[4702]: I1203 12:14:41.010968 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8cb07724-8509-4fe4-a29c-0f69a29e66d5-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 12:14:41 crc kubenswrapper[4702]: I1203 12:14:41.011012 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78tt5\" (UniqueName: \"kubernetes.io/projected/8cb07724-8509-4fe4-a29c-0f69a29e66d5-kube-api-access-78tt5\") on node \"crc\" DevicePath \"\""
Dec 03 12:14:41 crc kubenswrapper[4702]: I1203 12:14:41.639318 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-74nlg" event={"ID":"8cb07724-8509-4fe4-a29c-0f69a29e66d5","Type":"ContainerDied","Data":"f7834de58a57969b745d5fe6ff192e07c7c26fb47aded6936ecdf6ed5a3fd1ad"}
Dec 03 12:14:41 crc kubenswrapper[4702]: I1203 12:14:41.639397 4702 scope.go:117] "RemoveContainer" containerID="170a31f1112a698330646123bee8045493cd5f3831af5b03c11ef710c510ec9c"
Dec 03 12:14:41 crc kubenswrapper[4702]: I1203 12:14:41.639627 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-74nlg"
Dec 03 12:14:41 crc kubenswrapper[4702]: I1203 12:14:41.693792 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-74nlg"]
Dec 03 12:14:41 crc kubenswrapper[4702]: I1203 12:14:41.697393 4702 scope.go:117] "RemoveContainer" containerID="ace8d3acaae40219cce669ac5dc5493b0648bc4491ce6eb0c69941be3d45cc6a"
Dec 03 12:14:41 crc kubenswrapper[4702]: I1203 12:14:41.723582 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-74nlg"]
Dec 03 12:14:41 crc kubenswrapper[4702]: I1203 12:14:41.729507 4702 scope.go:117] "RemoveContainer" containerID="3cd01bb369bbab318c44d3a9dae5ac3078443cd3b7122c2a4cddf2789105060b"
Dec 03 12:14:42 crc kubenswrapper[4702]: I1203 12:14:42.947282 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cb07724-8509-4fe4-a29c-0f69a29e66d5" path="/var/lib/kubelet/pods/8cb07724-8509-4fe4-a29c-0f69a29e66d5/volumes"
Dec 03 12:15:00 crc kubenswrapper[4702]: I1203 12:15:00.175489 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412735-pz8ng"]
Dec 03 12:15:00 crc kubenswrapper[4702]: E1203 12:15:00.176530 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cb07724-8509-4fe4-a29c-0f69a29e66d5" containerName="registry-server"
Dec 03 12:15:00 crc kubenswrapper[4702]: I1203 12:15:00.176546 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cb07724-8509-4fe4-a29c-0f69a29e66d5" containerName="registry-server"
Dec 03 12:15:00 crc kubenswrapper[4702]: E1203 12:15:00.176601 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cb07724-8509-4fe4-a29c-0f69a29e66d5" containerName="extract-content"
Dec 03 12:15:00 crc kubenswrapper[4702]: I1203 12:15:00.176611 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cb07724-8509-4fe4-a29c-0f69a29e66d5" containerName="extract-content"
Dec 03 12:15:00 crc kubenswrapper[4702]: E1203 12:15:00.176661 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cb07724-8509-4fe4-a29c-0f69a29e66d5" containerName="extract-utilities"
Dec 03 12:15:00 crc kubenswrapper[4702]: I1203 12:15:00.176670 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cb07724-8509-4fe4-a29c-0f69a29e66d5" containerName="extract-utilities"
Dec 03 12:15:00 crc kubenswrapper[4702]: I1203 12:15:00.176975 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cb07724-8509-4fe4-a29c-0f69a29e66d5" containerName="registry-server"
Dec 03 12:15:00 crc kubenswrapper[4702]: I1203 12:15:00.178263 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-pz8ng"
Dec 03 12:15:00 crc kubenswrapper[4702]: I1203 12:15:00.180720 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 03 12:15:00 crc kubenswrapper[4702]: I1203 12:15:00.193130 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412735-pz8ng"]
Dec 03 12:15:00 crc kubenswrapper[4702]: I1203 12:15:00.196700 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 03 12:15:00 crc kubenswrapper[4702]: I1203 12:15:00.296355 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m95pv\" (UniqueName: \"kubernetes.io/projected/0903bec9-6564-4134-b007-3e46b7a7b95b-kube-api-access-m95pv\") pod \"collect-profiles-29412735-pz8ng\" (UID: \"0903bec9-6564-4134-b007-3e46b7a7b95b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-pz8ng"
Dec 03 12:15:00 crc kubenswrapper[4702]: I1203 12:15:00.296492 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0903bec9-6564-4134-b007-3e46b7a7b95b-config-volume\") pod \"collect-profiles-29412735-pz8ng\" (UID: \"0903bec9-6564-4134-b007-3e46b7a7b95b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-pz8ng"
Dec 03 12:15:00 crc kubenswrapper[4702]: I1203 12:15:00.296546 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0903bec9-6564-4134-b007-3e46b7a7b95b-secret-volume\") pod \"collect-profiles-29412735-pz8ng\" (UID: \"0903bec9-6564-4134-b007-3e46b7a7b95b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-pz8ng"
Dec 03 12:15:00 crc kubenswrapper[4702]: I1203 12:15:00.398417 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m95pv\" (UniqueName: \"kubernetes.io/projected/0903bec9-6564-4134-b007-3e46b7a7b95b-kube-api-access-m95pv\") pod \"collect-profiles-29412735-pz8ng\" (UID: \"0903bec9-6564-4134-b007-3e46b7a7b95b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-pz8ng"
Dec 03 12:15:00 crc kubenswrapper[4702]: I1203 12:15:00.398795 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0903bec9-6564-4134-b007-3e46b7a7b95b-config-volume\") pod \"collect-profiles-29412735-pz8ng\" (UID: \"0903bec9-6564-4134-b007-3e46b7a7b95b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-pz8ng"
Dec 03 12:15:00 crc kubenswrapper[4702]: I1203 12:15:00.398960 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0903bec9-6564-4134-b007-3e46b7a7b95b-secret-volume\") pod \"collect-profiles-29412735-pz8ng\" (UID: \"0903bec9-6564-4134-b007-3e46b7a7b95b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-pz8ng"
Dec 03 12:15:00 crc kubenswrapper[4702]: I1203 12:15:00.399686 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0903bec9-6564-4134-b007-3e46b7a7b95b-config-volume\") pod \"collect-profiles-29412735-pz8ng\" (UID: \"0903bec9-6564-4134-b007-3e46b7a7b95b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-pz8ng"
Dec 03 12:15:00 crc kubenswrapper[4702]: I1203 12:15:00.627224 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0903bec9-6564-4134-b007-3e46b7a7b95b-secret-volume\") pod \"collect-profiles-29412735-pz8ng\" (UID: \"0903bec9-6564-4134-b007-3e46b7a7b95b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-pz8ng"
Dec 03 12:15:00 crc kubenswrapper[4702]: I1203 12:15:00.634899 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m95pv\" (UniqueName: \"kubernetes.io/projected/0903bec9-6564-4134-b007-3e46b7a7b95b-kube-api-access-m95pv\") pod \"collect-profiles-29412735-pz8ng\" (UID: \"0903bec9-6564-4134-b007-3e46b7a7b95b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-pz8ng"
Dec 03 12:15:00 crc kubenswrapper[4702]: I1203 12:15:00.815725 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-pz8ng"
Dec 03 12:15:01 crc kubenswrapper[4702]: I1203 12:15:01.365377 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412735-pz8ng"]
Dec 03 12:15:01 crc kubenswrapper[4702]: I1203 12:15:01.910776 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-pz8ng" event={"ID":"0903bec9-6564-4134-b007-3e46b7a7b95b","Type":"ContainerStarted","Data":"0916bd149dcd95ada98894c725c372669eeeab9daccce53c79467d3bc4eb9b69"}
Dec 03 12:15:01 crc kubenswrapper[4702]: I1203 12:15:01.911109 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-pz8ng" event={"ID":"0903bec9-6564-4134-b007-3e46b7a7b95b","Type":"ContainerStarted","Data":"cedeeeac30e531b742d2bc5cc0c94e05627919dd77f4d93f0eb4edca35912b83"}
Dec 03 12:15:01 crc kubenswrapper[4702]: I1203 12:15:01.944000 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-pz8ng" podStartSLOduration=1.943976845 podStartE2EDuration="1.943976845s" podCreationTimestamp="2025-12-03 12:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 12:15:01.929209324 +0000 UTC m=+4285.765137788" watchObservedRunningTime="2025-12-03 12:15:01.943976845 +0000 UTC m=+4285.779905329"
Dec 03 12:15:02 crc kubenswrapper[4702]: I1203 12:15:02.926748 4702 generic.go:334] "Generic (PLEG): container finished" podID="0903bec9-6564-4134-b007-3e46b7a7b95b" containerID="0916bd149dcd95ada98894c725c372669eeeab9daccce53c79467d3bc4eb9b69" exitCode=0
Dec 03 12:15:02 crc kubenswrapper[4702]: I1203 12:15:02.927064 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-pz8ng" event={"ID":"0903bec9-6564-4134-b007-3e46b7a7b95b","Type":"ContainerDied","Data":"0916bd149dcd95ada98894c725c372669eeeab9daccce53c79467d3bc4eb9b69"}
Dec 03 12:15:04 crc kubenswrapper[4702]: I1203 12:15:04.414421 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-pz8ng"
Dec 03 12:15:04 crc kubenswrapper[4702]: I1203 12:15:04.529328 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0903bec9-6564-4134-b007-3e46b7a7b95b-secret-volume\") pod \"0903bec9-6564-4134-b007-3e46b7a7b95b\" (UID: \"0903bec9-6564-4134-b007-3e46b7a7b95b\") "
Dec 03 12:15:04 crc kubenswrapper[4702]: I1203 12:15:04.529492 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m95pv\" (UniqueName: \"kubernetes.io/projected/0903bec9-6564-4134-b007-3e46b7a7b95b-kube-api-access-m95pv\") pod \"0903bec9-6564-4134-b007-3e46b7a7b95b\" (UID: \"0903bec9-6564-4134-b007-3e46b7a7b95b\") "
Dec 03 12:15:04 crc kubenswrapper[4702]: I1203 12:15:04.529518 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0903bec9-6564-4134-b007-3e46b7a7b95b-config-volume\") pod \"0903bec9-6564-4134-b007-3e46b7a7b95b\" (UID: \"0903bec9-6564-4134-b007-3e46b7a7b95b\") "
Dec 03 12:15:04 crc kubenswrapper[4702]: I1203 12:15:04.530283 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0903bec9-6564-4134-b007-3e46b7a7b95b-config-volume" (OuterVolumeSpecName: "config-volume") pod "0903bec9-6564-4134-b007-3e46b7a7b95b" (UID: "0903bec9-6564-4134-b007-3e46b7a7b95b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 12:15:04 crc kubenswrapper[4702]: I1203 12:15:04.530713 4702 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0903bec9-6564-4134-b007-3e46b7a7b95b-config-volume\") on node \"crc\" DevicePath \"\""
Dec 03 12:15:04 crc kubenswrapper[4702]: I1203 12:15:04.535946 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0903bec9-6564-4134-b007-3e46b7a7b95b-kube-api-access-m95pv" (OuterVolumeSpecName: "kube-api-access-m95pv") pod "0903bec9-6564-4134-b007-3e46b7a7b95b" (UID: "0903bec9-6564-4134-b007-3e46b7a7b95b"). InnerVolumeSpecName "kube-api-access-m95pv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:15:04 crc kubenswrapper[4702]: I1203 12:15:04.541199 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0903bec9-6564-4134-b007-3e46b7a7b95b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0903bec9-6564-4134-b007-3e46b7a7b95b" (UID: "0903bec9-6564-4134-b007-3e46b7a7b95b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 12:15:04 crc kubenswrapper[4702]: I1203 12:15:04.632976 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m95pv\" (UniqueName: \"kubernetes.io/projected/0903bec9-6564-4134-b007-3e46b7a7b95b-kube-api-access-m95pv\") on node \"crc\" DevicePath \"\""
Dec 03 12:15:04 crc kubenswrapper[4702]: I1203 12:15:04.633017 4702 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0903bec9-6564-4134-b007-3e46b7a7b95b-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 03 12:15:04 crc kubenswrapper[4702]: I1203 12:15:04.953363 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-pz8ng" event={"ID":"0903bec9-6564-4134-b007-3e46b7a7b95b","Type":"ContainerDied","Data":"cedeeeac30e531b742d2bc5cc0c94e05627919dd77f4d93f0eb4edca35912b83"}
Dec 03 12:15:04 crc kubenswrapper[4702]: I1203 12:15:04.953409 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cedeeeac30e531b742d2bc5cc0c94e05627919dd77f4d93f0eb4edca35912b83"
Dec 03 12:15:04 crc kubenswrapper[4702]: I1203 12:15:04.953467 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412735-pz8ng"
Dec 03 12:15:05 crc kubenswrapper[4702]: I1203 12:15:05.506349 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412690-t4nmp"]
Dec 03 12:15:05 crc kubenswrapper[4702]: I1203 12:15:05.517561 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412690-t4nmp"]
Dec 03 12:15:06 crc kubenswrapper[4702]: I1203 12:15:06.950100 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8522bf51-2bbc-49dd-b7f8-56abb9ee6cae" path="/var/lib/kubelet/pods/8522bf51-2bbc-49dd-b7f8-56abb9ee6cae/volumes"
Dec 03 12:15:29 crc kubenswrapper[4702]: I1203 12:15:29.465514 4702 scope.go:117] "RemoveContainer" containerID="fb4d9241855e57eff87ccc34d7c9e3440d1e9efbf264701eece7a2b3b76f01d1"
Dec 03 12:15:35 crc kubenswrapper[4702]: E1203 12:15:35.414611 4702 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: , extraDiskErr: could not stat "/var/log/pods/openstack_memcached-0_8ea851b4-124d-4472-9fd0-7b584da44ecc/memcached/0.log" to get inode usage: stat /var/log/pods/openstack_memcached-0_8ea851b4-124d-4472-9fd0-7b584da44ecc/memcached/0.log: no such file or directory
Dec 03 12:15:55 crc kubenswrapper[4702]: I1203 12:15:55.908035 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 12:15:55 crc kubenswrapper[4702]: I1203 12:15:55.908674 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 12:16:25 crc kubenswrapper[4702]: I1203 12:16:25.907877 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 12:16:25 crc kubenswrapper[4702]: I1203 12:16:25.908489 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 12:16:55 crc kubenswrapper[4702]: I1203 12:16:55.907960 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 12:16:55 crc kubenswrapper[4702]: I1203 12:16:55.908521 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 12:16:55 crc kubenswrapper[4702]: I1203 12:16:55.908594 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd"
Dec 03 12:16:55 crc kubenswrapper[4702]: I1203 12:16:55.909808 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39f744a96b947ad71dac3cd2fa7cfeca49ebfdcba6909ec102d28a09fbaa619a"} pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 03 12:16:55 crc kubenswrapper[4702]: I1203 12:16:55.909890 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" containerID="cri-o://39f744a96b947ad71dac3cd2fa7cfeca49ebfdcba6909ec102d28a09fbaa619a" gracePeriod=600
Dec 03 12:16:56 crc kubenswrapper[4702]: I1203 12:16:56.720585 4702 generic.go:334] "Generic (PLEG): container finished" podID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerID="39f744a96b947ad71dac3cd2fa7cfeca49ebfdcba6909ec102d28a09fbaa619a" exitCode=0
Dec 03 12:16:56 crc kubenswrapper[4702]: I1203 12:16:56.720644 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerDied","Data":"39f744a96b947ad71dac3cd2fa7cfeca49ebfdcba6909ec102d28a09fbaa619a"}
Dec 03 12:16:56 crc kubenswrapper[4702]: I1203 12:16:56.721068 4702 scope.go:117] "RemoveContainer" containerID="e012e3830197daf61f8f99e3a362e07ea730d6642de61b7f7eafd34e7f7d41ee"
Dec 03 12:16:57 crc kubenswrapper[4702]: I1203 12:16:57.742688 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6"}
Dec 03 12:16:58 crc kubenswrapper[4702]: E1203 12:16:58.261173 4702 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.176:44516->38.102.83.176:40897: write tcp 38.102.83.176:44516->38.102.83.176:40897: write: broken pipe
Dec 03 12:18:36 crc kubenswrapper[4702]: I1203 12:18:36.600555 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8lwjp"]
Dec 03 12:18:36 crc kubenswrapper[4702]: E1203 12:18:36.601913 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0903bec9-6564-4134-b007-3e46b7a7b95b" containerName="collect-profiles"
Dec 03 12:18:36 crc kubenswrapper[4702]: I1203 12:18:36.601935 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="0903bec9-6564-4134-b007-3e46b7a7b95b" containerName="collect-profiles"
Dec 03 12:18:36 crc kubenswrapper[4702]: I1203 12:18:36.602376 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="0903bec9-6564-4134-b007-3e46b7a7b95b" containerName="collect-profiles"
Dec 03 12:18:36 crc kubenswrapper[4702]: I1203 12:18:36.613305 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8lwjp"
Dec 03 12:18:36 crc kubenswrapper[4702]: I1203 12:18:36.633777 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8lwjp"]
Dec 03 12:18:36 crc kubenswrapper[4702]: I1203 12:18:36.714599 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnmj8\" (UniqueName: \"kubernetes.io/projected/2631b0b0-32bb-412a-9899-8b2e1325df36-kube-api-access-bnmj8\") pod \"redhat-operators-8lwjp\" (UID: \"2631b0b0-32bb-412a-9899-8b2e1325df36\") " pod="openshift-marketplace/redhat-operators-8lwjp"
Dec 03 12:18:36 crc kubenswrapper[4702]: I1203 12:18:36.714812 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2631b0b0-32bb-412a-9899-8b2e1325df36-catalog-content\") pod \"redhat-operators-8lwjp\" (UID: \"2631b0b0-32bb-412a-9899-8b2e1325df36\") " pod="openshift-marketplace/redhat-operators-8lwjp"
Dec 03 12:18:36 crc kubenswrapper[4702]: I1203 12:18:36.715334 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2631b0b0-32bb-412a-9899-8b2e1325df36-utilities\") pod \"redhat-operators-8lwjp\" (UID: \"2631b0b0-32bb-412a-9899-8b2e1325df36\") " pod="openshift-marketplace/redhat-operators-8lwjp"
Dec 03 12:18:36 crc kubenswrapper[4702]: I1203 12:18:36.817511 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2631b0b0-32bb-412a-9899-8b2e1325df36-utilities\") pod \"redhat-operators-8lwjp\" (UID: \"2631b0b0-32bb-412a-9899-8b2e1325df36\") " pod="openshift-marketplace/redhat-operators-8lwjp"
Dec 03 12:18:36 crc kubenswrapper[4702]: I1203 12:18:36.817649 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnmj8\" (UniqueName: \"kubernetes.io/projected/2631b0b0-32bb-412a-9899-8b2e1325df36-kube-api-access-bnmj8\") pod \"redhat-operators-8lwjp\" (UID: \"2631b0b0-32bb-412a-9899-8b2e1325df36\") " pod="openshift-marketplace/redhat-operators-8lwjp"
Dec 03 12:18:36 crc kubenswrapper[4702]: I1203 12:18:36.817780 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2631b0b0-32bb-412a-9899-8b2e1325df36-catalog-content\") pod \"redhat-operators-8lwjp\" (UID: \"2631b0b0-32bb-412a-9899-8b2e1325df36\") " pod="openshift-marketplace/redhat-operators-8lwjp"
Dec 03 12:18:36 crc kubenswrapper[4702]: I1203 12:18:36.818209 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2631b0b0-32bb-412a-9899-8b2e1325df36-utilities\") pod \"redhat-operators-8lwjp\" (UID: \"2631b0b0-32bb-412a-9899-8b2e1325df36\") " pod="openshift-marketplace/redhat-operators-8lwjp"
Dec 03 12:18:36 crc kubenswrapper[4702]: I1203 12:18:36.819845 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2631b0b0-32bb-412a-9899-8b2e1325df36-catalog-content\") pod \"redhat-operators-8lwjp\" (UID: \"2631b0b0-32bb-412a-9899-8b2e1325df36\") " pod="openshift-marketplace/redhat-operators-8lwjp"
Dec 03 12:18:36 crc kubenswrapper[4702]: I1203 12:18:36.843705 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnmj8\" (UniqueName: \"kubernetes.io/projected/2631b0b0-32bb-412a-9899-8b2e1325df36-kube-api-access-bnmj8\") pod \"redhat-operators-8lwjp\" (UID: \"2631b0b0-32bb-412a-9899-8b2e1325df36\") " pod="openshift-marketplace/redhat-operators-8lwjp"
Dec 03 12:18:36 crc kubenswrapper[4702]: I1203 12:18:36.954484 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8lwjp"
Dec 03 12:18:37 crc kubenswrapper[4702]: I1203 12:18:37.491883 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8lwjp"]
Dec 03 12:18:38 crc kubenswrapper[4702]: I1203 12:18:38.199540 4702 generic.go:334] "Generic (PLEG): container finished" podID="2631b0b0-32bb-412a-9899-8b2e1325df36" containerID="811261043cc19c1dcacdc8af2143c42d5b3e363ad440bc7f4bbae83bb57b510a" exitCode=0
Dec 03 12:18:38 crc kubenswrapper[4702]: I1203 12:18:38.199978 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lwjp" event={"ID":"2631b0b0-32bb-412a-9899-8b2e1325df36","Type":"ContainerDied","Data":"811261043cc19c1dcacdc8af2143c42d5b3e363ad440bc7f4bbae83bb57b510a"}
Dec 03 12:18:38 crc kubenswrapper[4702]: I1203 12:18:38.200019 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lwjp" event={"ID":"2631b0b0-32bb-412a-9899-8b2e1325df36","Type":"ContainerStarted","Data":"99cea139103d2f5f9496c1da2fb57844b73d0bc467c3ce87d83db29bc46db2e9"}
Dec 03 12:18:40 crc kubenswrapper[4702]: I1203 12:18:40.225974 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lwjp" event={"ID":"2631b0b0-32bb-412a-9899-8b2e1325df36","Type":"ContainerStarted","Data":"b452b974a8b4305de3f187a0bcc976e70c0030023566aca7e430344238654d61"}
Dec 03 12:18:53 crc kubenswrapper[4702]: I1203 12:18:53.399832 4702 generic.go:334] "Generic (PLEG): container finished" podID="2631b0b0-32bb-412a-9899-8b2e1325df36" containerID="b452b974a8b4305de3f187a0bcc976e70c0030023566aca7e430344238654d61" exitCode=0
Dec 03 12:18:53 crc kubenswrapper[4702]: I1203 12:18:53.399925 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lwjp" event={"ID":"2631b0b0-32bb-412a-9899-8b2e1325df36","Type":"ContainerDied","Data":"b452b974a8b4305de3f187a0bcc976e70c0030023566aca7e430344238654d61"}
Dec 03 12:18:56 crc kubenswrapper[4702]: I1203 12:18:56.438590 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lwjp" event={"ID":"2631b0b0-32bb-412a-9899-8b2e1325df36","Type":"ContainerStarted","Data":"0df9fff55a53168ba8257a552fb60630f62990172198685e208bacfd23536fd7"}
Dec 03 12:18:57 crc kubenswrapper[4702]: I1203 12:18:57.465860 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8lwjp" podStartSLOduration=4.174688813 podStartE2EDuration="21.465831522s" podCreationTimestamp="2025-12-03 12:18:36 +0000 UTC" firstStartedPulling="2025-12-03 12:18:38.205016336 +0000 UTC m=+4502.040944800" lastFinishedPulling="2025-12-03 12:18:55.496159055 +0000 UTC m=+4519.332087509" observedRunningTime="2025-12-03 12:18:57.465165703 +0000 UTC m=+4521.301094187" watchObservedRunningTime="2025-12-03 12:18:57.465831522 +0000 UTC m=+4521.301759986"
Dec 03 12:19:02 crc kubenswrapper[4702]: I1203 12:19:02.299561 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d4szn"]
Dec 03 12:19:02 crc kubenswrapper[4702]: I1203 12:19:02.303605 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d4szn"
Dec 03 12:19:02 crc kubenswrapper[4702]: I1203 12:19:02.315199 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d4szn"]
Dec 03 12:19:02 crc kubenswrapper[4702]: I1203 12:19:02.427481 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxvtn\" (UniqueName: \"kubernetes.io/projected/cf75c2f6-d998-4086-a6be-936dc9c46629-kube-api-access-xxvtn\") pod \"redhat-marketplace-d4szn\" (UID: \"cf75c2f6-d998-4086-a6be-936dc9c46629\") " pod="openshift-marketplace/redhat-marketplace-d4szn"
Dec 03 12:19:02 crc kubenswrapper[4702]: I1203 12:19:02.427598 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf75c2f6-d998-4086-a6be-936dc9c46629-catalog-content\") pod \"redhat-marketplace-d4szn\" (UID: \"cf75c2f6-d998-4086-a6be-936dc9c46629\") " pod="openshift-marketplace/redhat-marketplace-d4szn"
Dec 03 12:19:02 crc kubenswrapper[4702]: I1203 12:19:02.427676 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf75c2f6-d998-4086-a6be-936dc9c46629-utilities\") pod \"redhat-marketplace-d4szn\" (UID: \"cf75c2f6-d998-4086-a6be-936dc9c46629\") " pod="openshift-marketplace/redhat-marketplace-d4szn"
Dec 03 12:19:02 crc kubenswrapper[4702]: I1203 12:19:02.529425 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxvtn\" (UniqueName: \"kubernetes.io/projected/cf75c2f6-d998-4086-a6be-936dc9c46629-kube-api-access-xxvtn\") pod \"redhat-marketplace-d4szn\" (UID: \"cf75c2f6-d998-4086-a6be-936dc9c46629\") " pod="openshift-marketplace/redhat-marketplace-d4szn"
Dec 03 12:19:02 crc kubenswrapper[4702]: I1203 12:19:02.529526 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf75c2f6-d998-4086-a6be-936dc9c46629-catalog-content\") pod \"redhat-marketplace-d4szn\" (UID: \"cf75c2f6-d998-4086-a6be-936dc9c46629\") " pod="openshift-marketplace/redhat-marketplace-d4szn"
Dec 03 12:19:02 crc kubenswrapper[4702]: I1203 12:19:02.529579 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf75c2f6-d998-4086-a6be-936dc9c46629-utilities\") pod \"redhat-marketplace-d4szn\" (UID: \"cf75c2f6-d998-4086-a6be-936dc9c46629\") " pod="openshift-marketplace/redhat-marketplace-d4szn"
Dec 03 12:19:02 crc kubenswrapper[4702]: I1203 12:19:02.530187 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf75c2f6-d998-4086-a6be-936dc9c46629-utilities\") pod \"redhat-marketplace-d4szn\" (UID: \"cf75c2f6-d998-4086-a6be-936dc9c46629\") " pod="openshift-marketplace/redhat-marketplace-d4szn"
Dec 03 12:19:02 crc kubenswrapper[4702]: I1203 12:19:02.530772 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf75c2f6-d998-4086-a6be-936dc9c46629-catalog-content\") pod \"redhat-marketplace-d4szn\" (UID: \"cf75c2f6-d998-4086-a6be-936dc9c46629\") " pod="openshift-marketplace/redhat-marketplace-d4szn"
Dec 03 12:19:02 crc kubenswrapper[4702]: I1203 12:19:02.555671 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxvtn\" (UniqueName: \"kubernetes.io/projected/cf75c2f6-d998-4086-a6be-936dc9c46629-kube-api-access-xxvtn\") pod \"redhat-marketplace-d4szn\" (UID: \"cf75c2f6-d998-4086-a6be-936dc9c46629\") " pod="openshift-marketplace/redhat-marketplace-d4szn"
Dec 03 12:19:02 crc kubenswrapper[4702]: I1203 12:19:02.632436 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d4szn"
Dec 03 12:19:03 crc kubenswrapper[4702]: I1203 12:19:03.305654 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d4szn"]
Dec 03 12:19:03 crc kubenswrapper[4702]: I1203 12:19:03.526779 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4szn" event={"ID":"cf75c2f6-d998-4086-a6be-936dc9c46629","Type":"ContainerStarted","Data":"5b890d0985657dd42e1d5256c9ba90d1c1adb0d12c936793d8517d144622f5a3"}
Dec 03 12:19:04 crc kubenswrapper[4702]: I1203 12:19:04.541620 4702 generic.go:334] "Generic (PLEG): container finished" podID="cf75c2f6-d998-4086-a6be-936dc9c46629" containerID="6969128e8d8a8a6322c7cbe9ed4b09c843bd6303632eb229e83579b5fd500f3f" exitCode=0
Dec 03 12:19:04 crc kubenswrapper[4702]: I1203 12:19:04.542016 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4szn" event={"ID":"cf75c2f6-d998-4086-a6be-936dc9c46629","Type":"ContainerDied","Data":"6969128e8d8a8a6322c7cbe9ed4b09c843bd6303632eb229e83579b5fd500f3f"}
Dec 03 12:19:06 crc kubenswrapper[4702]: I1203 12:19:06.610352 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4szn" event={"ID":"cf75c2f6-d998-4086-a6be-936dc9c46629","Type":"ContainerStarted","Data":"16b5ee4bee744df689a6ab0436ef495bf91a5890c0d85f76f287805ee0f175fd"}
Dec 03 12:19:06 crc kubenswrapper[4702]: I1203 12:19:06.955394 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8lwjp"
Dec 03 12:19:06 crc kubenswrapper[4702]: I1203 12:19:06.955473 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8lwjp"
Dec 03 12:19:08 crc kubenswrapper[4702]: I1203 12:19:08.020824 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8lwjp" podUID="2631b0b0-32bb-412a-9899-8b2e1325df36" containerName="registry-server" probeResult="failure" output=<
Dec 03 12:19:08 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s
Dec 03 12:19:08 crc kubenswrapper[4702]: >
Dec 03 12:19:09 crc kubenswrapper[4702]: I1203 12:19:09.646181 4702 generic.go:334] "Generic (PLEG): container finished" podID="cf75c2f6-d998-4086-a6be-936dc9c46629" containerID="16b5ee4bee744df689a6ab0436ef495bf91a5890c0d85f76f287805ee0f175fd" exitCode=0
Dec 03 12:19:09 crc kubenswrapper[4702]: I1203 12:19:09.646261 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4szn" event={"ID":"cf75c2f6-d998-4086-a6be-936dc9c46629","Type":"ContainerDied","Data":"16b5ee4bee744df689a6ab0436ef495bf91a5890c0d85f76f287805ee0f175fd"}
Dec 03 12:19:12 crc kubenswrapper[4702]: I1203 12:19:12.681376 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4szn" event={"ID":"cf75c2f6-d998-4086-a6be-936dc9c46629","Type":"ContainerStarted","Data":"610aeb5d20f8298c0632069674a15605ac9c01dd9032f331c97e690d38f381ae"}
Dec 03 12:19:12 crc kubenswrapper[4702]: I1203 12:19:12.728363 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d4szn" podStartSLOduration=3.975695674 podStartE2EDuration="10.728245478s" podCreationTimestamp="2025-12-03 12:19:02 +0000 UTC" firstStartedPulling="2025-12-03 12:19:04.54545724 +0000 UTC m=+4528.381385704" lastFinishedPulling="2025-12-03 12:19:11.298007044 +0000 UTC m=+4535.133935508" observedRunningTime="2025-12-03 12:19:12.704638675 +0000 UTC m=+4536.540567159" watchObservedRunningTime="2025-12-03 12:19:12.728245478 +0000 UTC m=+4536.564173942"
Dec 03 12:19:17 crc kubenswrapper[4702]: I1203 12:19:17.271122 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8lwjp"
Dec 03 12:19:17 crc kubenswrapper[4702]: I1203 12:19:17.333800 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8lwjp"
Dec 03 12:19:17 crc kubenswrapper[4702]: I1203 12:19:17.511574 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8lwjp"]
Dec 03 12:19:18 crc kubenswrapper[4702]: I1203 12:19:18.767628 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8lwjp" podUID="2631b0b0-32bb-412a-9899-8b2e1325df36" containerName="registry-server" containerID="cri-o://0df9fff55a53168ba8257a552fb60630f62990172198685e208bacfd23536fd7" gracePeriod=2
Dec 03 12:19:20 crc kubenswrapper[4702]: I1203 12:19:20.805286 4702 generic.go:334] "Generic (PLEG): container finished" podID="2631b0b0-32bb-412a-9899-8b2e1325df36" containerID="0df9fff55a53168ba8257a552fb60630f62990172198685e208bacfd23536fd7" exitCode=0
Dec 03 12:19:20 crc kubenswrapper[4702]: I1203 12:19:20.805477 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lwjp" event={"ID":"2631b0b0-32bb-412a-9899-8b2e1325df36","Type":"ContainerDied","Data":"0df9fff55a53168ba8257a552fb60630f62990172198685e208bacfd23536fd7"}
Dec 03 12:19:21 crc kubenswrapper[4702]: I1203 12:19:21.031540 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8lwjp"
Dec 03 12:19:21 crc kubenswrapper[4702]: I1203 12:19:21.162470 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2631b0b0-32bb-412a-9899-8b2e1325df36-utilities\") pod \"2631b0b0-32bb-412a-9899-8b2e1325df36\" (UID: \"2631b0b0-32bb-412a-9899-8b2e1325df36\") "
Dec 03 12:19:21 crc kubenswrapper[4702]: I1203 12:19:21.162647 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2631b0b0-32bb-412a-9899-8b2e1325df36-catalog-content\") pod \"2631b0b0-32bb-412a-9899-8b2e1325df36\" (UID: \"2631b0b0-32bb-412a-9899-8b2e1325df36\") "
Dec 03 12:19:21 crc kubenswrapper[4702]: I1203 12:19:21.162793 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnmj8\" (UniqueName: \"kubernetes.io/projected/2631b0b0-32bb-412a-9899-8b2e1325df36-kube-api-access-bnmj8\") pod \"2631b0b0-32bb-412a-9899-8b2e1325df36\" (UID: \"2631b0b0-32bb-412a-9899-8b2e1325df36\") "
Dec 03 12:19:21 crc kubenswrapper[4702]: I1203 12:19:21.164647 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2631b0b0-32bb-412a-9899-8b2e1325df36-utilities" (OuterVolumeSpecName: "utilities") pod "2631b0b0-32bb-412a-9899-8b2e1325df36" (UID: "2631b0b0-32bb-412a-9899-8b2e1325df36"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:19:21 crc kubenswrapper[4702]: I1203 12:19:21.168985 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2631b0b0-32bb-412a-9899-8b2e1325df36-kube-api-access-bnmj8" (OuterVolumeSpecName: "kube-api-access-bnmj8") pod "2631b0b0-32bb-412a-9899-8b2e1325df36" (UID: "2631b0b0-32bb-412a-9899-8b2e1325df36"). InnerVolumeSpecName "kube-api-access-bnmj8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:19:21 crc kubenswrapper[4702]: I1203 12:19:21.266214 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2631b0b0-32bb-412a-9899-8b2e1325df36-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 12:19:21 crc kubenswrapper[4702]: I1203 12:19:21.266542 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnmj8\" (UniqueName: \"kubernetes.io/projected/2631b0b0-32bb-412a-9899-8b2e1325df36-kube-api-access-bnmj8\") on node \"crc\" DevicePath \"\""
Dec 03 12:19:21 crc kubenswrapper[4702]: I1203 12:19:21.278383 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2631b0b0-32bb-412a-9899-8b2e1325df36-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2631b0b0-32bb-412a-9899-8b2e1325df36" (UID: "2631b0b0-32bb-412a-9899-8b2e1325df36"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:19:21 crc kubenswrapper[4702]: I1203 12:19:21.368630 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2631b0b0-32bb-412a-9899-8b2e1325df36-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 12:19:21 crc kubenswrapper[4702]: I1203 12:19:21.819714 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8lwjp" event={"ID":"2631b0b0-32bb-412a-9899-8b2e1325df36","Type":"ContainerDied","Data":"99cea139103d2f5f9496c1da2fb57844b73d0bc467c3ce87d83db29bc46db2e9"}
Dec 03 12:19:21 crc kubenswrapper[4702]: I1203 12:19:21.819793 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8lwjp"
Dec 03 12:19:21 crc kubenswrapper[4702]: I1203 12:19:21.819798 4702 scope.go:117] "RemoveContainer" containerID="0df9fff55a53168ba8257a552fb60630f62990172198685e208bacfd23536fd7"
Dec 03 12:19:21 crc kubenswrapper[4702]: I1203 12:19:21.862532 4702 scope.go:117] "RemoveContainer" containerID="b452b974a8b4305de3f187a0bcc976e70c0030023566aca7e430344238654d61"
Dec 03 12:19:21 crc kubenswrapper[4702]: I1203 12:19:21.865099 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8lwjp"]
Dec 03 12:19:21 crc kubenswrapper[4702]: I1203 12:19:21.877305 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8lwjp"]
Dec 03 12:19:21 crc kubenswrapper[4702]: I1203 12:19:21.891734 4702 scope.go:117] "RemoveContainer" containerID="811261043cc19c1dcacdc8af2143c42d5b3e363ad440bc7f4bbae83bb57b510a"
Dec 03 12:19:22 crc kubenswrapper[4702]: I1203 12:19:22.634108 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d4szn"
Dec 03 12:19:22 crc kubenswrapper[4702]: I1203 12:19:22.635107 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d4szn"
Dec 03 12:19:22 crc kubenswrapper[4702]: I1203 12:19:22.686090 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d4szn"
Dec 03 12:19:22 crc kubenswrapper[4702]: I1203 12:19:22.896097 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d4szn"
Dec 03 12:19:22 crc kubenswrapper[4702]: I1203 12:19:22.944819 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2631b0b0-32bb-412a-9899-8b2e1325df36" path="/var/lib/kubelet/pods/2631b0b0-32bb-412a-9899-8b2e1325df36/volumes"
Dec 03 12:19:23 crc kubenswrapper[4702]: I1203 12:19:23.283519 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d4szn"]
Dec 03 12:19:24 crc kubenswrapper[4702]: I1203 12:19:24.864043 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d4szn" podUID="cf75c2f6-d998-4086-a6be-936dc9c46629" containerName="registry-server" containerID="cri-o://610aeb5d20f8298c0632069674a15605ac9c01dd9032f331c97e690d38f381ae" gracePeriod=2
Dec 03 12:19:25 crc kubenswrapper[4702]: I1203 12:19:25.908625 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 12:19:25 crc kubenswrapper[4702]: I1203 12:19:25.909675 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 12:19:27 crc kubenswrapper[4702]: I1203 12:19:27.902109 4702 generic.go:334] "Generic (PLEG): container finished" podID="cf75c2f6-d998-4086-a6be-936dc9c46629" containerID="610aeb5d20f8298c0632069674a15605ac9c01dd9032f331c97e690d38f381ae" exitCode=0
Dec 03 12:19:27 crc kubenswrapper[4702]: I1203 12:19:27.902207 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4szn" event={"ID":"cf75c2f6-d998-4086-a6be-936dc9c46629","Type":"ContainerDied","Data":"610aeb5d20f8298c0632069674a15605ac9c01dd9032f331c97e690d38f381ae"}
Dec 03 12:19:30 crc kubenswrapper[4702]: I1203 12:19:30.118834 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d4szn"
Dec 03 12:19:30 crc kubenswrapper[4702]: I1203 12:19:30.255684 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf75c2f6-d998-4086-a6be-936dc9c46629-catalog-content\") pod \"cf75c2f6-d998-4086-a6be-936dc9c46629\" (UID: \"cf75c2f6-d998-4086-a6be-936dc9c46629\") "
Dec 03 12:19:30 crc kubenswrapper[4702]: I1203 12:19:30.256038 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf75c2f6-d998-4086-a6be-936dc9c46629-utilities\") pod \"cf75c2f6-d998-4086-a6be-936dc9c46629\" (UID: \"cf75c2f6-d998-4086-a6be-936dc9c46629\") "
Dec 03 12:19:30 crc kubenswrapper[4702]: I1203 12:19:30.256104 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxvtn\" (UniqueName: \"kubernetes.io/projected/cf75c2f6-d998-4086-a6be-936dc9c46629-kube-api-access-xxvtn\") pod \"cf75c2f6-d998-4086-a6be-936dc9c46629\" (UID: \"cf75c2f6-d998-4086-a6be-936dc9c46629\") "
Dec 03 12:19:30 crc kubenswrapper[4702]: I1203 12:19:30.257088 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf75c2f6-d998-4086-a6be-936dc9c46629-utilities" (OuterVolumeSpecName: "utilities") pod "cf75c2f6-d998-4086-a6be-936dc9c46629" (UID: "cf75c2f6-d998-4086-a6be-936dc9c46629"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:19:30 crc kubenswrapper[4702]: I1203 12:19:30.262266 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf75c2f6-d998-4086-a6be-936dc9c46629-kube-api-access-xxvtn" (OuterVolumeSpecName: "kube-api-access-xxvtn") pod "cf75c2f6-d998-4086-a6be-936dc9c46629" (UID: "cf75c2f6-d998-4086-a6be-936dc9c46629"). InnerVolumeSpecName "kube-api-access-xxvtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:19:30 crc kubenswrapper[4702]: I1203 12:19:30.288156 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf75c2f6-d998-4086-a6be-936dc9c46629-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf75c2f6-d998-4086-a6be-936dc9c46629" (UID: "cf75c2f6-d998-4086-a6be-936dc9c46629"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:19:30 crc kubenswrapper[4702]: I1203 12:19:30.358715 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf75c2f6-d998-4086-a6be-936dc9c46629-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 12:19:30 crc kubenswrapper[4702]: I1203 12:19:30.358754 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxvtn\" (UniqueName: \"kubernetes.io/projected/cf75c2f6-d998-4086-a6be-936dc9c46629-kube-api-access-xxvtn\") on node \"crc\" DevicePath \"\""
Dec 03 12:19:30 crc kubenswrapper[4702]: I1203 12:19:30.358778 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf75c2f6-d998-4086-a6be-936dc9c46629-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 12:19:30 crc kubenswrapper[4702]: I1203 12:19:30.948486 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d4szn" event={"ID":"cf75c2f6-d998-4086-a6be-936dc9c46629","Type":"ContainerDied","Data":"5b890d0985657dd42e1d5256c9ba90d1c1adb0d12c936793d8517d144622f5a3"}
Dec 03 12:19:30 crc kubenswrapper[4702]: I1203 12:19:30.948553 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d4szn"
Dec 03 12:19:30 crc kubenswrapper[4702]: I1203 12:19:30.948566 4702 scope.go:117] "RemoveContainer" containerID="610aeb5d20f8298c0632069674a15605ac9c01dd9032f331c97e690d38f381ae"
Dec 03 12:19:31 crc kubenswrapper[4702]: I1203 12:19:31.001915 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d4szn"]
Dec 03 12:19:31 crc kubenswrapper[4702]: I1203 12:19:31.006686 4702 scope.go:117] "RemoveContainer" containerID="16b5ee4bee744df689a6ab0436ef495bf91a5890c0d85f76f287805ee0f175fd"
Dec 03 12:19:31 crc kubenswrapper[4702]: I1203 12:19:31.019031 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d4szn"]
Dec 03 12:19:31 crc kubenswrapper[4702]: I1203 12:19:31.038056 4702 scope.go:117] "RemoveContainer" containerID="6969128e8d8a8a6322c7cbe9ed4b09c843bd6303632eb229e83579b5fd500f3f"
Dec 03 12:19:32 crc kubenswrapper[4702]: I1203 12:19:32.952722 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf75c2f6-d998-4086-a6be-936dc9c46629" path="/var/lib/kubelet/pods/cf75c2f6-d998-4086-a6be-936dc9c46629/volumes"
Dec 03 12:19:55 crc kubenswrapper[4702]: I1203 12:19:55.908216 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 12:19:55 crc kubenswrapper[4702]: I1203 12:19:55.908835 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 12:20:21 crc kubenswrapper[4702]: I1203 12:20:21.013113 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-f8648f98b-2npsf" podUID="d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c" containerName="controller"
probeResult="failure" output="Get \"http://10.217.0.95:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:20:25 crc kubenswrapper[4702]: I1203 12:20:25.908748 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:20:25 crc kubenswrapper[4702]: I1203 12:20:25.909452 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:20:25 crc kubenswrapper[4702]: I1203 12:20:25.909506 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 12:20:25 crc kubenswrapper[4702]: I1203 12:20:25.910576 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6"} pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:20:25 crc kubenswrapper[4702]: I1203 12:20:25.910630 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" containerID="cri-o://9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" gracePeriod=600 Dec 03 12:20:26 crc kubenswrapper[4702]: E1203 12:20:26.051821 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:20:26 crc kubenswrapper[4702]: I1203 12:20:26.921574 4702 generic.go:334] "Generic (PLEG): container finished" podID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" exitCode=0 Dec 03 12:20:26 crc kubenswrapper[4702]: I1203 12:20:26.921625 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerDied","Data":"9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6"} Dec 03 12:20:26 crc kubenswrapper[4702]: I1203 12:20:26.921668 4702 scope.go:117] "RemoveContainer" containerID="39f744a96b947ad71dac3cd2fa7cfeca49ebfdcba6909ec102d28a09fbaa619a" Dec 03 12:20:26 crc kubenswrapper[4702]: I1203 12:20:26.922867 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:20:26 crc kubenswrapper[4702]: E1203 12:20:26.923540 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:20:41 crc kubenswrapper[4702]: I1203 12:20:41.928316 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:20:41 crc kubenswrapper[4702]: E1203 12:20:41.929093 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:20:55 crc kubenswrapper[4702]: I1203 12:20:55.929312 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:20:55 crc kubenswrapper[4702]: E1203 12:20:55.930141 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:21:09 crc kubenswrapper[4702]: I1203 12:21:09.929206 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:21:09 crc kubenswrapper[4702]: E1203 12:21:09.929982 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:21:19 crc kubenswrapper[4702]: E1203 12:21:19.361112 4702 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.176:59020->38.102.83.176:40897: write tcp 38.102.83.176:59020->38.102.83.176:40897: write: broken pipe Dec 03 12:21:21 crc kubenswrapper[4702]: I1203 12:21:21.928360 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:21:21 crc kubenswrapper[4702]: E1203 12:21:21.929622 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:21:33 crc kubenswrapper[4702]: I1203 12:21:33.043734 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:21:33 crc kubenswrapper[4702]: E1203 12:21:33.046836 4702 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:21:47 crc kubenswrapper[4702]: I1203 12:21:47.929960 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:21:47 crc kubenswrapper[4702]: E1203 12:21:47.930985 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:21:59 crc kubenswrapper[4702]: I1203 12:21:59.927991 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:21:59 crc kubenswrapper[4702]: E1203 12:21:59.928778 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:22:14 crc kubenswrapper[4702]: I1203 12:22:14.929700 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:22:14 crc kubenswrapper[4702]: E1203 12:22:14.930796 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:22:29 crc kubenswrapper[4702]: I1203 12:22:29.929004 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:22:29 crc kubenswrapper[4702]: E1203 12:22:29.930071 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:22:41 crc kubenswrapper[4702]: I1203 12:22:41.928998 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:22:41 crc kubenswrapper[4702]: E1203 12:22:41.929909 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:22:52 crc kubenswrapper[4702]: I1203 12:22:52.929030 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:22:52 crc kubenswrapper[4702]: E1203 12:22:52.930144 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:23:03 crc kubenswrapper[4702]: I1203 12:23:03.186941 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m6jfj"] Dec 03 12:23:03 crc kubenswrapper[4702]: E1203 12:23:03.188140 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf75c2f6-d998-4086-a6be-936dc9c46629" containerName="extract-content" Dec 03 12:23:03 crc kubenswrapper[4702]: I1203 12:23:03.188159 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf75c2f6-d998-4086-a6be-936dc9c46629" containerName="extract-content" Dec 03 12:23:03 crc kubenswrapper[4702]: E1203 12:23:03.188202 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2631b0b0-32bb-412a-9899-8b2e1325df36" containerName="registry-server" Dec 03 12:23:03 crc kubenswrapper[4702]: I1203 12:23:03.188210 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2631b0b0-32bb-412a-9899-8b2e1325df36" containerName="registry-server" Dec 03 12:23:03 crc kubenswrapper[4702]: E1203 12:23:03.188235 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf75c2f6-d998-4086-a6be-936dc9c46629" containerName="registry-server" Dec 03 12:23:03 crc kubenswrapper[4702]: I1203 12:23:03.188242 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf75c2f6-d998-4086-a6be-936dc9c46629" containerName="registry-server" Dec 03 12:23:03 crc kubenswrapper[4702]: E1203 12:23:03.188277 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2631b0b0-32bb-412a-9899-8b2e1325df36" containerName="extract-utilities" Dec 03 12:23:03 crc kubenswrapper[4702]: I1203 12:23:03.188286 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2631b0b0-32bb-412a-9899-8b2e1325df36" containerName="extract-utilities" Dec 03 12:23:03 crc kubenswrapper[4702]: E1203 12:23:03.188299 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2631b0b0-32bb-412a-9899-8b2e1325df36" containerName="extract-content" Dec 03 12:23:03 crc kubenswrapper[4702]: I1203 12:23:03.188307 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2631b0b0-32bb-412a-9899-8b2e1325df36" containerName="extract-content" Dec 03 12:23:03 crc kubenswrapper[4702]: E1203 12:23:03.188325 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf75c2f6-d998-4086-a6be-936dc9c46629" containerName="extract-utilities" Dec 03 12:23:03 crc kubenswrapper[4702]: I1203 12:23:03.188332 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf75c2f6-d998-4086-a6be-936dc9c46629" containerName="extract-utilities" Dec 03 12:23:03 crc kubenswrapper[4702]: I1203 12:23:03.188717 4702 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="cf75c2f6-d998-4086-a6be-936dc9c46629" containerName="registry-server" Dec 03 12:23:03 crc kubenswrapper[4702]: I1203 12:23:03.188746 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="2631b0b0-32bb-412a-9899-8b2e1325df36" containerName="registry-server" Dec 03 12:23:03 crc kubenswrapper[4702]: I1203 12:23:03.191398 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m6jfj" Dec 03 12:23:03 crc kubenswrapper[4702]: I1203 12:23:03.221731 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m6jfj"] Dec 03 12:23:03 crc kubenswrapper[4702]: I1203 12:23:03.315947 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4deb2eec-1439-4d9c-bfd8-e5303111aa49-catalog-content\") pod \"community-operators-m6jfj\" (UID: \"4deb2eec-1439-4d9c-bfd8-e5303111aa49\") " pod="openshift-marketplace/community-operators-m6jfj" Dec 03 12:23:03 crc kubenswrapper[4702]: I1203 12:23:03.316012 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4deb2eec-1439-4d9c-bfd8-e5303111aa49-utilities\") pod \"community-operators-m6jfj\" (UID: \"4deb2eec-1439-4d9c-bfd8-e5303111aa49\") " pod="openshift-marketplace/community-operators-m6jfj" Dec 03 12:23:03 crc kubenswrapper[4702]: I1203 12:23:03.316329 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72gf8\" (UniqueName: \"kubernetes.io/projected/4deb2eec-1439-4d9c-bfd8-e5303111aa49-kube-api-access-72gf8\") pod \"community-operators-m6jfj\" (UID: \"4deb2eec-1439-4d9c-bfd8-e5303111aa49\") " pod="openshift-marketplace/community-operators-m6jfj" Dec 03 12:23:03 crc kubenswrapper[4702]: I1203 12:23:03.420321 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72gf8\" (UniqueName: \"kubernetes.io/projected/4deb2eec-1439-4d9c-bfd8-e5303111aa49-kube-api-access-72gf8\") pod \"community-operators-m6jfj\" (UID: \"4deb2eec-1439-4d9c-bfd8-e5303111aa49\") " pod="openshift-marketplace/community-operators-m6jfj" Dec 03 12:23:03 crc kubenswrapper[4702]: I1203 12:23:03.420726 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4deb2eec-1439-4d9c-bfd8-e5303111aa49-catalog-content\") pod \"community-operators-m6jfj\" (UID: \"4deb2eec-1439-4d9c-bfd8-e5303111aa49\") " pod="openshift-marketplace/community-operators-m6jfj" Dec 03 12:23:03 crc kubenswrapper[4702]: I1203 12:23:03.420888 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4deb2eec-1439-4d9c-bfd8-e5303111aa49-utilities\") pod \"community-operators-m6jfj\" (UID: \"4deb2eec-1439-4d9c-bfd8-e5303111aa49\") " pod="openshift-marketplace/community-operators-m6jfj" Dec 03 12:23:03 crc kubenswrapper[4702]: I1203 12:23:03.421876 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4deb2eec-1439-4d9c-bfd8-e5303111aa49-utilities\") pod \"community-operators-m6jfj\" (UID: \"4deb2eec-1439-4d9c-bfd8-e5303111aa49\") " pod="openshift-marketplace/community-operators-m6jfj" Dec 03 12:23:03 crc kubenswrapper[4702]: I1203 
12:23:03.422704 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4deb2eec-1439-4d9c-bfd8-e5303111aa49-catalog-content\") pod \"community-operators-m6jfj\" (UID: \"4deb2eec-1439-4d9c-bfd8-e5303111aa49\") " pod="openshift-marketplace/community-operators-m6jfj" Dec 03 12:23:03 crc kubenswrapper[4702]: I1203 12:23:03.462124 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72gf8\" (UniqueName: \"kubernetes.io/projected/4deb2eec-1439-4d9c-bfd8-e5303111aa49-kube-api-access-72gf8\") pod \"community-operators-m6jfj\" (UID: \"4deb2eec-1439-4d9c-bfd8-e5303111aa49\") " pod="openshift-marketplace/community-operators-m6jfj" Dec 03 12:23:03 crc kubenswrapper[4702]: I1203 12:23:03.518952 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m6jfj" Dec 03 12:23:04 crc kubenswrapper[4702]: I1203 12:23:04.184096 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m6jfj"] Dec 03 12:23:04 crc kubenswrapper[4702]: I1203 12:23:04.928785 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:23:04 crc kubenswrapper[4702]: E1203 12:23:04.929512 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:23:05 crc kubenswrapper[4702]: I1203 12:23:05.037071 4702 generic.go:334] "Generic (PLEG): container finished" podID="4deb2eec-1439-4d9c-bfd8-e5303111aa49" containerID="3a0fc5282ee3fc54b8571f39b3b945e01af5a190e5c2174cca5b868e8bb83b15" exitCode=0 Dec 03 12:23:05 crc kubenswrapper[4702]: I1203 12:23:05.037125 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6jfj" event={"ID":"4deb2eec-1439-4d9c-bfd8-e5303111aa49","Type":"ContainerDied","Data":"3a0fc5282ee3fc54b8571f39b3b945e01af5a190e5c2174cca5b868e8bb83b15"} Dec 03 12:23:05 crc kubenswrapper[4702]: I1203 12:23:05.037156 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6jfj" event={"ID":"4deb2eec-1439-4d9c-bfd8-e5303111aa49","Type":"ContainerStarted","Data":"5ea345badcc1f9892e5dbb29a71ff420118f0ae11c8595522bf9a94facc2a843"} Dec 03 12:23:05 crc kubenswrapper[4702]: I1203 12:23:05.039309 4702 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 12:23:05 crc kubenswrapper[4702]: E1203 12:23:05.609004 4702 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.176:53970->38.102.83.176:40897: write tcp 38.102.83.176:53970->38.102.83.176:40897: write: broken pipe Dec 03 12:23:07 crc kubenswrapper[4702]: I1203 12:23:07.063594 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6jfj" event={"ID":"4deb2eec-1439-4d9c-bfd8-e5303111aa49","Type":"ContainerStarted","Data":"c62bf528a00ddd82bf76bbfb817cd76d1fafba62d6d28e80db43609d2b69e272"} Dec 03 12:23:08 crc kubenswrapper[4702]: I1203 12:23:08.079787 4702 generic.go:334] "Generic (PLEG): container finished" 
podID="4deb2eec-1439-4d9c-bfd8-e5303111aa49" containerID="c62bf528a00ddd82bf76bbfb817cd76d1fafba62d6d28e80db43609d2b69e272" exitCode=0 Dec 03 12:23:08 crc kubenswrapper[4702]: I1203 12:23:08.082723 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6jfj" event={"ID":"4deb2eec-1439-4d9c-bfd8-e5303111aa49","Type":"ContainerDied","Data":"c62bf528a00ddd82bf76bbfb817cd76d1fafba62d6d28e80db43609d2b69e272"} Dec 03 12:23:10 crc kubenswrapper[4702]: I1203 12:23:10.378303 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6jfj" event={"ID":"4deb2eec-1439-4d9c-bfd8-e5303111aa49","Type":"ContainerStarted","Data":"4cc7ddd11b40a7e68bd52b9fb03b20238e7d6932be5a1e468d9fb72759dd76ff"} Dec 03 12:23:10 crc kubenswrapper[4702]: I1203 12:23:10.408518 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m6jfj" podStartSLOduration=3.547959603 podStartE2EDuration="7.408477418s" podCreationTimestamp="2025-12-03 12:23:03 +0000 UTC" firstStartedPulling="2025-12-03 12:23:05.038962331 +0000 UTC m=+4768.874890795" lastFinishedPulling="2025-12-03 12:23:08.899480156 +0000 UTC m=+4772.735408610" observedRunningTime="2025-12-03 12:23:10.407349126 +0000 UTC m=+4774.243277590" watchObservedRunningTime="2025-12-03 12:23:10.408477418 +0000 UTC m=+4774.244405882" Dec 03 12:23:13 crc kubenswrapper[4702]: I1203 12:23:13.519359 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m6jfj" Dec 03 12:23:13 crc kubenswrapper[4702]: I1203 12:23:13.519701 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m6jfj" Dec 03 12:23:13 crc kubenswrapper[4702]: I1203 12:23:13.576519 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m6jfj" Dec 03 12:23:14 crc kubenswrapper[4702]: I1203 12:23:14.975021 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m6jfj" Dec 03 12:23:15 crc kubenswrapper[4702]: I1203 12:23:15.035208 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m6jfj"] Dec 03 12:23:16 crc kubenswrapper[4702]: I1203 12:23:16.446189 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m6jfj" podUID="4deb2eec-1439-4d9c-bfd8-e5303111aa49" containerName="registry-server" containerID="cri-o://4cc7ddd11b40a7e68bd52b9fb03b20238e7d6932be5a1e468d9fb72759dd76ff" gracePeriod=2 Dec 03 12:23:17 crc kubenswrapper[4702]: I1203 12:23:17.463939 4702 generic.go:334] "Generic (PLEG): container finished" podID="4deb2eec-1439-4d9c-bfd8-e5303111aa49" containerID="4cc7ddd11b40a7e68bd52b9fb03b20238e7d6932be5a1e468d9fb72759dd76ff" exitCode=0 Dec 03 12:23:17 crc kubenswrapper[4702]: I1203 12:23:17.463996 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6jfj" event={"ID":"4deb2eec-1439-4d9c-bfd8-e5303111aa49","Type":"ContainerDied","Data":"4cc7ddd11b40a7e68bd52b9fb03b20238e7d6932be5a1e468d9fb72759dd76ff"} Dec 03 12:23:17 crc kubenswrapper[4702]: I1203 12:23:17.928954 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:23:17 crc kubenswrapper[4702]: E1203 12:23:17.929538 4702 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:23:22 crc kubenswrapper[4702]: I1203 12:23:22.000568 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m6jfj" Dec 03 12:23:22 crc kubenswrapper[4702]: I1203 12:23:22.138546 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4deb2eec-1439-4d9c-bfd8-e5303111aa49-catalog-content\") pod \"4deb2eec-1439-4d9c-bfd8-e5303111aa49\" (UID: \"4deb2eec-1439-4d9c-bfd8-e5303111aa49\") " Dec 03 12:23:22 crc kubenswrapper[4702]: I1203 12:23:22.138734 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72gf8\" (UniqueName: \"kubernetes.io/projected/4deb2eec-1439-4d9c-bfd8-e5303111aa49-kube-api-access-72gf8\") pod \"4deb2eec-1439-4d9c-bfd8-e5303111aa49\" (UID: \"4deb2eec-1439-4d9c-bfd8-e5303111aa49\") " Dec 03 12:23:22 crc kubenswrapper[4702]: I1203 12:23:22.138890 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4deb2eec-1439-4d9c-bfd8-e5303111aa49-utilities\") pod \"4deb2eec-1439-4d9c-bfd8-e5303111aa49\" (UID: \"4deb2eec-1439-4d9c-bfd8-e5303111aa49\") " Dec 03 12:23:22 crc kubenswrapper[4702]: I1203 12:23:22.140441 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4deb2eec-1439-4d9c-bfd8-e5303111aa49-utilities" (OuterVolumeSpecName: "utilities") pod "4deb2eec-1439-4d9c-bfd8-e5303111aa49" (UID: "4deb2eec-1439-4d9c-bfd8-e5303111aa49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:23:22 crc kubenswrapper[4702]: I1203 12:23:22.146913 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4deb2eec-1439-4d9c-bfd8-e5303111aa49-kube-api-access-72gf8" (OuterVolumeSpecName: "kube-api-access-72gf8") pod "4deb2eec-1439-4d9c-bfd8-e5303111aa49" (UID: "4deb2eec-1439-4d9c-bfd8-e5303111aa49"). InnerVolumeSpecName "kube-api-access-72gf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:23:22 crc kubenswrapper[4702]: I1203 12:23:22.190978 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4deb2eec-1439-4d9c-bfd8-e5303111aa49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4deb2eec-1439-4d9c-bfd8-e5303111aa49" (UID: "4deb2eec-1439-4d9c-bfd8-e5303111aa49"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:23:22 crc kubenswrapper[4702]: I1203 12:23:22.242452 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4deb2eec-1439-4d9c-bfd8-e5303111aa49-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:23:22 crc kubenswrapper[4702]: I1203 12:23:22.242499 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72gf8\" (UniqueName: \"kubernetes.io/projected/4deb2eec-1439-4d9c-bfd8-e5303111aa49-kube-api-access-72gf8\") on node \"crc\" DevicePath \"\"" Dec 03 12:23:22 crc kubenswrapper[4702]: I1203 12:23:22.242511 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4deb2eec-1439-4d9c-bfd8-e5303111aa49-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:23:22 crc kubenswrapper[4702]: I1203 12:23:22.527357 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6jfj" event={"ID":"4deb2eec-1439-4d9c-bfd8-e5303111aa49","Type":"ContainerDied","Data":"5ea345badcc1f9892e5dbb29a71ff420118f0ae11c8595522bf9a94facc2a843"} Dec 03 12:23:22 crc kubenswrapper[4702]: I1203 12:23:22.527492 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m6jfj" Dec 03 12:23:22 crc kubenswrapper[4702]: I1203 12:23:22.527691 4702 scope.go:117] "RemoveContainer" containerID="4cc7ddd11b40a7e68bd52b9fb03b20238e7d6932be5a1e468d9fb72759dd76ff" Dec 03 12:23:22 crc kubenswrapper[4702]: I1203 12:23:22.568306 4702 scope.go:117] "RemoveContainer" containerID="c62bf528a00ddd82bf76bbfb817cd76d1fafba62d6d28e80db43609d2b69e272" Dec 03 12:23:22 crc kubenswrapper[4702]: I1203 12:23:22.575381 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m6jfj"] Dec 03 12:23:22 crc kubenswrapper[4702]: I1203 12:23:22.589268 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m6jfj"] Dec 03 12:23:22 crc kubenswrapper[4702]: I1203 12:23:22.593595 4702 scope.go:117] "RemoveContainer" containerID="3a0fc5282ee3fc54b8571f39b3b945e01af5a190e5c2174cca5b868e8bb83b15" Dec 03 12:23:22 crc kubenswrapper[4702]: I1203 12:23:22.945512 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4deb2eec-1439-4d9c-bfd8-e5303111aa49" path="/var/lib/kubelet/pods/4deb2eec-1439-4d9c-bfd8-e5303111aa49/volumes" Dec 03 12:23:29 crc kubenswrapper[4702]: I1203 12:23:29.929196 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:23:29 crc kubenswrapper[4702]: E1203 12:23:29.930137 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:23:41 crc kubenswrapper[4702]: I1203 12:23:41.930553 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:23:41 crc kubenswrapper[4702]: E1203 12:23:41.931386 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:23:43 crc kubenswrapper[4702]: I1203 12:23:43.060939 4702 trace.go:236] Trace[642191758]: "Calculate volume metrics of wal for pod openshift-logging/logging-loki-ingester-0" (03-Dec-2025 12:23:33.705) (total time: 9353ms): Dec 03 12:23:43 crc kubenswrapper[4702]: Trace[642191758]: [9.353982002s] [9.353982002s] END Dec 03 12:23:43 crc kubenswrapper[4702]: I1203 12:23:43.073448 4702 trace.go:236] Trace[1828519735]: "Calculate volume metrics of registry-storage for pod openshift-image-registry/image-registry-66df7c8f76-8p7q4" (03-Dec-2025 12:23:40.339) (total time: 2733ms): Dec 03 12:23:43 crc kubenswrapper[4702]: Trace[1828519735]: [2.733780655s] [2.733780655s] END Dec 03 12:23:43 crc kubenswrapper[4702]: I1203 12:23:43.080881 4702 trace.go:236] Trace[917977321]: "Calculate volume metrics of storage for pod openshift-logging/logging-loki-index-gateway-0" (03-Dec-2025 12:23:33.971) (total time: 9109ms): Dec 03 12:23:43 crc kubenswrapper[4702]: Trace[917977321]: [9.109519095s] [9.109519095s] END Dec 03 12:23:43 crc kubenswrapper[4702]: I1203 12:23:43.080956 4702 trace.go:236] Trace[458949438]: "Calculate volume metrics of storage for pod openshift-logging/logging-loki-compactor-0" (03-Dec-2025 12:23:40.757) (total time: 2323ms): Dec 03 12:23:43 crc kubenswrapper[4702]: Trace[458949438]: [2.323542326s] [2.323542326s] END Dec 03 12:23:52 crc kubenswrapper[4702]: I1203 12:23:52.929257 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:23:52 crc kubenswrapper[4702]: E1203 12:23:52.930189 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:24:05 crc kubenswrapper[4702]: I1203 12:24:05.929272 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:24:05 crc kubenswrapper[4702]: E1203 12:24:05.930100 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:24:20 crc kubenswrapper[4702]: I1203 12:24:20.963167 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:24:20 crc kubenswrapper[4702]: E1203 12:24:20.964090 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:24:32 crc kubenswrapper[4702]: I1203 12:24:32.928230 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:24:32 crc kubenswrapper[4702]: E1203 12:24:32.930213 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:24:44 crc kubenswrapper[4702]: I1203 12:24:44.035670 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:24:44 crc kubenswrapper[4702]: E1203 12:24:44.045658 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:24:57 crc kubenswrapper[4702]: I1203 12:24:57.929007 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:24:57 crc kubenswrapper[4702]: E1203 12:24:57.929897 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:25:09 crc kubenswrapper[4702]: I1203 12:25:09.928832 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:25:09 crc kubenswrapper[4702]: E1203 12:25:09.929890 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:25:22 crc kubenswrapper[4702]: I1203 12:25:22.929067 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:25:22 crc kubenswrapper[4702]: E1203 12:25:22.929929 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" 
podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:25:35 crc kubenswrapper[4702]: I1203 12:25:35.928879 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:25:37 crc kubenswrapper[4702]: I1203 12:25:37.210359 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"0de56d455f6a25d82d6775c8599fe924b6ccfb032b2e015fb228ca76142f5944"} Dec 03 12:27:53 crc kubenswrapper[4702]: I1203 12:27:53.301472 4702 trace.go:236] Trace[290717744]: "Calculate volume metrics of storage for pod openshift-logging/logging-loki-compactor-0" (03-Dec-2025 12:27:48.319) (total time: 4982ms): Dec 03 12:27:53 crc kubenswrapper[4702]: Trace[290717744]: [4.982400281s] [4.982400281s] END Dec 03 12:27:53 crc kubenswrapper[4702]: I1203 12:27:53.375591 4702 trace.go:236] Trace[126714442]: "Calculate volume metrics of registry-storage for pod openshift-image-registry/image-registry-66df7c8f76-8p7q4" (03-Dec-2025 12:27:50.160) (total time: 3214ms): Dec 03 12:27:53 crc kubenswrapper[4702]: Trace[126714442]: [3.214982026s] [3.214982026s] END Dec 03 12:27:55 crc kubenswrapper[4702]: I1203 12:27:55.911289 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:27:55 crc kubenswrapper[4702]: I1203 12:27:55.911565 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:27:58 crc kubenswrapper[4702]: I1203 12:27:58.532481 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Dec 03 12:28:25 crc kubenswrapper[4702]: I1203 12:28:25.908139 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:28:25 crc kubenswrapper[4702]: I1203 12:28:25.908818 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:28:26 crc kubenswrapper[4702]: I1203 12:28:26.843093 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j57b5"] Dec 03 12:28:26 crc kubenswrapper[4702]: E1203 12:28:26.844959 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4deb2eec-1439-4d9c-bfd8-e5303111aa49" containerName="extract-content" Dec 03 12:28:26 crc kubenswrapper[4702]: I1203 12:28:26.845198 4702 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4deb2eec-1439-4d9c-bfd8-e5303111aa49" containerName="extract-content" Dec 03 12:28:26 crc kubenswrapper[4702]: E1203 12:28:26.846102 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4deb2eec-1439-4d9c-bfd8-e5303111aa49" containerName="registry-server" Dec 03 12:28:26 crc kubenswrapper[4702]: I1203 12:28:26.846130 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="4deb2eec-1439-4d9c-bfd8-e5303111aa49" containerName="registry-server" Dec 03 12:28:26 crc kubenswrapper[4702]: E1203 12:28:26.846186 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4deb2eec-1439-4d9c-bfd8-e5303111aa49" containerName="extract-utilities" Dec 03 12:28:26 crc kubenswrapper[4702]: I1203 12:28:26.846193 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="4deb2eec-1439-4d9c-bfd8-e5303111aa49" containerName="extract-utilities" Dec 03 12:28:26 crc kubenswrapper[4702]: I1203 12:28:26.846781 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="4deb2eec-1439-4d9c-bfd8-e5303111aa49" containerName="registry-server" Dec 03 12:28:26 crc kubenswrapper[4702]: I1203 12:28:26.848688 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j57b5" Dec 03 12:28:26 crc kubenswrapper[4702]: I1203 12:28:26.861802 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j57b5"] Dec 03 12:28:26 crc kubenswrapper[4702]: I1203 12:28:26.932045 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11abc2e1-9333-4cac-85d7-20ac907fad69-catalog-content\") pod \"certified-operators-j57b5\" (UID: \"11abc2e1-9333-4cac-85d7-20ac907fad69\") " pod="openshift-marketplace/certified-operators-j57b5" Dec 03 12:28:26 crc kubenswrapper[4702]: I1203 12:28:26.932263 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11abc2e1-9333-4cac-85d7-20ac907fad69-utilities\") pod \"certified-operators-j57b5\" (UID: \"11abc2e1-9333-4cac-85d7-20ac907fad69\") " pod="openshift-marketplace/certified-operators-j57b5" Dec 03 12:28:26 crc kubenswrapper[4702]: I1203 12:28:26.932728 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxzws\" (UniqueName: \"kubernetes.io/projected/11abc2e1-9333-4cac-85d7-20ac907fad69-kube-api-access-sxzws\") pod \"certified-operators-j57b5\" (UID: \"11abc2e1-9333-4cac-85d7-20ac907fad69\") " pod="openshift-marketplace/certified-operators-j57b5" Dec 03 12:28:27 crc kubenswrapper[4702]: I1203 12:28:27.035165 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11abc2e1-9333-4cac-85d7-20ac907fad69-utilities\") pod \"certified-operators-j57b5\" (UID: \"11abc2e1-9333-4cac-85d7-20ac907fad69\") " pod="openshift-marketplace/certified-operators-j57b5" Dec 03 12:28:27 crc kubenswrapper[4702]: I1203 12:28:27.035827 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11abc2e1-9333-4cac-85d7-20ac907fad69-utilities\") pod \"certified-operators-j57b5\" (UID: \"11abc2e1-9333-4cac-85d7-20ac907fad69\") " pod="openshift-marketplace/certified-operators-j57b5" Dec 03 12:28:27 crc kubenswrapper[4702]: I1203 12:28:27.036841 4702 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sxzws\" (UniqueName: \"kubernetes.io/projected/11abc2e1-9333-4cac-85d7-20ac907fad69-kube-api-access-sxzws\") pod \"certified-operators-j57b5\" (UID: \"11abc2e1-9333-4cac-85d7-20ac907fad69\") " pod="openshift-marketplace/certified-operators-j57b5" Dec 03 12:28:27 crc kubenswrapper[4702]: I1203 12:28:27.036982 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11abc2e1-9333-4cac-85d7-20ac907fad69-catalog-content\") pod \"certified-operators-j57b5\" (UID: \"11abc2e1-9333-4cac-85d7-20ac907fad69\") " pod="openshift-marketplace/certified-operators-j57b5" Dec 03 12:28:27 crc kubenswrapper[4702]: I1203 12:28:27.037462 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11abc2e1-9333-4cac-85d7-20ac907fad69-catalog-content\") pod \"certified-operators-j57b5\" (UID: \"11abc2e1-9333-4cac-85d7-20ac907fad69\") " pod="openshift-marketplace/certified-operators-j57b5" Dec 03 12:28:27 crc kubenswrapper[4702]: I1203 12:28:27.064782 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxzws\" (UniqueName: \"kubernetes.io/projected/11abc2e1-9333-4cac-85d7-20ac907fad69-kube-api-access-sxzws\") pod \"certified-operators-j57b5\" (UID: \"11abc2e1-9333-4cac-85d7-20ac907fad69\") " pod="openshift-marketplace/certified-operators-j57b5" Dec 03 12:28:27 crc kubenswrapper[4702]: I1203 12:28:27.181988 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j57b5" Dec 03 12:28:27 crc kubenswrapper[4702]: I1203 12:28:27.777099 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j57b5"] Dec 03 12:28:28 crc kubenswrapper[4702]: I1203 12:28:28.074837 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j57b5" event={"ID":"11abc2e1-9333-4cac-85d7-20ac907fad69","Type":"ContainerStarted","Data":"9c557eac3593d7b9b0844a37038332512cdabf4cb72cf517c5d210a1090aff30"} Dec 03 12:28:29 crc kubenswrapper[4702]: I1203 12:28:29.088438 4702 generic.go:334] "Generic (PLEG): container finished" podID="11abc2e1-9333-4cac-85d7-20ac907fad69" containerID="d92d57161457fee1294bdc2fd4f53ac4aa84ccefe00654437ffb2b57503392da" exitCode=0 Dec 03 12:28:29 crc kubenswrapper[4702]: I1203 12:28:29.088538 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j57b5" event={"ID":"11abc2e1-9333-4cac-85d7-20ac907fad69","Type":"ContainerDied","Data":"d92d57161457fee1294bdc2fd4f53ac4aa84ccefe00654437ffb2b57503392da"} Dec 03 12:28:29 crc kubenswrapper[4702]: I1203 12:28:29.091400 4702 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 12:28:30 crc kubenswrapper[4702]: I1203 12:28:30.106378 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j57b5" event={"ID":"11abc2e1-9333-4cac-85d7-20ac907fad69","Type":"ContainerStarted","Data":"0201a47f502c144dff7747d9428ee45c8b2283a204928b51d0a07df4d5393299"} Dec 03 12:28:31 crc kubenswrapper[4702]: I1203 12:28:31.330249 4702 generic.go:334] "Generic (PLEG): container finished" podID="11abc2e1-9333-4cac-85d7-20ac907fad69" containerID="0201a47f502c144dff7747d9428ee45c8b2283a204928b51d0a07df4d5393299" exitCode=0 Dec 03 12:28:31 crc kubenswrapper[4702]: 
I1203 12:28:31.330572 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j57b5" event={"ID":"11abc2e1-9333-4cac-85d7-20ac907fad69","Type":"ContainerDied","Data":"0201a47f502c144dff7747d9428ee45c8b2283a204928b51d0a07df4d5393299"} Dec 03 12:28:33 crc kubenswrapper[4702]: I1203 12:28:33.355276 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j57b5" event={"ID":"11abc2e1-9333-4cac-85d7-20ac907fad69","Type":"ContainerStarted","Data":"d190842bbfe5a6ab9aee17d651260ddd314d72727417f3bb62d170fe7651622b"} Dec 03 12:28:33 crc kubenswrapper[4702]: I1203 12:28:33.380619 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j57b5" podStartSLOduration=4.106022248 podStartE2EDuration="7.380579613s" podCreationTimestamp="2025-12-03 12:28:26 +0000 UTC" firstStartedPulling="2025-12-03 12:28:29.091101077 +0000 UTC m=+5092.927029541" lastFinishedPulling="2025-12-03 12:28:32.365658442 +0000 UTC m=+5096.201586906" observedRunningTime="2025-12-03 12:28:33.378670909 +0000 UTC m=+5097.214599393" watchObservedRunningTime="2025-12-03 12:28:33.380579613 +0000 UTC m=+5097.216508077" Dec 03 12:28:37 crc kubenswrapper[4702]: I1203 12:28:37.184235 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j57b5" Dec 03 12:28:37 crc kubenswrapper[4702]: I1203 12:28:37.184950 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j57b5" Dec 03 12:28:37 crc kubenswrapper[4702]: I1203 12:28:37.245992 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j57b5" Dec 03 12:28:37 crc kubenswrapper[4702]: I1203 12:28:37.706641 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j57b5" Dec 03 12:28:38 crc kubenswrapper[4702]: I1203 12:28:38.017471 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j57b5"] Dec 03 12:28:39 crc kubenswrapper[4702]: I1203 12:28:39.437614 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j57b5" podUID="11abc2e1-9333-4cac-85d7-20ac907fad69" containerName="registry-server" containerID="cri-o://d190842bbfe5a6ab9aee17d651260ddd314d72727417f3bb62d170fe7651622b" gracePeriod=2 Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.076304 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j57b5" Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.192873 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11abc2e1-9333-4cac-85d7-20ac907fad69-utilities\") pod \"11abc2e1-9333-4cac-85d7-20ac907fad69\" (UID: \"11abc2e1-9333-4cac-85d7-20ac907fad69\") " Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.193146 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11abc2e1-9333-4cac-85d7-20ac907fad69-catalog-content\") pod \"11abc2e1-9333-4cac-85d7-20ac907fad69\" (UID: \"11abc2e1-9333-4cac-85d7-20ac907fad69\") " Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.193519 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxzws\" (UniqueName: \"kubernetes.io/projected/11abc2e1-9333-4cac-85d7-20ac907fad69-kube-api-access-sxzws\") pod \"11abc2e1-9333-4cac-85d7-20ac907fad69\" (UID: \"11abc2e1-9333-4cac-85d7-20ac907fad69\") " Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.195978 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11abc2e1-9333-4cac-85d7-20ac907fad69-utilities" (OuterVolumeSpecName: "utilities") pod "11abc2e1-9333-4cac-85d7-20ac907fad69" (UID: "11abc2e1-9333-4cac-85d7-20ac907fad69"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.202689 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11abc2e1-9333-4cac-85d7-20ac907fad69-kube-api-access-sxzws" (OuterVolumeSpecName: "kube-api-access-sxzws") pod "11abc2e1-9333-4cac-85d7-20ac907fad69" (UID: "11abc2e1-9333-4cac-85d7-20ac907fad69"). InnerVolumeSpecName "kube-api-access-sxzws". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.248172 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11abc2e1-9333-4cac-85d7-20ac907fad69-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11abc2e1-9333-4cac-85d7-20ac907fad69" (UID: "11abc2e1-9333-4cac-85d7-20ac907fad69"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.296666 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11abc2e1-9333-4cac-85d7-20ac907fad69-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.296716 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11abc2e1-9333-4cac-85d7-20ac907fad69-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.296734 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxzws\" (UniqueName: \"kubernetes.io/projected/11abc2e1-9333-4cac-85d7-20ac907fad69-kube-api-access-sxzws\") on node \"crc\" DevicePath \"\"" Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.452073 4702 generic.go:334] "Generic (PLEG): container finished" podID="11abc2e1-9333-4cac-85d7-20ac907fad69" containerID="d190842bbfe5a6ab9aee17d651260ddd314d72727417f3bb62d170fe7651622b" exitCode=0 Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.452202 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j57b5" Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.452208 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j57b5" event={"ID":"11abc2e1-9333-4cac-85d7-20ac907fad69","Type":"ContainerDied","Data":"d190842bbfe5a6ab9aee17d651260ddd314d72727417f3bb62d170fe7651622b"} Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.452591 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j57b5" event={"ID":"11abc2e1-9333-4cac-85d7-20ac907fad69","Type":"ContainerDied","Data":"9c557eac3593d7b9b0844a37038332512cdabf4cb72cf517c5d210a1090aff30"} Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.452615 4702 scope.go:117] "RemoveContainer" containerID="d190842bbfe5a6ab9aee17d651260ddd314d72727417f3bb62d170fe7651622b" Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.500231 4702 scope.go:117] "RemoveContainer" containerID="0201a47f502c144dff7747d9428ee45c8b2283a204928b51d0a07df4d5393299" Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.508985 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j57b5"] Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.523657 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j57b5"] Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.533596 4702 scope.go:117] "RemoveContainer" containerID="d92d57161457fee1294bdc2fd4f53ac4aa84ccefe00654437ffb2b57503392da" Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.601706 4702 scope.go:117] "RemoveContainer" containerID="d190842bbfe5a6ab9aee17d651260ddd314d72727417f3bb62d170fe7651622b" Dec 03 12:28:40 crc kubenswrapper[4702]: E1203 12:28:40.605252 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d190842bbfe5a6ab9aee17d651260ddd314d72727417f3bb62d170fe7651622b\": container with ID starting with d190842bbfe5a6ab9aee17d651260ddd314d72727417f3bb62d170fe7651622b not found: ID does not exist" containerID="d190842bbfe5a6ab9aee17d651260ddd314d72727417f3bb62d170fe7651622b" Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.605307 
4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d190842bbfe5a6ab9aee17d651260ddd314d72727417f3bb62d170fe7651622b"} err="failed to get container status \"d190842bbfe5a6ab9aee17d651260ddd314d72727417f3bb62d170fe7651622b\": rpc error: code = NotFound desc = could not find container \"d190842bbfe5a6ab9aee17d651260ddd314d72727417f3bb62d170fe7651622b\": container with ID starting with d190842bbfe5a6ab9aee17d651260ddd314d72727417f3bb62d170fe7651622b not found: ID does not exist" Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.605344 4702 scope.go:117] "RemoveContainer" containerID="0201a47f502c144dff7747d9428ee45c8b2283a204928b51d0a07df4d5393299" Dec 03 12:28:40 crc kubenswrapper[4702]: E1203 12:28:40.612534 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0201a47f502c144dff7747d9428ee45c8b2283a204928b51d0a07df4d5393299\": container with ID starting with 0201a47f502c144dff7747d9428ee45c8b2283a204928b51d0a07df4d5393299 not found: ID does not exist" containerID="0201a47f502c144dff7747d9428ee45c8b2283a204928b51d0a07df4d5393299" Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.612723 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0201a47f502c144dff7747d9428ee45c8b2283a204928b51d0a07df4d5393299"} err="failed to get container status \"0201a47f502c144dff7747d9428ee45c8b2283a204928b51d0a07df4d5393299\": rpc error: code = NotFound desc = could not find container \"0201a47f502c144dff7747d9428ee45c8b2283a204928b51d0a07df4d5393299\": container with ID starting with 0201a47f502c144dff7747d9428ee45c8b2283a204928b51d0a07df4d5393299 not found: ID does not exist" Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.613024 4702 scope.go:117] "RemoveContainer" containerID="d92d57161457fee1294bdc2fd4f53ac4aa84ccefe00654437ffb2b57503392da" Dec 03 12:28:40 crc kubenswrapper[4702]: E1203 12:28:40.620891 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d92d57161457fee1294bdc2fd4f53ac4aa84ccefe00654437ffb2b57503392da\": container with ID starting with d92d57161457fee1294bdc2fd4f53ac4aa84ccefe00654437ffb2b57503392da not found: ID does not exist" containerID="d92d57161457fee1294bdc2fd4f53ac4aa84ccefe00654437ffb2b57503392da" Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.620974 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d92d57161457fee1294bdc2fd4f53ac4aa84ccefe00654437ffb2b57503392da"} err="failed to get container status \"d92d57161457fee1294bdc2fd4f53ac4aa84ccefe00654437ffb2b57503392da\": rpc error: code = NotFound desc = could not find container \"d92d57161457fee1294bdc2fd4f53ac4aa84ccefe00654437ffb2b57503392da\": container with ID starting with d92d57161457fee1294bdc2fd4f53ac4aa84ccefe00654437ffb2b57503392da not found: ID does not exist" Dec 03 12:28:40 crc kubenswrapper[4702]: I1203 12:28:40.943628 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11abc2e1-9333-4cac-85d7-20ac907fad69" path="/var/lib/kubelet/pods/11abc2e1-9333-4cac-85d7-20ac907fad69/volumes"
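
The ContainerStatus/DeleteContainer NotFound errors above are benign: during teardown the kubelet re-requests removal of containers that CRI-O has already deleted, and treats NotFound as "already gone". A minimal sketch of that idempotent-delete pattern over gRPC status codes (the removeContainer stub stands in for the real CRI call; names are illustrative):

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // removeContainer stands in for the CRI RemoveContainer RPC; here it always
    // reports NotFound, mimicking the runtime responses in the log above.
    func removeContainer(id string) error {
        return status.Errorf(codes.NotFound, "could not find container %q", id)
    }

    // deleteIdempotent treats NotFound as success: a container that no longer
    // exists is, for cleanup purposes, already removed.
    func deleteIdempotent(id string) error {
        err := removeContainer(id)
        if status.Code(err) == codes.NotFound {
            fmt.Printf("container %s already gone: %v\n", id, err)
            return nil
        }
        return err
    }

    func main() {
        _ = deleteIdempotent("d190842bbfe5a6ab9aee17d651260ddd314d72727417f3bb62d170fe7651622b")
    }

Dec 03 12:28:48 crc kubenswrapper[4702]: E1203 12:28:48.155683 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Dec 03 12:28:48 crc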
kubenswrapper[4702]: E1203 12:28:48.156356 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Dec 03 12:28:55 crc kubenswrapper[4702]: I1203 12:28:55.908461 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:28:55 crc kubenswrapper[4702]: I1203 12:28:55.909066 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:28:55 crc kubenswrapper[4702]: I1203 12:28:55.909124 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 12:28:55 crc kubenswrapper[4702]: I1203 12:28:55.910242 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0de56d455f6a25d82d6775c8599fe924b6ccfb032b2e015fb228ca76142f5944"} pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:28:55 crc kubenswrapper[4702]: I1203 12:28:55.910301 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" containerID="cri-o://0de56d455f6a25d82d6775c8599fe924b6ccfb032b2e015fb228ca76142f5944" gracePeriod=600 Dec 03 12:28:56 crc kubenswrapper[4702]: I1203 12:28:56.772398 4702 generic.go:334] "Generic (PLEG): container finished" podID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerID="0de56d455f6a25d82d6775c8599fe924b6ccfb032b2e015fb228ca76142f5944" exitCode=0 Dec 03 12:28:56 crc kubenswrapper[4702]: I1203 12:28:56.772601 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerDied","Data":"0de56d455f6a25d82d6775c8599fe924b6ccfb032b2e015fb228ca76142f5944"} Dec 03 12:28:56 crc kubenswrapper[4702]: I1203 12:28:56.773144 4702 scope.go:117] "RemoveContainer" containerID="9aa7d603ae03f88d77a4669d33afbe32095bd87459890f980fbfef690b5dd4c6" Dec 03 12:28:57 crc kubenswrapper[4702]: I1203 12:28:57.798151 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42"}
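
The sequence above is the liveness-restart loop in miniature: an HTTP GET against http://127.0.0.1:8798/health is refused, the probe is marked unhealthy, and the kubelet kills and restarts the container under its 600s grace period. A self-contained sketch of the same style of check (URL from the log; the 1s timeout and function name are assumptions matching a typical httpGet probe, not kubelet internals):

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    // probeOnce performs a single HTTP liveness-style check. Like the kubelet's
    // httpGet prober, it treats transport errors and non-2xx/3xx codes as failure.
    func probeOnce(url string, timeout time.Duration) error {
        client := &http.Client{Timeout: timeout}
        resp, err := client.Get(url)
        if err != nil {
            return fmt.Errorf("probe failed: %w", err) // e.g. connect: connection refused
        }
        defer resp.Body.Close()
        if resp.StatusCode < 200 || resp.StatusCode >= 400 {
            return fmt.Errorf("probe failed: status %d", resp.StatusCode)
        }
        return nil
    }

    func main() {
        if err := probeOnce("http://127.0.0.1:8798/health", time.Second); err != nil {
            fmt.Println(err) // the log's "Probe failed" path
        }
    }

Dec 03 12:29:06 crc kubenswrapper[4702]: I1203 12:29:06.193188 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7pjcs"] Dec 03 12:29:06 crc kubenswrapper[4702]: E1203 12:29:06.194433 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11abc2e1-9333-4cac-85d7-20ac907fad69" containerName="extract-content" Dec 03 12:29:06 crc kubenswrapper[4702]: I1203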
12:29:06.194452 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="11abc2e1-9333-4cac-85d7-20ac907fad69" containerName="extract-content" Dec 03 12:29:06 crc kubenswrapper[4702]: E1203 12:29:06.194486 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11abc2e1-9333-4cac-85d7-20ac907fad69" containerName="registry-server" Dec 03 12:29:06 crc kubenswrapper[4702]: I1203 12:29:06.194494 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="11abc2e1-9333-4cac-85d7-20ac907fad69" containerName="registry-server" Dec 03 12:29:06 crc kubenswrapper[4702]: E1203 12:29:06.194532 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11abc2e1-9333-4cac-85d7-20ac907fad69" containerName="extract-utilities" Dec 03 12:29:06 crc kubenswrapper[4702]: I1203 12:29:06.194542 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="11abc2e1-9333-4cac-85d7-20ac907fad69" containerName="extract-utilities" Dec 03 12:29:06 crc kubenswrapper[4702]: I1203 12:29:06.194882 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="11abc2e1-9333-4cac-85d7-20ac907fad69" containerName="registry-server" Dec 03 12:29:06 crc kubenswrapper[4702]: I1203 12:29:06.197191 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7pjcs" Dec 03 12:29:06 crc kubenswrapper[4702]: I1203 12:29:06.219527 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7pjcs"] Dec 03 12:29:06 crc kubenswrapper[4702]: I1203 12:29:06.235917 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6f17b6d-803b-4c47-aec7-20642d228fe9-catalog-content\") pod \"redhat-operators-7pjcs\" (UID: \"b6f17b6d-803b-4c47-aec7-20642d228fe9\") " pod="openshift-marketplace/redhat-operators-7pjcs" Dec 03 12:29:06 crc kubenswrapper[4702]: I1203 12:29:06.236013 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np7fj\" (UniqueName: \"kubernetes.io/projected/b6f17b6d-803b-4c47-aec7-20642d228fe9-kube-api-access-np7fj\") pod \"redhat-operators-7pjcs\" (UID: \"b6f17b6d-803b-4c47-aec7-20642d228fe9\") " pod="openshift-marketplace/redhat-operators-7pjcs" Dec 03 12:29:06 crc kubenswrapper[4702]: I1203 12:29:06.236600 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6f17b6d-803b-4c47-aec7-20642d228fe9-utilities\") pod \"redhat-operators-7pjcs\" (UID: \"b6f17b6d-803b-4c47-aec7-20642d228fe9\") " pod="openshift-marketplace/redhat-operators-7pjcs" Dec 03 12:29:06 crc kubenswrapper[4702]: I1203 12:29:06.339416 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np7fj\" (UniqueName: \"kubernetes.io/projected/b6f17b6d-803b-4c47-aec7-20642d228fe9-kube-api-access-np7fj\") pod \"redhat-operators-7pjcs\" (UID: \"b6f17b6d-803b-4c47-aec7-20642d228fe9\") " pod="openshift-marketplace/redhat-operators-7pjcs" Dec 03 12:29:06 crc kubenswrapper[4702]: I1203 12:29:06.339671 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6f17b6d-803b-4c47-aec7-20642d228fe9-utilities\") pod \"redhat-operators-7pjcs\" (UID: \"b6f17b6d-803b-4c47-aec7-20642d228fe9\") " pod="openshift-marketplace/redhat-operators-7pjcs" Dec 03 12:29:06 crc 
kubenswrapper[4702]: I1203 12:29:06.339821 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6f17b6d-803b-4c47-aec7-20642d228fe9-catalog-content\") pod \"redhat-operators-7pjcs\" (UID: \"b6f17b6d-803b-4c47-aec7-20642d228fe9\") " pod="openshift-marketplace/redhat-operators-7pjcs" Dec 03 12:29:06 crc kubenswrapper[4702]: I1203 12:29:06.340425 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6f17b6d-803b-4c47-aec7-20642d228fe9-catalog-content\") pod \"redhat-operators-7pjcs\" (UID: \"b6f17b6d-803b-4c47-aec7-20642d228fe9\") " pod="openshift-marketplace/redhat-operators-7pjcs" Dec 03 12:29:06 crc kubenswrapper[4702]: I1203 12:29:06.340500 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6f17b6d-803b-4c47-aec7-20642d228fe9-utilities\") pod \"redhat-operators-7pjcs\" (UID: \"b6f17b6d-803b-4c47-aec7-20642d228fe9\") " pod="openshift-marketplace/redhat-operators-7pjcs" Dec 03 12:29:06 crc kubenswrapper[4702]: I1203 12:29:06.372708 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np7fj\" (UniqueName: \"kubernetes.io/projected/b6f17b6d-803b-4c47-aec7-20642d228fe9-kube-api-access-np7fj\") pod \"redhat-operators-7pjcs\" (UID: \"b6f17b6d-803b-4c47-aec7-20642d228fe9\") " pod="openshift-marketplace/redhat-operators-7pjcs" Dec 03 12:29:06 crc kubenswrapper[4702]: I1203 12:29:06.528532 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7pjcs" Dec 03 12:29:07 crc kubenswrapper[4702]: I1203 12:29:07.043208 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7pjcs"] Dec 03 12:29:07 crc kubenswrapper[4702]: I1203 12:29:07.972496 4702 generic.go:334] "Generic (PLEG): container finished" podID="b6f17b6d-803b-4c47-aec7-20642d228fe9" containerID="ad435074f3c3c7197209e05db63eef1360b954f600392c6cb7f9fe4198678633" exitCode=0 Dec 03 12:29:07 crc kubenswrapper[4702]: I1203 12:29:07.972586 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pjcs" event={"ID":"b6f17b6d-803b-4c47-aec7-20642d228fe9","Type":"ContainerDied","Data":"ad435074f3c3c7197209e05db63eef1360b954f600392c6cb7f9fe4198678633"} Dec 03 12:29:07 crc kubenswrapper[4702]: I1203 12:29:07.973045 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pjcs" event={"ID":"b6f17b6d-803b-4c47-aec7-20642d228fe9","Type":"ContainerStarted","Data":"7db5a747ad629eab5462d7c2514c9c2053ef8db0edcef01d01baaa6da4b875ff"} Dec 03 12:29:10 crc kubenswrapper[4702]: I1203 12:29:10.654167 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pjcs" event={"ID":"b6f17b6d-803b-4c47-aec7-20642d228fe9","Type":"ContainerStarted","Data":"8f3c08da7c66ab2966884a74178ac4e355e8c9316df5326abbbb4ab9928ef6ab"} Dec 03 12:29:14 crc kubenswrapper[4702]: I1203 12:29:14.936638 4702 generic.go:334] "Generic (PLEG): container finished" podID="b6f17b6d-803b-4c47-aec7-20642d228fe9" containerID="8f3c08da7c66ab2966884a74178ac4e355e8c9316df5326abbbb4ab9928ef6ab" exitCode=0 Dec 03 12:29:14 crc kubenswrapper[4702]: I1203 12:29:14.955830 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pjcs" 
event={"ID":"b6f17b6d-803b-4c47-aec7-20642d228fe9","Type":"ContainerDied","Data":"8f3c08da7c66ab2966884a74178ac4e355e8c9316df5326abbbb4ab9928ef6ab"} Dec 03 12:29:17 crc kubenswrapper[4702]: I1203 12:29:17.004207 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pjcs" event={"ID":"b6f17b6d-803b-4c47-aec7-20642d228fe9","Type":"ContainerStarted","Data":"a8efaa0da3c296ed5a43b16d4c1fcef9c3daec5bdcbaa043708799df2515cd53"} Dec 03 12:29:17 crc kubenswrapper[4702]: I1203 12:29:17.035295 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7pjcs" podStartSLOduration=3.370014801 podStartE2EDuration="11.035248636s" podCreationTimestamp="2025-12-03 12:29:06 +0000 UTC" firstStartedPulling="2025-12-03 12:29:07.974370715 +0000 UTC m=+5131.810299179" lastFinishedPulling="2025-12-03 12:29:15.63960455 +0000 UTC m=+5139.475533014" observedRunningTime="2025-12-03 12:29:17.026041823 +0000 UTC m=+5140.861970307" watchObservedRunningTime="2025-12-03 12:29:17.035248636 +0000 UTC m=+5140.871177100" Dec 03 12:29:26 crc kubenswrapper[4702]: I1203 12:29:26.529664 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7pjcs" Dec 03 12:29:26 crc kubenswrapper[4702]: I1203 12:29:26.531343 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7pjcs" Dec 03 12:29:27 crc kubenswrapper[4702]: I1203 12:29:27.594488 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7pjcs" podUID="b6f17b6d-803b-4c47-aec7-20642d228fe9" containerName="registry-server" probeResult="failure" output=< Dec 03 12:29:27 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:29:27 crc kubenswrapper[4702]: > Dec 03 12:29:36 crc kubenswrapper[4702]: I1203 12:29:36.805218 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7pjcs" Dec 03 12:29:36 crc kubenswrapper[4702]: I1203 12:29:36.868056 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7pjcs" Dec 03 12:29:37 crc kubenswrapper[4702]: I1203 12:29:37.394194 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7pjcs"] Dec 03 12:29:38 crc kubenswrapper[4702]: I1203 12:29:38.311016 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7pjcs" podUID="b6f17b6d-803b-4c47-aec7-20642d228fe9" containerName="registry-server" containerID="cri-o://a8efaa0da3c296ed5a43b16d4c1fcef9c3daec5bdcbaa043708799df2515cd53" gracePeriod=2 Dec 03 12:29:38 crc kubenswrapper[4702]: I1203 12:29:38.908188 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7pjcs" Dec 03 12:29:38 crc kubenswrapper[4702]: I1203 12:29:38.940109 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6f17b6d-803b-4c47-aec7-20642d228fe9-utilities\") pod \"b6f17b6d-803b-4c47-aec7-20642d228fe9\" (UID: \"b6f17b6d-803b-4c47-aec7-20642d228fe9\") " Dec 03 12:29:38 crc kubenswrapper[4702]: I1203 12:29:38.940358 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6f17b6d-803b-4c47-aec7-20642d228fe9-catalog-content\") pod \"b6f17b6d-803b-4c47-aec7-20642d228fe9\" (UID: \"b6f17b6d-803b-4c47-aec7-20642d228fe9\") " Dec 03 12:29:38 crc kubenswrapper[4702]: I1203 12:29:38.940459 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np7fj\" (UniqueName: \"kubernetes.io/projected/b6f17b6d-803b-4c47-aec7-20642d228fe9-kube-api-access-np7fj\") pod \"b6f17b6d-803b-4c47-aec7-20642d228fe9\" (UID: \"b6f17b6d-803b-4c47-aec7-20642d228fe9\") " Dec 03 12:29:38 crc kubenswrapper[4702]: I1203 12:29:38.941569 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6f17b6d-803b-4c47-aec7-20642d228fe9-utilities" (OuterVolumeSpecName: "utilities") pod "b6f17b6d-803b-4c47-aec7-20642d228fe9" (UID: "b6f17b6d-803b-4c47-aec7-20642d228fe9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:29:38 crc kubenswrapper[4702]: I1203 12:29:38.968610 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6f17b6d-803b-4c47-aec7-20642d228fe9-kube-api-access-np7fj" (OuterVolumeSpecName: "kube-api-access-np7fj") pod "b6f17b6d-803b-4c47-aec7-20642d228fe9" (UID: "b6f17b6d-803b-4c47-aec7-20642d228fe9"). InnerVolumeSpecName "kube-api-access-np7fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:29:39 crc kubenswrapper[4702]: I1203 12:29:39.044811 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6f17b6d-803b-4c47-aec7-20642d228fe9-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:29:39 crc kubenswrapper[4702]: I1203 12:29:39.044846 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np7fj\" (UniqueName: \"kubernetes.io/projected/b6f17b6d-803b-4c47-aec7-20642d228fe9-kube-api-access-np7fj\") on node \"crc\" DevicePath \"\"" Dec 03 12:29:39 crc kubenswrapper[4702]: I1203 12:29:39.084982 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6f17b6d-803b-4c47-aec7-20642d228fe9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6f17b6d-803b-4c47-aec7-20642d228fe9" (UID: "b6f17b6d-803b-4c47-aec7-20642d228fe9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:29:39 crc kubenswrapper[4702]: I1203 12:29:39.148570 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6f17b6d-803b-4c47-aec7-20642d228fe9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:29:39 crc kubenswrapper[4702]: I1203 12:29:39.324077 4702 generic.go:334] "Generic (PLEG): container finished" podID="b6f17b6d-803b-4c47-aec7-20642d228fe9" containerID="a8efaa0da3c296ed5a43b16d4c1fcef9c3daec5bdcbaa043708799df2515cd53" exitCode=0 Dec 03 12:29:39 crc kubenswrapper[4702]: I1203 12:29:39.324133 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pjcs" event={"ID":"b6f17b6d-803b-4c47-aec7-20642d228fe9","Type":"ContainerDied","Data":"a8efaa0da3c296ed5a43b16d4c1fcef9c3daec5bdcbaa043708799df2515cd53"} Dec 03 12:29:39 crc kubenswrapper[4702]: I1203 12:29:39.324146 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7pjcs" Dec 03 12:29:39 crc kubenswrapper[4702]: I1203 12:29:39.324176 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pjcs" event={"ID":"b6f17b6d-803b-4c47-aec7-20642d228fe9","Type":"ContainerDied","Data":"7db5a747ad629eab5462d7c2514c9c2053ef8db0edcef01d01baaa6da4b875ff"} Dec 03 12:29:39 crc kubenswrapper[4702]: I1203 12:29:39.324204 4702 scope.go:117] "RemoveContainer" containerID="a8efaa0da3c296ed5a43b16d4c1fcef9c3daec5bdcbaa043708799df2515cd53" Dec 03 12:29:39 crc kubenswrapper[4702]: I1203 12:29:39.364034 4702 scope.go:117] "RemoveContainer" containerID="8f3c08da7c66ab2966884a74178ac4e355e8c9316df5326abbbb4ab9928ef6ab" Dec 03 12:29:39 crc kubenswrapper[4702]: I1203 12:29:39.370585 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7pjcs"] Dec 03 12:29:39 crc kubenswrapper[4702]: I1203 12:29:39.393880 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7pjcs"] Dec 03 12:29:39 crc kubenswrapper[4702]: I1203 12:29:39.421215 4702 scope.go:117] "RemoveContainer" containerID="ad435074f3c3c7197209e05db63eef1360b954f600392c6cb7f9fe4198678633" Dec 03 12:29:39 crc kubenswrapper[4702]: I1203 12:29:39.468536 4702 scope.go:117] "RemoveContainer" containerID="a8efaa0da3c296ed5a43b16d4c1fcef9c3daec5bdcbaa043708799df2515cd53" Dec 03 12:29:39 crc kubenswrapper[4702]: E1203 12:29:39.469815 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8efaa0da3c296ed5a43b16d4c1fcef9c3daec5bdcbaa043708799df2515cd53\": container with ID starting with a8efaa0da3c296ed5a43b16d4c1fcef9c3daec5bdcbaa043708799df2515cd53 not found: ID does not exist" containerID="a8efaa0da3c296ed5a43b16d4c1fcef9c3daec5bdcbaa043708799df2515cd53" Dec 03 12:29:39 crc kubenswrapper[4702]: I1203 12:29:39.469871 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8efaa0da3c296ed5a43b16d4c1fcef9c3daec5bdcbaa043708799df2515cd53"} err="failed to get container status \"a8efaa0da3c296ed5a43b16d4c1fcef9c3daec5bdcbaa043708799df2515cd53\": rpc error: code = NotFound desc = could not find container \"a8efaa0da3c296ed5a43b16d4c1fcef9c3daec5bdcbaa043708799df2515cd53\": container with ID starting with a8efaa0da3c296ed5a43b16d4c1fcef9c3daec5bdcbaa043708799df2515cd53 not found: ID does not exist" Dec 03 12:29:39 crc 
kubenswrapper[4702]: I1203 12:29:39.469907 4702 scope.go:117] "RemoveContainer" containerID="8f3c08da7c66ab2966884a74178ac4e355e8c9316df5326abbbb4ab9928ef6ab" Dec 03 12:29:39 crc kubenswrapper[4702]: E1203 12:29:39.470615 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f3c08da7c66ab2966884a74178ac4e355e8c9316df5326abbbb4ab9928ef6ab\": container with ID starting with 8f3c08da7c66ab2966884a74178ac4e355e8c9316df5326abbbb4ab9928ef6ab not found: ID does not exist" containerID="8f3c08da7c66ab2966884a74178ac4e355e8c9316df5326abbbb4ab9928ef6ab" Dec 03 12:29:39 crc kubenswrapper[4702]: I1203 12:29:39.470657 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f3c08da7c66ab2966884a74178ac4e355e8c9316df5326abbbb4ab9928ef6ab"} err="failed to get container status \"8f3c08da7c66ab2966884a74178ac4e355e8c9316df5326abbbb4ab9928ef6ab\": rpc error: code = NotFound desc = could not find container \"8f3c08da7c66ab2966884a74178ac4e355e8c9316df5326abbbb4ab9928ef6ab\": container with ID starting with 8f3c08da7c66ab2966884a74178ac4e355e8c9316df5326abbbb4ab9928ef6ab not found: ID does not exist" Dec 03 12:29:39 crc kubenswrapper[4702]: I1203 12:29:39.470676 4702 scope.go:117] "RemoveContainer" containerID="ad435074f3c3c7197209e05db63eef1360b954f600392c6cb7f9fe4198678633" Dec 03 12:29:39 crc kubenswrapper[4702]: E1203 12:29:39.470958 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad435074f3c3c7197209e05db63eef1360b954f600392c6cb7f9fe4198678633\": container with ID starting with ad435074f3c3c7197209e05db63eef1360b954f600392c6cb7f9fe4198678633 not found: ID does not exist" containerID="ad435074f3c3c7197209e05db63eef1360b954f600392c6cb7f9fe4198678633" Dec 03 12:29:39 crc kubenswrapper[4702]: I1203 12:29:39.470985 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad435074f3c3c7197209e05db63eef1360b954f600392c6cb7f9fe4198678633"} err="failed to get container status \"ad435074f3c3c7197209e05db63eef1360b954f600392c6cb7f9fe4198678633\": rpc error: code = NotFound desc = could not find container \"ad435074f3c3c7197209e05db63eef1360b954f600392c6cb7f9fe4198678633\": container with ID starting with ad435074f3c3c7197209e05db63eef1360b954f600392c6cb7f9fe4198678633 not found: ID does not exist" Dec 03 12:29:40 crc kubenswrapper[4702]: I1203 12:29:40.952383 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6f17b6d-803b-4c47-aec7-20642d228fe9" path="/var/lib/kubelet/pods/b6f17b6d-803b-4c47-aec7-20642d228fe9/volumes" Dec 03 12:29:56 crc kubenswrapper[4702]: I1203 12:29:56.017517 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lns5d"] Dec 03 12:29:56 crc kubenswrapper[4702]: E1203 12:29:56.019413 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f17b6d-803b-4c47-aec7-20642d228fe9" containerName="extract-utilities" Dec 03 12:29:56 crc kubenswrapper[4702]: I1203 12:29:56.019433 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f17b6d-803b-4c47-aec7-20642d228fe9" containerName="extract-utilities" Dec 03 12:29:56 crc kubenswrapper[4702]: E1203 12:29:56.019469 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f17b6d-803b-4c47-aec7-20642d228fe9" containerName="registry-server" Dec 03 12:29:56 crc kubenswrapper[4702]: I1203 12:29:56.019477 4702 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f17b6d-803b-4c47-aec7-20642d228fe9" containerName="registry-server" Dec 03 12:29:56 crc kubenswrapper[4702]: E1203 12:29:56.019503 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f17b6d-803b-4c47-aec7-20642d228fe9" containerName="extract-content" Dec 03 12:29:56 crc kubenswrapper[4702]: I1203 12:29:56.019511 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f17b6d-803b-4c47-aec7-20642d228fe9" containerName="extract-content" Dec 03 12:29:56 crc kubenswrapper[4702]: I1203 12:29:56.019833 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6f17b6d-803b-4c47-aec7-20642d228fe9" containerName="registry-server" Dec 03 12:29:56 crc kubenswrapper[4702]: I1203 12:29:56.024118 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lns5d" Dec 03 12:29:56 crc kubenswrapper[4702]: I1203 12:29:56.036891 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lns5d"] Dec 03 12:29:56 crc kubenswrapper[4702]: I1203 12:29:56.156311 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440e3a18-bdd4-4bc3-a084-8c429802d2d5-utilities\") pod \"redhat-marketplace-lns5d\" (UID: \"440e3a18-bdd4-4bc3-a084-8c429802d2d5\") " pod="openshift-marketplace/redhat-marketplace-lns5d" Dec 03 12:29:56 crc kubenswrapper[4702]: I1203 12:29:56.156702 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7fw4\" (UniqueName: \"kubernetes.io/projected/440e3a18-bdd4-4bc3-a084-8c429802d2d5-kube-api-access-g7fw4\") pod \"redhat-marketplace-lns5d\" (UID: \"440e3a18-bdd4-4bc3-a084-8c429802d2d5\") " pod="openshift-marketplace/redhat-marketplace-lns5d" Dec 03 12:29:56 crc kubenswrapper[4702]: I1203 12:29:56.156734 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440e3a18-bdd4-4bc3-a084-8c429802d2d5-catalog-content\") pod \"redhat-marketplace-lns5d\" (UID: \"440e3a18-bdd4-4bc3-a084-8c429802d2d5\") " pod="openshift-marketplace/redhat-marketplace-lns5d" Dec 03 12:29:56 crc kubenswrapper[4702]: I1203 12:29:56.259106 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7fw4\" (UniqueName: \"kubernetes.io/projected/440e3a18-bdd4-4bc3-a084-8c429802d2d5-kube-api-access-g7fw4\") pod \"redhat-marketplace-lns5d\" (UID: \"440e3a18-bdd4-4bc3-a084-8c429802d2d5\") " pod="openshift-marketplace/redhat-marketplace-lns5d" Dec 03 12:29:56 crc kubenswrapper[4702]: I1203 12:29:56.259528 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440e3a18-bdd4-4bc3-a084-8c429802d2d5-catalog-content\") pod \"redhat-marketplace-lns5d\" (UID: \"440e3a18-bdd4-4bc3-a084-8c429802d2d5\") " pod="openshift-marketplace/redhat-marketplace-lns5d" Dec 03 12:29:56 crc kubenswrapper[4702]: I1203 12:29:56.259828 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440e3a18-bdd4-4bc3-a084-8c429802d2d5-utilities\") pod \"redhat-marketplace-lns5d\" (UID: \"440e3a18-bdd4-4bc3-a084-8c429802d2d5\") " pod="openshift-marketplace/redhat-marketplace-lns5d" Dec 03 12:29:56 crc 
kubenswrapper[4702]: I1203 12:29:56.260395 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440e3a18-bdd4-4bc3-a084-8c429802d2d5-catalog-content\") pod \"redhat-marketplace-lns5d\" (UID: \"440e3a18-bdd4-4bc3-a084-8c429802d2d5\") " pod="openshift-marketplace/redhat-marketplace-lns5d" Dec 03 12:29:56 crc kubenswrapper[4702]: I1203 12:29:56.260444 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440e3a18-bdd4-4bc3-a084-8c429802d2d5-utilities\") pod \"redhat-marketplace-lns5d\" (UID: \"440e3a18-bdd4-4bc3-a084-8c429802d2d5\") " pod="openshift-marketplace/redhat-marketplace-lns5d" Dec 03 12:29:56 crc kubenswrapper[4702]: I1203 12:29:56.282464 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7fw4\" (UniqueName: \"kubernetes.io/projected/440e3a18-bdd4-4bc3-a084-8c429802d2d5-kube-api-access-g7fw4\") pod \"redhat-marketplace-lns5d\" (UID: \"440e3a18-bdd4-4bc3-a084-8c429802d2d5\") " pod="openshift-marketplace/redhat-marketplace-lns5d" Dec 03 12:29:56 crc kubenswrapper[4702]: I1203 12:29:56.348537 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lns5d" Dec 03 12:29:57 crc kubenswrapper[4702]: I1203 12:29:57.192337 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lns5d"] Dec 03 12:29:57 crc kubenswrapper[4702]: I1203 12:29:57.798359 4702 generic.go:334] "Generic (PLEG): container finished" podID="440e3a18-bdd4-4bc3-a084-8c429802d2d5" containerID="578f72ff5a4a30122256283e1cafcc1f5390ac5b66df189630ea1d20c74ee9ac" exitCode=0 Dec 03 12:29:57 crc kubenswrapper[4702]: I1203 12:29:57.798447 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lns5d" event={"ID":"440e3a18-bdd4-4bc3-a084-8c429802d2d5","Type":"ContainerDied","Data":"578f72ff5a4a30122256283e1cafcc1f5390ac5b66df189630ea1d20c74ee9ac"} Dec 03 12:29:57 crc kubenswrapper[4702]: I1203 12:29:57.798900 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lns5d" event={"ID":"440e3a18-bdd4-4bc3-a084-8c429802d2d5","Type":"ContainerStarted","Data":"c0083cef0a0808c5191a6587443040bd82d0bf5e30a81d74d9e62e84d5a4f3c5"} Dec 03 12:29:58 crc kubenswrapper[4702]: I1203 12:29:58.814753 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lns5d" event={"ID":"440e3a18-bdd4-4bc3-a084-8c429802d2d5","Type":"ContainerStarted","Data":"ae1506c02b12f7506a634066841f33be65a6114bd5186c4376498f05912f61a1"} Dec 03 12:29:59 crc kubenswrapper[4702]: I1203 12:29:59.827778 4702 generic.go:334] "Generic (PLEG): container finished" podID="440e3a18-bdd4-4bc3-a084-8c429802d2d5" containerID="ae1506c02b12f7506a634066841f33be65a6114bd5186c4376498f05912f61a1" exitCode=0 Dec 03 12:29:59 crc kubenswrapper[4702]: I1203 12:29:59.827825 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lns5d" event={"ID":"440e3a18-bdd4-4bc3-a084-8c429802d2d5","Type":"ContainerDied","Data":"ae1506c02b12f7506a634066841f33be65a6114bd5186c4376498f05912f61a1"} Dec 03 12:30:00 crc kubenswrapper[4702]: I1203 12:30:00.162849 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412750-nxr8l"] Dec 03 12:30:00 crc kubenswrapper[4702]: I1203 
12:30:00.164825 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-nxr8l" Dec 03 12:30:00 crc kubenswrapper[4702]: I1203 12:30:00.168020 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 12:30:00 crc kubenswrapper[4702]: I1203 12:30:00.174520 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 12:30:00 crc kubenswrapper[4702]: I1203 12:30:00.185565 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412750-nxr8l"] Dec 03 12:30:00 crc kubenswrapper[4702]: I1203 12:30:00.318679 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5s5t\" (UniqueName: \"kubernetes.io/projected/75744f67-47a4-49d7-9465-aacdea59960f-kube-api-access-f5s5t\") pod \"collect-profiles-29412750-nxr8l\" (UID: \"75744f67-47a4-49d7-9465-aacdea59960f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-nxr8l" Dec 03 12:30:00 crc kubenswrapper[4702]: I1203 12:30:00.319584 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75744f67-47a4-49d7-9465-aacdea59960f-secret-volume\") pod \"collect-profiles-29412750-nxr8l\" (UID: \"75744f67-47a4-49d7-9465-aacdea59960f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-nxr8l" Dec 03 12:30:00 crc kubenswrapper[4702]: I1203 12:30:00.319870 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75744f67-47a4-49d7-9465-aacdea59960f-config-volume\") pod \"collect-profiles-29412750-nxr8l\" (UID: \"75744f67-47a4-49d7-9465-aacdea59960f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-nxr8l" Dec 03 12:30:00 crc kubenswrapper[4702]: I1203 12:30:00.421187 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5s5t\" (UniqueName: \"kubernetes.io/projected/75744f67-47a4-49d7-9465-aacdea59960f-kube-api-access-f5s5t\") pod \"collect-profiles-29412750-nxr8l\" (UID: \"75744f67-47a4-49d7-9465-aacdea59960f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-nxr8l" Dec 03 12:30:00 crc kubenswrapper[4702]: I1203 12:30:00.421367 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75744f67-47a4-49d7-9465-aacdea59960f-secret-volume\") pod \"collect-profiles-29412750-nxr8l\" (UID: \"75744f67-47a4-49d7-9465-aacdea59960f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-nxr8l" Dec 03 12:30:00 crc kubenswrapper[4702]: I1203 12:30:00.421422 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75744f67-47a4-49d7-9465-aacdea59960f-config-volume\") pod \"collect-profiles-29412750-nxr8l\" (UID: \"75744f67-47a4-49d7-9465-aacdea59960f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-nxr8l" Dec 03 12:30:00 crc kubenswrapper[4702]: I1203 12:30:00.422569 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/75744f67-47a4-49d7-9465-aacdea59960f-config-volume\") pod \"collect-profiles-29412750-nxr8l\" (UID: \"75744f67-47a4-49d7-9465-aacdea59960f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-nxr8l" Dec 03 12:30:00 crc kubenswrapper[4702]: I1203 12:30:00.428651 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75744f67-47a4-49d7-9465-aacdea59960f-secret-volume\") pod \"collect-profiles-29412750-nxr8l\" (UID: \"75744f67-47a4-49d7-9465-aacdea59960f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-nxr8l" Dec 03 12:30:00 crc kubenswrapper[4702]: I1203 12:30:00.441144 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5s5t\" (UniqueName: \"kubernetes.io/projected/75744f67-47a4-49d7-9465-aacdea59960f-kube-api-access-f5s5t\") pod \"collect-profiles-29412750-nxr8l\" (UID: \"75744f67-47a4-49d7-9465-aacdea59960f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-nxr8l" Dec 03 12:30:00 crc kubenswrapper[4702]: I1203 12:30:00.490426 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-nxr8l" Dec 03 12:30:01 crc kubenswrapper[4702]: I1203 12:30:01.059667 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412750-nxr8l"] Dec 03 12:30:01 crc kubenswrapper[4702]: W1203 12:30:01.065949 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75744f67_47a4_49d7_9465_aacdea59960f.slice/crio-10f84cfe203331ee7234fc5ed283f593563479ea4c137f2c631cf95f92d0c027 WatchSource:0}: Error finding container 10f84cfe203331ee7234fc5ed283f593563479ea4c137f2c631cf95f92d0c027: Status 404 returned error can't find the container with id 10f84cfe203331ee7234fc5ed283f593563479ea4c137f2c631cf95f92d0c027 Dec 03 12:30:01 crc kubenswrapper[4702]: I1203 12:30:01.886134 4702 generic.go:334] "Generic (PLEG): container finished" podID="75744f67-47a4-49d7-9465-aacdea59960f" containerID="77e7f87a4987e3def07ca06ba41b4c3154c6e1b4947b9c6c2eb0f6dae7c6ec5f" exitCode=0 Dec 03 12:30:01 crc kubenswrapper[4702]: I1203 12:30:01.886838 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-nxr8l" event={"ID":"75744f67-47a4-49d7-9465-aacdea59960f","Type":"ContainerDied","Data":"77e7f87a4987e3def07ca06ba41b4c3154c6e1b4947b9c6c2eb0f6dae7c6ec5f"} Dec 03 12:30:01 crc kubenswrapper[4702]: I1203 12:30:01.886935 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-nxr8l" event={"ID":"75744f67-47a4-49d7-9465-aacdea59960f","Type":"ContainerStarted","Data":"10f84cfe203331ee7234fc5ed283f593563479ea4c137f2c631cf95f92d0c027"} Dec 03 12:30:01 crc kubenswrapper[4702]: I1203 12:30:01.890622 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lns5d" event={"ID":"440e3a18-bdd4-4bc3-a084-8c429802d2d5","Type":"ContainerStarted","Data":"68ee07d1990e19585e8a8c985fcdb24d87eb792da6cb605b90375685700e9e02"} Dec 03 12:30:03 crc kubenswrapper[4702]: I1203 12:30:03.565028 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-nxr8l" Dec 03 12:30:03 crc kubenswrapper[4702]: I1203 12:30:03.591627 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lns5d" podStartSLOduration=4.855065786 podStartE2EDuration="8.591602323s" podCreationTimestamp="2025-12-03 12:29:55 +0000 UTC" firstStartedPulling="2025-12-03 12:29:57.801200996 +0000 UTC m=+5181.637129460" lastFinishedPulling="2025-12-03 12:30:01.537737533 +0000 UTC m=+5185.373665997" observedRunningTime="2025-12-03 12:30:01.932222761 +0000 UTC m=+5185.768151225" watchObservedRunningTime="2025-12-03 12:30:03.591602323 +0000 UTC m=+5187.427530787" Dec 03 12:30:03 crc kubenswrapper[4702]: I1203 12:30:03.729419 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75744f67-47a4-49d7-9465-aacdea59960f-config-volume\") pod \"75744f67-47a4-49d7-9465-aacdea59960f\" (UID: \"75744f67-47a4-49d7-9465-aacdea59960f\") " Dec 03 12:30:03 crc kubenswrapper[4702]: I1203 12:30:03.729586 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5s5t\" (UniqueName: \"kubernetes.io/projected/75744f67-47a4-49d7-9465-aacdea59960f-kube-api-access-f5s5t\") pod \"75744f67-47a4-49d7-9465-aacdea59960f\" (UID: \"75744f67-47a4-49d7-9465-aacdea59960f\") " Dec 03 12:30:03 crc kubenswrapper[4702]: I1203 12:30:03.729812 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75744f67-47a4-49d7-9465-aacdea59960f-secret-volume\") pod \"75744f67-47a4-49d7-9465-aacdea59960f\" (UID: \"75744f67-47a4-49d7-9465-aacdea59960f\") " Dec 03 12:30:03 crc kubenswrapper[4702]: I1203 12:30:03.730423 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75744f67-47a4-49d7-9465-aacdea59960f-config-volume" (OuterVolumeSpecName: "config-volume") pod "75744f67-47a4-49d7-9465-aacdea59960f" (UID: "75744f67-47a4-49d7-9465-aacdea59960f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:30:03 crc kubenswrapper[4702]: I1203 12:30:03.732682 4702 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75744f67-47a4-49d7-9465-aacdea59960f-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 12:30:03 crc kubenswrapper[4702]: I1203 12:30:03.738969 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75744f67-47a4-49d7-9465-aacdea59960f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "75744f67-47a4-49d7-9465-aacdea59960f" (UID: "75744f67-47a4-49d7-9465-aacdea59960f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:30:03 crc kubenswrapper[4702]: I1203 12:30:03.741185 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75744f67-47a4-49d7-9465-aacdea59960f-kube-api-access-f5s5t" (OuterVolumeSpecName: "kube-api-access-f5s5t") pod "75744f67-47a4-49d7-9465-aacdea59960f" (UID: "75744f67-47a4-49d7-9465-aacdea59960f"). InnerVolumeSpecName "kube-api-access-f5s5t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:30:03 crc kubenswrapper[4702]: I1203 12:30:03.835472 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5s5t\" (UniqueName: \"kubernetes.io/projected/75744f67-47a4-49d7-9465-aacdea59960f-kube-api-access-f5s5t\") on node \"crc\" DevicePath \"\"" Dec 03 12:30:03 crc kubenswrapper[4702]: I1203 12:30:03.835515 4702 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75744f67-47a4-49d7-9465-aacdea59960f-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 12:30:03 crc kubenswrapper[4702]: I1203 12:30:03.919322 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-nxr8l" event={"ID":"75744f67-47a4-49d7-9465-aacdea59960f","Type":"ContainerDied","Data":"10f84cfe203331ee7234fc5ed283f593563479ea4c137f2c631cf95f92d0c027"} Dec 03 12:30:03 crc kubenswrapper[4702]: I1203 12:30:03.919375 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10f84cfe203331ee7234fc5ed283f593563479ea4c137f2c631cf95f92d0c027" Dec 03 12:30:03 crc kubenswrapper[4702]: I1203 12:30:03.919445 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412750-nxr8l" Dec 03 12:30:04 crc kubenswrapper[4702]: I1203 12:30:04.665978 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412705-zsdl5"] Dec 03 12:30:04 crc kubenswrapper[4702]: I1203 12:30:04.680303 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412705-zsdl5"] Dec 03 12:30:04 crc kubenswrapper[4702]: I1203 12:30:04.945505 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10dc69dd-f84f-456f-b7af-6289897508f9" path="/var/lib/kubelet/pods/10dc69dd-f84f-456f-b7af-6289897508f9/volumes" Dec 03 12:30:06 crc kubenswrapper[4702]: I1203 12:30:06.349413 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lns5d" Dec 03 12:30:06 crc kubenswrapper[4702]: I1203 12:30:06.349478 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lns5d" Dec 03 12:30:06 crc kubenswrapper[4702]: I1203 12:30:06.427901 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lns5d" Dec 03 12:30:07 crc kubenswrapper[4702]: I1203 12:30:07.033683 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lns5d" Dec 03 12:30:07 crc kubenswrapper[4702]: I1203 12:30:07.421701 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lns5d"] Dec 03 12:30:08 crc kubenswrapper[4702]: I1203 12:30:08.997092 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lns5d" podUID="440e3a18-bdd4-4bc3-a084-8c429802d2d5" containerName="registry-server" containerID="cri-o://68ee07d1990e19585e8a8c985fcdb24d87eb792da6cb605b90375685700e9e02" gracePeriod=2 Dec 03 12:30:09 crc kubenswrapper[4702]: I1203 12:30:09.647856 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lns5d" Dec 03 12:30:09 crc kubenswrapper[4702]: I1203 12:30:09.709441 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440e3a18-bdd4-4bc3-a084-8c429802d2d5-utilities\") pod \"440e3a18-bdd4-4bc3-a084-8c429802d2d5\" (UID: \"440e3a18-bdd4-4bc3-a084-8c429802d2d5\") " Dec 03 12:30:09 crc kubenswrapper[4702]: I1203 12:30:09.710949 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/440e3a18-bdd4-4bc3-a084-8c429802d2d5-utilities" (OuterVolumeSpecName: "utilities") pod "440e3a18-bdd4-4bc3-a084-8c429802d2d5" (UID: "440e3a18-bdd4-4bc3-a084-8c429802d2d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:30:09 crc kubenswrapper[4702]: I1203 12:30:09.811058 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7fw4\" (UniqueName: \"kubernetes.io/projected/440e3a18-bdd4-4bc3-a084-8c429802d2d5-kube-api-access-g7fw4\") pod \"440e3a18-bdd4-4bc3-a084-8c429802d2d5\" (UID: \"440e3a18-bdd4-4bc3-a084-8c429802d2d5\") " Dec 03 12:30:09 crc kubenswrapper[4702]: I1203 12:30:09.811121 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440e3a18-bdd4-4bc3-a084-8c429802d2d5-catalog-content\") pod \"440e3a18-bdd4-4bc3-a084-8c429802d2d5\" (UID: \"440e3a18-bdd4-4bc3-a084-8c429802d2d5\") " Dec 03 12:30:09 crc kubenswrapper[4702]: I1203 12:30:09.812061 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/440e3a18-bdd4-4bc3-a084-8c429802d2d5-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:30:09 crc kubenswrapper[4702]: I1203 12:30:09.819825 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/440e3a18-bdd4-4bc3-a084-8c429802d2d5-kube-api-access-g7fw4" (OuterVolumeSpecName: "kube-api-access-g7fw4") pod "440e3a18-bdd4-4bc3-a084-8c429802d2d5" (UID: "440e3a18-bdd4-4bc3-a084-8c429802d2d5"). InnerVolumeSpecName "kube-api-access-g7fw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:30:09 crc kubenswrapper[4702]: I1203 12:30:09.832459 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/440e3a18-bdd4-4bc3-a084-8c429802d2d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "440e3a18-bdd4-4bc3-a084-8c429802d2d5" (UID: "440e3a18-bdd4-4bc3-a084-8c429802d2d5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:30:09 crc kubenswrapper[4702]: I1203 12:30:09.915237 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7fw4\" (UniqueName: \"kubernetes.io/projected/440e3a18-bdd4-4bc3-a084-8c429802d2d5-kube-api-access-g7fw4\") on node \"crc\" DevicePath \"\"" Dec 03 12:30:09 crc kubenswrapper[4702]: I1203 12:30:09.915471 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/440e3a18-bdd4-4bc3-a084-8c429802d2d5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:30:10 crc kubenswrapper[4702]: I1203 12:30:10.010896 4702 generic.go:334] "Generic (PLEG): container finished" podID="440e3a18-bdd4-4bc3-a084-8c429802d2d5" containerID="68ee07d1990e19585e8a8c985fcdb24d87eb792da6cb605b90375685700e9e02" exitCode=0 Dec 03 12:30:10 crc kubenswrapper[4702]: I1203 12:30:10.010984 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lns5d" Dec 03 12:30:10 crc kubenswrapper[4702]: I1203 12:30:10.013816 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lns5d" event={"ID":"440e3a18-bdd4-4bc3-a084-8c429802d2d5","Type":"ContainerDied","Data":"68ee07d1990e19585e8a8c985fcdb24d87eb792da6cb605b90375685700e9e02"} Dec 03 12:30:10 crc kubenswrapper[4702]: I1203 12:30:10.014036 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lns5d" event={"ID":"440e3a18-bdd4-4bc3-a084-8c429802d2d5","Type":"ContainerDied","Data":"c0083cef0a0808c5191a6587443040bd82d0bf5e30a81d74d9e62e84d5a4f3c5"} Dec 03 12:30:10 crc kubenswrapper[4702]: I1203 12:30:10.014152 4702 scope.go:117] "RemoveContainer" containerID="68ee07d1990e19585e8a8c985fcdb24d87eb792da6cb605b90375685700e9e02" Dec 03 12:30:10 crc kubenswrapper[4702]: I1203 12:30:10.046569 4702 scope.go:117] "RemoveContainer" containerID="ae1506c02b12f7506a634066841f33be65a6114bd5186c4376498f05912f61a1" Dec 03 12:30:10 crc kubenswrapper[4702]: I1203 12:30:10.072053 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lns5d"] Dec 03 12:30:10 crc kubenswrapper[4702]: I1203 12:30:10.073974 4702 scope.go:117] "RemoveContainer" containerID="578f72ff5a4a30122256283e1cafcc1f5390ac5b66df189630ea1d20c74ee9ac" Dec 03 12:30:10 crc kubenswrapper[4702]: I1203 12:30:10.079084 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lns5d"] Dec 03 12:30:10 crc kubenswrapper[4702]: I1203 12:30:10.131886 4702 scope.go:117] "RemoveContainer" containerID="68ee07d1990e19585e8a8c985fcdb24d87eb792da6cb605b90375685700e9e02" Dec 03 12:30:10 crc kubenswrapper[4702]: E1203 12:30:10.132514 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68ee07d1990e19585e8a8c985fcdb24d87eb792da6cb605b90375685700e9e02\": container with ID starting with 68ee07d1990e19585e8a8c985fcdb24d87eb792da6cb605b90375685700e9e02 not found: ID does not exist" containerID="68ee07d1990e19585e8a8c985fcdb24d87eb792da6cb605b90375685700e9e02" Dec 03 12:30:10 crc kubenswrapper[4702]: I1203 12:30:10.132562 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68ee07d1990e19585e8a8c985fcdb24d87eb792da6cb605b90375685700e9e02"} err="failed to get container status 
\"68ee07d1990e19585e8a8c985fcdb24d87eb792da6cb605b90375685700e9e02\": rpc error: code = NotFound desc = could not find container \"68ee07d1990e19585e8a8c985fcdb24d87eb792da6cb605b90375685700e9e02\": container with ID starting with 68ee07d1990e19585e8a8c985fcdb24d87eb792da6cb605b90375685700e9e02 not found: ID does not exist" Dec 03 12:30:10 crc kubenswrapper[4702]: I1203 12:30:10.132591 4702 scope.go:117] "RemoveContainer" containerID="ae1506c02b12f7506a634066841f33be65a6114bd5186c4376498f05912f61a1" Dec 03 12:30:10 crc kubenswrapper[4702]: E1203 12:30:10.133098 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae1506c02b12f7506a634066841f33be65a6114bd5186c4376498f05912f61a1\": container with ID starting with ae1506c02b12f7506a634066841f33be65a6114bd5186c4376498f05912f61a1 not found: ID does not exist" containerID="ae1506c02b12f7506a634066841f33be65a6114bd5186c4376498f05912f61a1" Dec 03 12:30:10 crc kubenswrapper[4702]: I1203 12:30:10.133150 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae1506c02b12f7506a634066841f33be65a6114bd5186c4376498f05912f61a1"} err="failed to get container status \"ae1506c02b12f7506a634066841f33be65a6114bd5186c4376498f05912f61a1\": rpc error: code = NotFound desc = could not find container \"ae1506c02b12f7506a634066841f33be65a6114bd5186c4376498f05912f61a1\": container with ID starting with ae1506c02b12f7506a634066841f33be65a6114bd5186c4376498f05912f61a1 not found: ID does not exist" Dec 03 12:30:10 crc kubenswrapper[4702]: I1203 12:30:10.133188 4702 scope.go:117] "RemoveContainer" containerID="578f72ff5a4a30122256283e1cafcc1f5390ac5b66df189630ea1d20c74ee9ac" Dec 03 12:30:10 crc kubenswrapper[4702]: E1203 12:30:10.133558 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"578f72ff5a4a30122256283e1cafcc1f5390ac5b66df189630ea1d20c74ee9ac\": container with ID starting with 578f72ff5a4a30122256283e1cafcc1f5390ac5b66df189630ea1d20c74ee9ac not found: ID does not exist" containerID="578f72ff5a4a30122256283e1cafcc1f5390ac5b66df189630ea1d20c74ee9ac" Dec 03 12:30:10 crc kubenswrapper[4702]: I1203 12:30:10.133586 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"578f72ff5a4a30122256283e1cafcc1f5390ac5b66df189630ea1d20c74ee9ac"} err="failed to get container status \"578f72ff5a4a30122256283e1cafcc1f5390ac5b66df189630ea1d20c74ee9ac\": rpc error: code = NotFound desc = could not find container \"578f72ff5a4a30122256283e1cafcc1f5390ac5b66df189630ea1d20c74ee9ac\": container with ID starting with 578f72ff5a4a30122256283e1cafcc1f5390ac5b66df189630ea1d20c74ee9ac not found: ID does not exist" Dec 03 12:30:10 crc kubenswrapper[4702]: I1203 12:30:10.945416 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="440e3a18-bdd4-4bc3-a084-8c429802d2d5" path="/var/lib/kubelet/pods/440e3a18-bdd4-4bc3-a084-8c429802d2d5/volumes" Dec 03 12:30:29 crc kubenswrapper[4702]: I1203 12:30:29.975740 4702 scope.go:117] "RemoveContainer" containerID="6eeca76c4f6234cc177b88036a61469bd0a0f23a018c94775739582c9d86e78d" Dec 03 12:31:25 crc kubenswrapper[4702]: I1203 12:31:25.913240 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Dec 03 12:31:25 crc kubenswrapper[4702]: I1203 12:31:25.921804 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:31:55 crc kubenswrapper[4702]: I1203 12:31:55.907677 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:31:55 crc kubenswrapper[4702]: I1203 12:31:55.908269 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:32:25 crc kubenswrapper[4702]: I1203 12:32:25.908054 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:32:25 crc kubenswrapper[4702]: I1203 12:32:25.910169 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:32:25 crc kubenswrapper[4702]: I1203 12:32:25.910315 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 12:32:25 crc kubenswrapper[4702]: I1203 12:32:25.912111 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42"} pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:32:25 crc kubenswrapper[4702]: I1203 12:32:25.912188 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" containerID="cri-o://27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" gracePeriod=600 Dec 03 12:32:26 crc kubenswrapper[4702]: E1203 12:32:26.061594 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:32:26 crc kubenswrapper[4702]: I1203 12:32:26.802998 4702 generic.go:334] "Generic (PLEG): 
container finished" podID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" exitCode=0 Dec 03 12:32:26 crc kubenswrapper[4702]: I1203 12:32:26.803060 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerDied","Data":"27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42"} Dec 03 12:32:26 crc kubenswrapper[4702]: I1203 12:32:26.803181 4702 scope.go:117] "RemoveContainer" containerID="0de56d455f6a25d82d6775c8599fe924b6ccfb032b2e015fb228ca76142f5944" Dec 03 12:32:26 crc kubenswrapper[4702]: I1203 12:32:26.804273 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:32:26 crc kubenswrapper[4702]: E1203 12:32:26.804869 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:32:38 crc kubenswrapper[4702]: I1203 12:32:38.929919 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:32:38 crc kubenswrapper[4702]: E1203 12:32:38.930719 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:32:50 crc kubenswrapper[4702]: I1203 12:32:50.928362 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:32:50 crc kubenswrapper[4702]: E1203 12:32:50.929259 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:33:02 crc kubenswrapper[4702]: I1203 12:33:02.042154 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:33:02 crc kubenswrapper[4702]: E1203 12:33:02.043684 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:33:13 crc kubenswrapper[4702]: I1203 12:33:13.929048 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:33:13 
crc kubenswrapper[4702]: E1203 12:33:13.930132 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:33:26 crc kubenswrapper[4702]: I1203 12:33:26.937095 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:33:26 crc kubenswrapper[4702]: E1203 12:33:26.938104 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:33:38 crc kubenswrapper[4702]: I1203 12:33:38.928311 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:33:38 crc kubenswrapper[4702]: E1203 12:33:38.929137 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:33:49 crc kubenswrapper[4702]: I1203 12:33:49.982292 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:33:49 crc kubenswrapper[4702]: E1203 12:33:49.992784 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:34:03 crc kubenswrapper[4702]: I1203 12:34:03.928149 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:34:03 crc kubenswrapper[4702]: E1203 12:34:03.929332 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:34:15 crc kubenswrapper[4702]: I1203 12:34:15.928519 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:34:15 crc kubenswrapper[4702]: E1203 12:34:15.929442 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.185988 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 03 12:34:27 crc kubenswrapper[4702]: E1203 12:34:27.186997 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440e3a18-bdd4-4bc3-a084-8c429802d2d5" containerName="registry-server" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.187023 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="440e3a18-bdd4-4bc3-a084-8c429802d2d5" containerName="registry-server" Dec 03 12:34:27 crc kubenswrapper[4702]: E1203 12:34:27.187049 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440e3a18-bdd4-4bc3-a084-8c429802d2d5" containerName="extract-content" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.187055 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="440e3a18-bdd4-4bc3-a084-8c429802d2d5" containerName="extract-content" Dec 03 12:34:27 crc kubenswrapper[4702]: E1203 12:34:27.187072 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="440e3a18-bdd4-4bc3-a084-8c429802d2d5" containerName="extract-utilities" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.187078 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="440e3a18-bdd4-4bc3-a084-8c429802d2d5" containerName="extract-utilities" Dec 03 12:34:27 crc kubenswrapper[4702]: E1203 12:34:27.187087 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75744f67-47a4-49d7-9465-aacdea59960f" containerName="collect-profiles" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.187094 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="75744f67-47a4-49d7-9465-aacdea59960f" containerName="collect-profiles" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.187370 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="75744f67-47a4-49d7-9465-aacdea59960f" containerName="collect-profiles" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.187395 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="440e3a18-bdd4-4bc3-a084-8c429802d2d5" containerName="registry-server" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.188446 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.193524 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-qnhdw" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.194561 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.194859 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.195060 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.204318 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.308455 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a35fd719-e341-49b9-b12f-f39f2402868b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.308539 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a35fd719-e341-49b9-b12f-f39f2402868b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.308583 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a35fd719-e341-49b9-b12f-f39f2402868b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.308615 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a35fd719-e341-49b9-b12f-f39f2402868b-config-data\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.308704 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a35fd719-e341-49b9-b12f-f39f2402868b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.308839 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cflnc\" (UniqueName: \"kubernetes.io/projected/a35fd719-e341-49b9-b12f-f39f2402868b-kube-api-access-cflnc\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.308973 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/a35fd719-e341-49b9-b12f-f39f2402868b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.309109 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.309245 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a35fd719-e341-49b9-b12f-f39f2402868b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.412029 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a35fd719-e341-49b9-b12f-f39f2402868b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.412097 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.412568 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a35fd719-e341-49b9-b12f-f39f2402868b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.413818 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a35fd719-e341-49b9-b12f-f39f2402868b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.413949 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a35fd719-e341-49b9-b12f-f39f2402868b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.413992 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a35fd719-e341-49b9-b12f-f39f2402868b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.414021 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a35fd719-e341-49b9-b12f-f39f2402868b-config-data\") pod \"tempest-tests-tempest\" (UID: 
\"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.414211 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a35fd719-e341-49b9-b12f-f39f2402868b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.414357 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a35fd719-e341-49b9-b12f-f39f2402868b-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.414569 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a35fd719-e341-49b9-b12f-f39f2402868b-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.415130 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.415407 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a35fd719-e341-49b9-b12f-f39f2402868b-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.415590 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a35fd719-e341-49b9-b12f-f39f2402868b-config-data\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.415873 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cflnc\" (UniqueName: \"kubernetes.io/projected/a35fd719-e341-49b9-b12f-f39f2402868b-kube-api-access-cflnc\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.420531 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a35fd719-e341-49b9-b12f-f39f2402868b-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.421628 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a35fd719-e341-49b9-b12f-f39f2402868b-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.422552 
4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a35fd719-e341-49b9-b12f-f39f2402868b-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.436737 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cflnc\" (UniqueName: \"kubernetes.io/projected/a35fd719-e341-49b9-b12f-f39f2402868b-kube-api-access-cflnc\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.518694 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"tempest-tests-tempest\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " pod="openstack/tempest-tests-tempest" Dec 03 12:34:27 crc kubenswrapper[4702]: I1203 12:34:27.816229 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 12:34:28 crc kubenswrapper[4702]: I1203 12:34:28.427091 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 03 12:34:28 crc kubenswrapper[4702]: W1203 12:34:28.433188 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda35fd719_e341_49b9_b12f_f39f2402868b.slice/crio-2e2be505b314356dd3c8c6970b5ad97cd73f3cd9ce4af62d2455a0598d70d78e WatchSource:0}: Error finding container 2e2be505b314356dd3c8c6970b5ad97cd73f3cd9ce4af62d2455a0598d70d78e: Status 404 returned error can't find the container with id 2e2be505b314356dd3c8c6970b5ad97cd73f3cd9ce4af62d2455a0598d70d78e Dec 03 12:34:28 crc kubenswrapper[4702]: I1203 12:34:28.436770 4702 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 12:34:28 crc kubenswrapper[4702]: I1203 12:34:28.550136 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a35fd719-e341-49b9-b12f-f39f2402868b","Type":"ContainerStarted","Data":"2e2be505b314356dd3c8c6970b5ad97cd73f3cd9ce4af62d2455a0598d70d78e"} Dec 03 12:34:29 crc kubenswrapper[4702]: I1203 12:34:29.928954 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:34:29 crc kubenswrapper[4702]: E1203 12:34:29.929791 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:34:43 crc kubenswrapper[4702]: I1203 12:34:43.928447 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:34:43 crc kubenswrapper[4702]: E1203 12:34:43.929220 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:34:51 crc kubenswrapper[4702]: I1203 12:34:51.037585 4702 patch_prober.go:28] interesting pod/oauth-openshift-6548f7c795-c9kwd container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.62:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:34:51 crc kubenswrapper[4702]: I1203 12:34:51.047994 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" podUID="0c75375b-08b0-4a81-adca-de576c8ff268" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.62:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:34:51 crc kubenswrapper[4702]: I1203 12:34:51.064211 4702 patch_prober.go:28] interesting pod/oauth-openshift-6548f7c795-c9kwd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.62:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:34:51 crc kubenswrapper[4702]: I1203 12:34:51.064629 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" podUID="0c75375b-08b0-4a81-adca-de576c8ff268" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.62:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:34:51 crc kubenswrapper[4702]: I1203 12:34:51.250659 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-wh75l" podUID="7643d370-6497-4a94-b0e7-2db66b56b687" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:34:55 crc kubenswrapper[4702]: I1203 12:34:55.081968 4702 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.198749388s: [/var/lib/containers/storage/overlay/f71516b19e70ff8792213815d9789143e2a2fa8eeaf44af6b4884a121f8a4690/diff /var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-w2vmt_182ca1cb-9499-4cf7-aeae-c35c7038814c/kube-rbac-proxy/0.log]; will not log again for this container unless duration exceeds 2s Dec 03 12:34:55 crc kubenswrapper[4702]: I1203 12:34:55.082234 4702 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.397638985s: [/var/lib/containers/storage/overlay/d3eb8d3d2c000dc1af44aa3af0d6a6421d2cc471afe99c170b9d317ab611d1b9/diff /var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-ckjgv_c43c86a0-692f-406f-871a-24a14f24ed77/operator/0.log]; will not log again for this container unless duration exceeds 2s Dec 03 12:34:55 crc kubenswrapper[4702]: I1203 12:34:55.082495 4702 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.39784813s: [/var/lib/containers/storage/overlay/d43226810ea7b5aa542e627f8e385c116f3e2cc6c2115ebd44a3060444a78460/diff 
/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-gqqgw_1a7e4f08-8a48-44d5-944b-4eaf9d9518b5/manager/0.log]; will not log again for this container unless duration exceeds 2s Dec 03 12:34:55 crc kubenswrapper[4702]: I1203 12:34:55.082833 4702 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.399420935s: [/var/lib/containers/storage/overlay/3d71f8c19938347c3df1105928891f2c3522b6266677fc21551f346662468c61/diff /var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-psnhp_b6faaca6-f017-42ac-95e4-d73ae3e8e519/manager/0.log]; will not log again for this container unless duration exceeds 2s Dec 03 12:34:55 crc kubenswrapper[4702]: I1203 12:34:55.082867 4702 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.40063818s: [/var/lib/containers/storage/overlay/f40c97a6e619ddbfbf58fc5ea706030e0c30ebbe2ce8e18d922dbe3d4c965572/diff /var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-7xg4t_523c06cc-9816-4252-ac00-dc7928dae009/manager/0.log]; will not log again for this container unless duration exceeds 2s Dec 03 12:34:55 crc kubenswrapper[4702]: I1203 12:34:55.082897 4702 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.399437815s: [/var/lib/containers/storage/overlay/89635eb3fa917a29f0b74b5713ecc8a2733a14abc913837d388e181e98b854d1/diff /var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-2pcqv_5cecb29f-7ef9-4177-8e01-a776b70bbb03/manager/0.log]; will not log again for this container unless duration exceeds 2s Dec 03 12:34:55 crc kubenswrapper[4702]: I1203 12:34:55.083085 4702 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.400782754s: [/var/lib/containers/storage/overlay/282be4dc69f4a17351a6078359ec5ed0317a83ccd9bc722a93e23c0a2558e51e/diff /var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-kg6p7_4b90477f-d1b5-4f03-ab08-2476d44a9cff/manager/0.log]; will not log again for this container unless duration exceeds 2s Dec 03 12:34:55 crc kubenswrapper[4702]: I1203 12:34:55.928171 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:34:55 crc kubenswrapper[4702]: E1203 12:34:55.928650 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:35:07 crc kubenswrapper[4702]: I1203 12:35:07.928530 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:35:07 crc kubenswrapper[4702]: E1203 12:35:07.929655 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:35:19 crc kubenswrapper[4702]: I1203 12:35:19.928872 4702 scope.go:117] "RemoveContainer" 
containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:35:19 crc kubenswrapper[4702]: E1203 12:35:19.929793 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:35:21 crc kubenswrapper[4702]: E1203 12:35:21.641744 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 03 12:35:21 crc kubenswrapper[4702]: E1203 12:35:21.643041 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cflnc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReferenc
e:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(a35fd719-e341-49b9-b12f-f39f2402868b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 12:35:21 crc kubenswrapper[4702]: E1203 12:35:21.644197 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="a35fd719-e341-49b9-b12f-f39f2402868b" Dec 03 12:35:21 crc kubenswrapper[4702]: E1203 12:35:21.807292 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="a35fd719-e341-49b9-b12f-f39f2402868b" Dec 03 12:35:32 crc kubenswrapper[4702]: I1203 12:35:32.996090 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:35:32 crc kubenswrapper[4702]: E1203 12:35:32.996933 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:35:36 crc kubenswrapper[4702]: I1203 12:35:36.704300 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 03 12:35:40 crc kubenswrapper[4702]: I1203 12:35:40.130403 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a35fd719-e341-49b9-b12f-f39f2402868b","Type":"ContainerStarted","Data":"771fa941e4419f217c65a63b8471ce0a7afc720670abfe4560184430bbd27a7f"} Dec 03 12:35:40 crc kubenswrapper[4702]: I1203 12:35:40.174520 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=5.91100806 podStartE2EDuration="1m14.174478373s" podCreationTimestamp="2025-12-03 12:34:26 +0000 UTC" firstStartedPulling="2025-12-03 12:34:28.436323185 +0000 UTC m=+5452.272251649" lastFinishedPulling="2025-12-03 12:35:36.699793498 +0000 UTC m=+5520.535721962" observedRunningTime="2025-12-03 12:35:40.17049878 +0000 UTC m=+5524.006427244" watchObservedRunningTime="2025-12-03 12:35:40.174478373 +0000 UTC m=+5524.010406837" Dec 03 12:35:43 crc kubenswrapper[4702]: I1203 12:35:43.928597 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:35:43 crc kubenswrapper[4702]: E1203 12:35:43.929352 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:35:50 crc kubenswrapper[4702]: I1203 12:35:50.102423 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6nxzq"] Dec 03 12:35:50 crc kubenswrapper[4702]: I1203 12:35:50.106054 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6nxzq" Dec 03 12:35:50 crc kubenswrapper[4702]: I1203 12:35:50.127923 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6nxzq"] Dec 03 12:35:50 crc kubenswrapper[4702]: I1203 12:35:50.186791 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00409c07-d625-4022-a89a-b0a0a4206781-utilities\") pod \"community-operators-6nxzq\" (UID: \"00409c07-d625-4022-a89a-b0a0a4206781\") " pod="openshift-marketplace/community-operators-6nxzq" Dec 03 12:35:50 crc kubenswrapper[4702]: I1203 12:35:50.187177 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00409c07-d625-4022-a89a-b0a0a4206781-catalog-content\") pod \"community-operators-6nxzq\" (UID: \"00409c07-d625-4022-a89a-b0a0a4206781\") " pod="openshift-marketplace/community-operators-6nxzq" Dec 03 12:35:50 crc kubenswrapper[4702]: I1203 12:35:50.187250 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j995\" (UniqueName: \"kubernetes.io/projected/00409c07-d625-4022-a89a-b0a0a4206781-kube-api-access-6j995\") pod \"community-operators-6nxzq\" (UID: \"00409c07-d625-4022-a89a-b0a0a4206781\") " pod="openshift-marketplace/community-operators-6nxzq" Dec 03 12:35:50 crc kubenswrapper[4702]: I1203 12:35:50.289406 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00409c07-d625-4022-a89a-b0a0a4206781-catalog-content\") pod \"community-operators-6nxzq\" (UID: \"00409c07-d625-4022-a89a-b0a0a4206781\") " pod="openshift-marketplace/community-operators-6nxzq" Dec 03 12:35:50 crc kubenswrapper[4702]: I1203 12:35:50.289481 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j995\" (UniqueName: \"kubernetes.io/projected/00409c07-d625-4022-a89a-b0a0a4206781-kube-api-access-6j995\") pod \"community-operators-6nxzq\" (UID: \"00409c07-d625-4022-a89a-b0a0a4206781\") " pod="openshift-marketplace/community-operators-6nxzq" Dec 03 12:35:50 crc kubenswrapper[4702]: I1203 12:35:50.289584 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00409c07-d625-4022-a89a-b0a0a4206781-utilities\") pod \"community-operators-6nxzq\" (UID: \"00409c07-d625-4022-a89a-b0a0a4206781\") " pod="openshift-marketplace/community-operators-6nxzq" Dec 03 12:35:50 crc kubenswrapper[4702]: I1203 12:35:50.290098 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00409c07-d625-4022-a89a-b0a0a4206781-catalog-content\") pod \"community-operators-6nxzq\" (UID: 
\"00409c07-d625-4022-a89a-b0a0a4206781\") " pod="openshift-marketplace/community-operators-6nxzq" Dec 03 12:35:50 crc kubenswrapper[4702]: I1203 12:35:50.290136 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00409c07-d625-4022-a89a-b0a0a4206781-utilities\") pod \"community-operators-6nxzq\" (UID: \"00409c07-d625-4022-a89a-b0a0a4206781\") " pod="openshift-marketplace/community-operators-6nxzq" Dec 03 12:35:50 crc kubenswrapper[4702]: I1203 12:35:50.316989 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j995\" (UniqueName: \"kubernetes.io/projected/00409c07-d625-4022-a89a-b0a0a4206781-kube-api-access-6j995\") pod \"community-operators-6nxzq\" (UID: \"00409c07-d625-4022-a89a-b0a0a4206781\") " pod="openshift-marketplace/community-operators-6nxzq" Dec 03 12:35:50 crc kubenswrapper[4702]: I1203 12:35:50.437518 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6nxzq" Dec 03 12:35:52 crc kubenswrapper[4702]: I1203 12:35:52.456843 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6nxzq"] Dec 03 12:35:53 crc kubenswrapper[4702]: I1203 12:35:53.299678 4702 generic.go:334] "Generic (PLEG): container finished" podID="00409c07-d625-4022-a89a-b0a0a4206781" containerID="ded1e77dabac40fbe3cd962ca505f7cc5e6f1b79ac6df89c86c7d5ecfa117fef" exitCode=0 Dec 03 12:35:53 crc kubenswrapper[4702]: I1203 12:35:53.299884 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6nxzq" event={"ID":"00409c07-d625-4022-a89a-b0a0a4206781","Type":"ContainerDied","Data":"ded1e77dabac40fbe3cd962ca505f7cc5e6f1b79ac6df89c86c7d5ecfa117fef"} Dec 03 12:35:53 crc kubenswrapper[4702]: I1203 12:35:53.300006 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6nxzq" event={"ID":"00409c07-d625-4022-a89a-b0a0a4206781","Type":"ContainerStarted","Data":"834ba8128ba9efa718cacd52f31930f7205fdb281a0d798d5c509bac89bb4e5d"} Dec 03 12:35:55 crc kubenswrapper[4702]: I1203 12:35:55.325947 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6nxzq" event={"ID":"00409c07-d625-4022-a89a-b0a0a4206781","Type":"ContainerStarted","Data":"48e9f0e122d46b7bbc43c415a354f0a52d033c7819b7238af1252af1223b595a"} Dec 03 12:35:55 crc kubenswrapper[4702]: I1203 12:35:55.928033 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:35:55 crc kubenswrapper[4702]: E1203 12:35:55.928585 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:35:56 crc kubenswrapper[4702]: I1203 12:35:56.353714 4702 generic.go:334] "Generic (PLEG): container finished" podID="00409c07-d625-4022-a89a-b0a0a4206781" containerID="48e9f0e122d46b7bbc43c415a354f0a52d033c7819b7238af1252af1223b595a" exitCode=0 Dec 03 12:35:56 crc kubenswrapper[4702]: I1203 12:35:56.353814 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-6nxzq" event={"ID":"00409c07-d625-4022-a89a-b0a0a4206781","Type":"ContainerDied","Data":"48e9f0e122d46b7bbc43c415a354f0a52d033c7819b7238af1252af1223b595a"} Dec 03 12:35:58 crc kubenswrapper[4702]: I1203 12:35:58.377640 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6nxzq" event={"ID":"00409c07-d625-4022-a89a-b0a0a4206781","Type":"ContainerStarted","Data":"d878b48b2dd127b272463fbced4b932953dcc102db84ba31027ca13bd7dc29b7"} Dec 03 12:35:58 crc kubenswrapper[4702]: I1203 12:35:58.399717 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6nxzq" podStartSLOduration=4.570101955 podStartE2EDuration="8.39969828s" podCreationTimestamp="2025-12-03 12:35:50 +0000 UTC" firstStartedPulling="2025-12-03 12:35:53.302667156 +0000 UTC m=+5537.138595630" lastFinishedPulling="2025-12-03 12:35:57.132263501 +0000 UTC m=+5540.968191955" observedRunningTime="2025-12-03 12:35:58.397413234 +0000 UTC m=+5542.233341698" watchObservedRunningTime="2025-12-03 12:35:58.39969828 +0000 UTC m=+5542.235626744" Dec 03 12:36:00 crc kubenswrapper[4702]: I1203 12:36:00.437973 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6nxzq" Dec 03 12:36:00 crc kubenswrapper[4702]: I1203 12:36:00.438385 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6nxzq" Dec 03 12:36:01 crc kubenswrapper[4702]: I1203 12:36:01.585418 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-6nxzq" podUID="00409c07-d625-4022-a89a-b0a0a4206781" containerName="registry-server" probeResult="failure" output=< Dec 03 12:36:01 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:36:01 crc kubenswrapper[4702]: > Dec 03 12:36:09 crc kubenswrapper[4702]: I1203 12:36:09.928453 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:36:09 crc kubenswrapper[4702]: E1203 12:36:09.929307 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:36:10 crc kubenswrapper[4702]: I1203 12:36:10.710853 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6nxzq" Dec 03 12:36:11 crc kubenswrapper[4702]: I1203 12:36:11.428798 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6nxzq" Dec 03 12:36:14 crc kubenswrapper[4702]: I1203 12:36:14.218051 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6nxzq"] Dec 03 12:36:14 crc kubenswrapper[4702]: I1203 12:36:14.219055 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6nxzq" podUID="00409c07-d625-4022-a89a-b0a0a4206781" containerName="registry-server" containerID="cri-o://d878b48b2dd127b272463fbced4b932953dcc102db84ba31027ca13bd7dc29b7" 
gracePeriod=2 Dec 03 12:36:14 crc kubenswrapper[4702]: I1203 12:36:14.857705 4702 generic.go:334] "Generic (PLEG): container finished" podID="00409c07-d625-4022-a89a-b0a0a4206781" containerID="d878b48b2dd127b272463fbced4b932953dcc102db84ba31027ca13bd7dc29b7" exitCode=0 Dec 03 12:36:14 crc kubenswrapper[4702]: I1203 12:36:14.858144 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6nxzq" event={"ID":"00409c07-d625-4022-a89a-b0a0a4206781","Type":"ContainerDied","Data":"d878b48b2dd127b272463fbced4b932953dcc102db84ba31027ca13bd7dc29b7"} Dec 03 12:36:15 crc kubenswrapper[4702]: I1203 12:36:15.469845 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6nxzq" Dec 03 12:36:15 crc kubenswrapper[4702]: I1203 12:36:15.657204 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j995\" (UniqueName: \"kubernetes.io/projected/00409c07-d625-4022-a89a-b0a0a4206781-kube-api-access-6j995\") pod \"00409c07-d625-4022-a89a-b0a0a4206781\" (UID: \"00409c07-d625-4022-a89a-b0a0a4206781\") " Dec 03 12:36:15 crc kubenswrapper[4702]: I1203 12:36:15.657372 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00409c07-d625-4022-a89a-b0a0a4206781-utilities\") pod \"00409c07-d625-4022-a89a-b0a0a4206781\" (UID: \"00409c07-d625-4022-a89a-b0a0a4206781\") " Dec 03 12:36:15 crc kubenswrapper[4702]: I1203 12:36:15.657420 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00409c07-d625-4022-a89a-b0a0a4206781-catalog-content\") pod \"00409c07-d625-4022-a89a-b0a0a4206781\" (UID: \"00409c07-d625-4022-a89a-b0a0a4206781\") " Dec 03 12:36:15 crc kubenswrapper[4702]: I1203 12:36:15.658106 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00409c07-d625-4022-a89a-b0a0a4206781-utilities" (OuterVolumeSpecName: "utilities") pod "00409c07-d625-4022-a89a-b0a0a4206781" (UID: "00409c07-d625-4022-a89a-b0a0a4206781"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:36:15 crc kubenswrapper[4702]: I1203 12:36:15.658505 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00409c07-d625-4022-a89a-b0a0a4206781-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:36:15 crc kubenswrapper[4702]: I1203 12:36:15.664467 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00409c07-d625-4022-a89a-b0a0a4206781-kube-api-access-6j995" (OuterVolumeSpecName: "kube-api-access-6j995") pod "00409c07-d625-4022-a89a-b0a0a4206781" (UID: "00409c07-d625-4022-a89a-b0a0a4206781"). InnerVolumeSpecName "kube-api-access-6j995". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:36:15 crc kubenswrapper[4702]: I1203 12:36:15.711181 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00409c07-d625-4022-a89a-b0a0a4206781-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "00409c07-d625-4022-a89a-b0a0a4206781" (UID: "00409c07-d625-4022-a89a-b0a0a4206781"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:36:15 crc kubenswrapper[4702]: I1203 12:36:15.761720 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j995\" (UniqueName: \"kubernetes.io/projected/00409c07-d625-4022-a89a-b0a0a4206781-kube-api-access-6j995\") on node \"crc\" DevicePath \"\"" Dec 03 12:36:15 crc kubenswrapper[4702]: I1203 12:36:15.761789 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00409c07-d625-4022-a89a-b0a0a4206781-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:36:15 crc kubenswrapper[4702]: I1203 12:36:15.873391 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6nxzq" event={"ID":"00409c07-d625-4022-a89a-b0a0a4206781","Type":"ContainerDied","Data":"834ba8128ba9efa718cacd52f31930f7205fdb281a0d798d5c509bac89bb4e5d"} Dec 03 12:36:15 crc kubenswrapper[4702]: I1203 12:36:15.873450 4702 scope.go:117] "RemoveContainer" containerID="d878b48b2dd127b272463fbced4b932953dcc102db84ba31027ca13bd7dc29b7" Dec 03 12:36:15 crc kubenswrapper[4702]: I1203 12:36:15.873509 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6nxzq" Dec 03 12:36:15 crc kubenswrapper[4702]: I1203 12:36:15.910939 4702 scope.go:117] "RemoveContainer" containerID="48e9f0e122d46b7bbc43c415a354f0a52d033c7819b7238af1252af1223b595a" Dec 03 12:36:15 crc kubenswrapper[4702]: I1203 12:36:15.923513 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6nxzq"] Dec 03 12:36:15 crc kubenswrapper[4702]: I1203 12:36:15.943259 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6nxzq"] Dec 03 12:36:15 crc kubenswrapper[4702]: I1203 12:36:15.958404 4702 scope.go:117] "RemoveContainer" containerID="ded1e77dabac40fbe3cd962ca505f7cc5e6f1b79ac6df89c86c7d5ecfa117fef" Dec 03 12:36:16 crc kubenswrapper[4702]: I1203 12:36:16.946132 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00409c07-d625-4022-a89a-b0a0a4206781" path="/var/lib/kubelet/pods/00409c07-d625-4022-a89a-b0a0a4206781/volumes" Dec 03 12:36:24 crc kubenswrapper[4702]: I1203 12:36:24.929213 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:36:24 crc kubenswrapper[4702]: E1203 12:36:24.930238 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:36:35 crc kubenswrapper[4702]: I1203 12:36:35.928395 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:36:35 crc kubenswrapper[4702]: E1203 12:36:35.931479 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:36:50 crc kubenswrapper[4702]: I1203 12:36:50.928713 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:36:50 crc kubenswrapper[4702]: E1203 12:36:50.931881 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:36:53 crc kubenswrapper[4702]: I1203 12:36:53.160324 4702 trace.go:236] Trace[1110829468]: "Calculate volume metrics of manager-config for pod openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" (03-Dec-2025 12:36:52.068) (total time: 1085ms): Dec 03 12:36:53 crc kubenswrapper[4702]: Trace[1110829468]: [1.085694752s] [1.085694752s] END Dec 03 12:36:53 crc kubenswrapper[4702]: I1203 12:36:53.206213 4702 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.290101395s: [/var/lib/containers/storage/overlay/29722bba1eaf1211422a18276b5bb16af0770d15061df812c378a977a8faacc9/diff /var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9_1d86df9d-86a7-4980-abd0-488d98f6b2fb/kube-rbac-proxy/0.log]; will not log again for this container unless duration exceeds 2s Dec 03 12:37:04 crc kubenswrapper[4702]: I1203 12:37:04.928160 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:37:04 crc kubenswrapper[4702]: E1203 12:37:04.929386 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:37:18 crc kubenswrapper[4702]: I1203 12:37:18.971746 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:37:18 crc kubenswrapper[4702]: E1203 12:37:18.973005 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:37:30 crc kubenswrapper[4702]: I1203 12:37:30.932028 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:37:32 crc kubenswrapper[4702]: I1203 12:37:32.294317 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"030d6a62e307050ecc85543445f98f0ffef93a52b32474be21649714e5532e7a"} Dec 03 12:38:03 crc kubenswrapper[4702]: I1203 
12:38:03.237134 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-bv8pf" podUID="4be204bf-b480-4d77-9ced-34c6668afa14" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:38:03 crc kubenswrapper[4702]: I1203 12:38:03.237137 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-bv8pf" podUID="4be204bf-b480-4d77-9ced-34c6668afa14" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:38:41 crc kubenswrapper[4702]: I1203 12:38:41.193576 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sr26m"] Dec 03 12:38:41 crc kubenswrapper[4702]: E1203 12:38:41.196922 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00409c07-d625-4022-a89a-b0a0a4206781" containerName="extract-utilities" Dec 03 12:38:41 crc kubenswrapper[4702]: I1203 12:38:41.196962 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="00409c07-d625-4022-a89a-b0a0a4206781" containerName="extract-utilities" Dec 03 12:38:41 crc kubenswrapper[4702]: E1203 12:38:41.196990 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00409c07-d625-4022-a89a-b0a0a4206781" containerName="extract-content" Dec 03 12:38:41 crc kubenswrapper[4702]: I1203 12:38:41.196998 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="00409c07-d625-4022-a89a-b0a0a4206781" containerName="extract-content" Dec 03 12:38:41 crc kubenswrapper[4702]: E1203 12:38:41.197007 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00409c07-d625-4022-a89a-b0a0a4206781" containerName="registry-server" Dec 03 12:38:41 crc kubenswrapper[4702]: I1203 12:38:41.197016 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="00409c07-d625-4022-a89a-b0a0a4206781" containerName="registry-server" Dec 03 12:38:41 crc kubenswrapper[4702]: I1203 12:38:41.198886 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="00409c07-d625-4022-a89a-b0a0a4206781" containerName="registry-server" Dec 03 12:38:41 crc kubenswrapper[4702]: I1203 12:38:41.205642 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sr26m" Dec 03 12:38:41 crc kubenswrapper[4702]: I1203 12:38:41.334912 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x647v\" (UniqueName: \"kubernetes.io/projected/d370ba60-2ec7-4904-8ada-85984ae3582b-kube-api-access-x647v\") pod \"certified-operators-sr26m\" (UID: \"d370ba60-2ec7-4904-8ada-85984ae3582b\") " pod="openshift-marketplace/certified-operators-sr26m" Dec 03 12:38:41 crc kubenswrapper[4702]: I1203 12:38:41.335346 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d370ba60-2ec7-4904-8ada-85984ae3582b-utilities\") pod \"certified-operators-sr26m\" (UID: \"d370ba60-2ec7-4904-8ada-85984ae3582b\") " pod="openshift-marketplace/certified-operators-sr26m" Dec 03 12:38:41 crc kubenswrapper[4702]: I1203 12:38:41.335430 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d370ba60-2ec7-4904-8ada-85984ae3582b-catalog-content\") pod \"certified-operators-sr26m\" (UID: \"d370ba60-2ec7-4904-8ada-85984ae3582b\") " pod="openshift-marketplace/certified-operators-sr26m" Dec 03 12:38:41 crc kubenswrapper[4702]: I1203 12:38:41.439730 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x647v\" (UniqueName: \"kubernetes.io/projected/d370ba60-2ec7-4904-8ada-85984ae3582b-kube-api-access-x647v\") pod \"certified-operators-sr26m\" (UID: \"d370ba60-2ec7-4904-8ada-85984ae3582b\") " pod="openshift-marketplace/certified-operators-sr26m" Dec 03 12:38:41 crc kubenswrapper[4702]: I1203 12:38:41.440406 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d370ba60-2ec7-4904-8ada-85984ae3582b-utilities\") pod \"certified-operators-sr26m\" (UID: \"d370ba60-2ec7-4904-8ada-85984ae3582b\") " pod="openshift-marketplace/certified-operators-sr26m" Dec 03 12:38:41 crc kubenswrapper[4702]: I1203 12:38:41.440497 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d370ba60-2ec7-4904-8ada-85984ae3582b-catalog-content\") pod \"certified-operators-sr26m\" (UID: \"d370ba60-2ec7-4904-8ada-85984ae3582b\") " pod="openshift-marketplace/certified-operators-sr26m" Dec 03 12:38:41 crc kubenswrapper[4702]: I1203 12:38:41.446025 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d370ba60-2ec7-4904-8ada-85984ae3582b-utilities\") pod \"certified-operators-sr26m\" (UID: \"d370ba60-2ec7-4904-8ada-85984ae3582b\") " pod="openshift-marketplace/certified-operators-sr26m" Dec 03 12:38:41 crc kubenswrapper[4702]: I1203 12:38:41.446369 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d370ba60-2ec7-4904-8ada-85984ae3582b-catalog-content\") pod \"certified-operators-sr26m\" (UID: \"d370ba60-2ec7-4904-8ada-85984ae3582b\") " pod="openshift-marketplace/certified-operators-sr26m" Dec 03 12:38:41 crc kubenswrapper[4702]: I1203 12:38:41.490579 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x647v\" (UniqueName: \"kubernetes.io/projected/d370ba60-2ec7-4904-8ada-85984ae3582b-kube-api-access-x647v\") pod 
\"certified-operators-sr26m\" (UID: \"d370ba60-2ec7-4904-8ada-85984ae3582b\") " pod="openshift-marketplace/certified-operators-sr26m" Dec 03 12:38:41 crc kubenswrapper[4702]: I1203 12:38:41.538597 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sr26m"] Dec 03 12:38:41 crc kubenswrapper[4702]: I1203 12:38:41.541467 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sr26m" Dec 03 12:38:43 crc kubenswrapper[4702]: I1203 12:38:43.222965 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sr26m"] Dec 03 12:38:43 crc kubenswrapper[4702]: I1203 12:38:43.953820 4702 generic.go:334] "Generic (PLEG): container finished" podID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerID="450f674aa645e7c8bf68761b64dc940887a71bbde7f961b75ee11f18a0159eca" exitCode=0 Dec 03 12:38:43 crc kubenswrapper[4702]: I1203 12:38:43.954032 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sr26m" event={"ID":"d370ba60-2ec7-4904-8ada-85984ae3582b","Type":"ContainerDied","Data":"450f674aa645e7c8bf68761b64dc940887a71bbde7f961b75ee11f18a0159eca"} Dec 03 12:38:43 crc kubenswrapper[4702]: I1203 12:38:43.954467 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sr26m" event={"ID":"d370ba60-2ec7-4904-8ada-85984ae3582b","Type":"ContainerStarted","Data":"9247a3e31e472287bbd102012315b1071a27157bee2e9e3e0467f29f345f6e0c"} Dec 03 12:38:48 crc kubenswrapper[4702]: I1203 12:38:48.006530 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sr26m" event={"ID":"d370ba60-2ec7-4904-8ada-85984ae3582b","Type":"ContainerStarted","Data":"4b177b1d9aaf1d336254d501c6bf1c5a5fd509009f468e5a20f3511138cf6bcc"} Dec 03 12:38:50 crc kubenswrapper[4702]: I1203 12:38:50.227663 4702 patch_prober.go:28] interesting pod/perses-operator-5446b9c989-vpr8z container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.14:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:38:50 crc kubenswrapper[4702]: I1203 12:38:50.231131 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" podUID="ec0726c3-58ef-4a22-8e00-bae32d7d66ca" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.14:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:38:50 crc kubenswrapper[4702]: I1203 12:38:50.227658 4702 patch_prober.go:28] interesting pod/perses-operator-5446b9c989-vpr8z container/perses-operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.14:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:38:50 crc kubenswrapper[4702]: I1203 12:38:50.231293 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" podUID="ec0726c3-58ef-4a22-8e00-bae32d7d66ca" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.14:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:38:50 crc kubenswrapper[4702]: I1203 12:38:50.371145 4702 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openstack-operators/openstack-operator-index-ds4ss" podUID="9432a2a8-8932-4734-a69d-8976764f1dab" containerName="registry-server" probeResult="failure" output=< Dec 03 12:38:50 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:38:50 crc kubenswrapper[4702]: > Dec 03 12:38:50 crc kubenswrapper[4702]: I1203 12:38:50.374855 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-ds4ss" podUID="9432a2a8-8932-4734-a69d-8976764f1dab" containerName="registry-server" probeResult="failure" output=< Dec 03 12:38:50 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:38:50 crc kubenswrapper[4702]: > Dec 03 12:38:50 crc kubenswrapper[4702]: I1203 12:38:50.971332 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-wh75l" podUID="7643d370-6497-4a94-b0e7-2db66b56b687" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:38:52 crc kubenswrapper[4702]: I1203 12:38:52.526977 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="92266ac3-f0a6-4e68-9e88-9aa2900e1fe3" containerName="galera" probeResult="failure" output="command timed out" Dec 03 12:38:52 crc kubenswrapper[4702]: I1203 12:38:52.526982 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="92266ac3-f0a6-4e68-9e88-9aa2900e1fe3" containerName="galera" probeResult="failure" output="command timed out" Dec 03 12:38:53 crc kubenswrapper[4702]: I1203 12:38:53.154250 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sr26m" event={"ID":"d370ba60-2ec7-4904-8ada-85984ae3582b","Type":"ContainerDied","Data":"4b177b1d9aaf1d336254d501c6bf1c5a5fd509009f468e5a20f3511138cf6bcc"} Dec 03 12:38:53 crc kubenswrapper[4702]: I1203 12:38:53.154571 4702 generic.go:334] "Generic (PLEG): container finished" podID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerID="4b177b1d9aaf1d336254d501c6bf1c5a5fd509009f468e5a20f3511138cf6bcc" exitCode=0 Dec 03 12:38:55 crc kubenswrapper[4702]: I1203 12:38:55.188698 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sr26m" event={"ID":"d370ba60-2ec7-4904-8ada-85984ae3582b","Type":"ContainerStarted","Data":"c44f24cacdf478d774a767962cbaccdf9f7ecb6b089fa137a7c0fe927ed21fdc"} Dec 03 12:38:55 crc kubenswrapper[4702]: I1203 12:38:55.229547 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sr26m" podStartSLOduration=5.057384033 podStartE2EDuration="15.228862312s" podCreationTimestamp="2025-12-03 12:38:40 +0000 UTC" firstStartedPulling="2025-12-03 12:38:43.95937015 +0000 UTC m=+5707.795298614" lastFinishedPulling="2025-12-03 12:38:54.130848429 +0000 UTC m=+5717.966776893" observedRunningTime="2025-12-03 12:38:55.212951998 +0000 UTC m=+5719.048880462" watchObservedRunningTime="2025-12-03 12:38:55.228862312 +0000 UTC m=+5719.064790776" Dec 03 12:39:00 crc kubenswrapper[4702]: I1203 12:39:00.896315 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-wh75l" podUID="7643d370-6497-4a94-b0e7-2db66b56b687" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:01 crc 
kubenswrapper[4702]: I1203 12:39:01.619674 4702 patch_prober.go:28] interesting pod/logging-loki-distributor-76cc67bf56-xrnp2 container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:01 crc kubenswrapper[4702]: I1203 12:39:01.620363 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" podUID="508c1eef-dbbc-4c32-8d2e-dbb797c72461" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:01 crc kubenswrapper[4702]: I1203 12:39:01.624861 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sr26m" Dec 03 12:39:01 crc kubenswrapper[4702]: I1203 12:39:01.624920 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sr26m" Dec 03 12:39:01 crc kubenswrapper[4702]: I1203 12:39:01.626653 4702 patch_prober.go:28] interesting pod/logging-loki-querier-5895d59bb8-bhqrp container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:01 crc kubenswrapper[4702]: I1203 12:39:01.626709 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" podUID="042cc406-7960-493a-a19a-cb5590f8ff1f" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:01 crc kubenswrapper[4702]: I1203 12:39:01.792551 4702 patch_prober.go:28] interesting pod/logging-loki-query-frontend-84558f7c9f-dflgw container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:01 crc kubenswrapper[4702]: I1203 12:39:01.792643 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" podUID="dbea18cb-2e45-4d86-bc00-17a82f0a78ff" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:01 crc kubenswrapper[4702]: I1203 12:39:01.997369 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-68ktn container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:01 crc kubenswrapper[4702]: I1203 12:39:01.997465 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" podUID="6cbbba51-9166-42cb-917c-7c634351e5c9" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:01 crc 
kubenswrapper[4702]: I1203 12:39:01.998153 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-68ktn container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:01 crc kubenswrapper[4702]: I1203 12:39:01.998208 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" podUID="6cbbba51-9166-42cb-917c-7c634351e5c9" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:02 crc kubenswrapper[4702]: I1203 12:39:02.017164 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-mdlcz container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:02 crc kubenswrapper[4702]: I1203 12:39:02.017227 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" podUID="72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:02 crc kubenswrapper[4702]: I1203 12:39:02.017555 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-mdlcz container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:02 crc kubenswrapper[4702]: I1203 12:39:02.017638 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" podUID="72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:02 crc kubenswrapper[4702]: I1203 12:39:02.682209 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-sr26m" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="registry-server" probeResult="failure" output=< Dec 03 12:39:02 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:39:02 crc kubenswrapper[4702]: > Dec 03 12:39:03 crc kubenswrapper[4702]: I1203 12:39:03.528986 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="c91e1dc8-ef80-407f-ac34-4c9ab29026f7" containerName="galera" probeResult="failure" output="command timed out" Dec 03 12:39:03 crc kubenswrapper[4702]: I1203 12:39:03.529081 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="92266ac3-f0a6-4e68-9e88-9aa2900e1fe3" containerName="galera" probeResult="failure" output="command timed out" Dec 03 12:39:03 crc kubenswrapper[4702]: I1203 12:39:03.529409 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="c91e1dc8-ef80-407f-ac34-4c9ab29026f7" 
containerName="galera" probeResult="failure" output="command timed out" Dec 03 12:39:03 crc kubenswrapper[4702]: I1203 12:39:03.529120 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="92266ac3-f0a6-4e68-9e88-9aa2900e1fe3" containerName="galera" probeResult="failure" output="command timed out" Dec 03 12:39:07 crc kubenswrapper[4702]: I1203 12:39:07.781479 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gjmph"] Dec 03 12:39:07 crc kubenswrapper[4702]: I1203 12:39:07.790177 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjmph" Dec 03 12:39:07 crc kubenswrapper[4702]: I1203 12:39:07.795277 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c55aad18-42f1-4b14-a5fb-686c7a669d40-utilities\") pod \"redhat-operators-gjmph\" (UID: \"c55aad18-42f1-4b14-a5fb-686c7a669d40\") " pod="openshift-marketplace/redhat-operators-gjmph" Dec 03 12:39:07 crc kubenswrapper[4702]: I1203 12:39:07.795363 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c55aad18-42f1-4b14-a5fb-686c7a669d40-catalog-content\") pod \"redhat-operators-gjmph\" (UID: \"c55aad18-42f1-4b14-a5fb-686c7a669d40\") " pod="openshift-marketplace/redhat-operators-gjmph" Dec 03 12:39:07 crc kubenswrapper[4702]: I1203 12:39:07.795507 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl8kp\" (UniqueName: \"kubernetes.io/projected/c55aad18-42f1-4b14-a5fb-686c7a669d40-kube-api-access-zl8kp\") pod \"redhat-operators-gjmph\" (UID: \"c55aad18-42f1-4b14-a5fb-686c7a669d40\") " pod="openshift-marketplace/redhat-operators-gjmph" Dec 03 12:39:07 crc kubenswrapper[4702]: I1203 12:39:07.899848 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl8kp\" (UniqueName: \"kubernetes.io/projected/c55aad18-42f1-4b14-a5fb-686c7a669d40-kube-api-access-zl8kp\") pod \"redhat-operators-gjmph\" (UID: \"c55aad18-42f1-4b14-a5fb-686c7a669d40\") " pod="openshift-marketplace/redhat-operators-gjmph" Dec 03 12:39:07 crc kubenswrapper[4702]: I1203 12:39:07.900058 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c55aad18-42f1-4b14-a5fb-686c7a669d40-utilities\") pod \"redhat-operators-gjmph\" (UID: \"c55aad18-42f1-4b14-a5fb-686c7a669d40\") " pod="openshift-marketplace/redhat-operators-gjmph" Dec 03 12:39:07 crc kubenswrapper[4702]: I1203 12:39:07.900101 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c55aad18-42f1-4b14-a5fb-686c7a669d40-catalog-content\") pod \"redhat-operators-gjmph\" (UID: \"c55aad18-42f1-4b14-a5fb-686c7a669d40\") " pod="openshift-marketplace/redhat-operators-gjmph" Dec 03 12:39:07 crc kubenswrapper[4702]: I1203 12:39:07.901207 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c55aad18-42f1-4b14-a5fb-686c7a669d40-catalog-content\") pod \"redhat-operators-gjmph\" (UID: \"c55aad18-42f1-4b14-a5fb-686c7a669d40\") " pod="openshift-marketplace/redhat-operators-gjmph" Dec 03 12:39:07 crc kubenswrapper[4702]: I1203 12:39:07.902391 4702 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c55aad18-42f1-4b14-a5fb-686c7a669d40-utilities\") pod \"redhat-operators-gjmph\" (UID: \"c55aad18-42f1-4b14-a5fb-686c7a669d40\") " pod="openshift-marketplace/redhat-operators-gjmph" Dec 03 12:39:08 crc kubenswrapper[4702]: I1203 12:39:08.057260 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl8kp\" (UniqueName: \"kubernetes.io/projected/c55aad18-42f1-4b14-a5fb-686c7a669d40-kube-api-access-zl8kp\") pod \"redhat-operators-gjmph\" (UID: \"c55aad18-42f1-4b14-a5fb-686c7a669d40\") " pod="openshift-marketplace/redhat-operators-gjmph" Dec 03 12:39:08 crc kubenswrapper[4702]: I1203 12:39:08.150315 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjmph" Dec 03 12:39:09 crc kubenswrapper[4702]: I1203 12:39:09.211219 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg" podUID="8f6320ff-4661-46be-80e1-8d97f09fe789" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.100:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:09 crc kubenswrapper[4702]: I1203 12:39:09.319031 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92" podUID="7d79c3e6-cab8-4e77-84da-85ae2a1eb3b3" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.99:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:09 crc kubenswrapper[4702]: I1203 12:39:09.319166 4702 patch_prober.go:28] interesting pod/route-controller-manager-84cf75c7c5-96cmd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:09 crc kubenswrapper[4702]: I1203 12:39:09.320488 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" podUID="e3e08d4a-20c7-430a-a3d9-988d64e6a6b4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:09 crc kubenswrapper[4702]: I1203 12:39:09.319450 4702 patch_prober.go:28] interesting pod/route-controller-manager-84cf75c7c5-96cmd container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:09 crc kubenswrapper[4702]: I1203 12:39:09.320637 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" podUID="e3e08d4a-20c7-430a-a3d9-988d64e6a6b4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:09 crc 
kubenswrapper[4702]: I1203 12:39:09.402047 4702 patch_prober.go:28] interesting pod/controller-manager-548478b8dd-9254p container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:09 crc kubenswrapper[4702]: I1203 12:39:09.403039 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" podUID="fecafd1a-bd80-46ea-8839-dbfc2d364a96" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:09 crc kubenswrapper[4702]: I1203 12:39:09.402142 4702 patch_prober.go:28] interesting pod/controller-manager-548478b8dd-9254p container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:09 crc kubenswrapper[4702]: I1203 12:39:09.403533 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" podUID="fecafd1a-bd80-46ea-8839-dbfc2d364a96" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:09 crc kubenswrapper[4702]: I1203 12:39:09.549193 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" podUID="2cb93136-1d69-4bc8-9c42-aee1f6638aa6" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.93:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:09 crc kubenswrapper[4702]: I1203 12:39:09.703916 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-hxvr6" podUID="e3c1b694-60b8-4b5d-b8d5-40418e60aa4b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:09 crc kubenswrapper[4702]: I1203 12:39:09.745027 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" podUID="2cb93136-1d69-4bc8-9c42-aee1f6638aa6" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.93:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:09 crc kubenswrapper[4702]: I1203 12:39:09.786957 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" podUID="b877c7a7-0b88-4238-8a21-314ef1525996" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:09 crc kubenswrapper[4702]: I1203 12:39:09.853006 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-5655c58dd6-l8p84" podUID="40ccf765-6eb2-49e3-8f2c-635b1981639e" containerName="cert-manager-webhook" probeResult="failure" output="Get 
\"http://10.217.0.44:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:10 crc kubenswrapper[4702]: I1203 12:39:10.159999 4702 patch_prober.go:28] interesting pod/oauth-openshift-6548f7c795-c9kwd container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.62:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:10 crc kubenswrapper[4702]: I1203 12:39:10.160114 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" podUID="0c75375b-08b0-4a81-adca-de576c8ff268" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.62:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:10 crc kubenswrapper[4702]: I1203 12:39:10.160198 4702 patch_prober.go:28] interesting pod/oauth-openshift-6548f7c795-c9kwd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.62:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:10 crc kubenswrapper[4702]: I1203 12:39:10.160218 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" podUID="0c75375b-08b0-4a81-adca-de576c8ff268" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.62:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:10 crc kubenswrapper[4702]: I1203 12:39:10.262060 4702 patch_prober.go:28] interesting pod/perses-operator-5446b9c989-vpr8z container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.14:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:10 crc kubenswrapper[4702]: I1203 12:39:10.262153 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" podUID="ec0726c3-58ef-4a22-8e00-bae32d7d66ca" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.14:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:10 crc kubenswrapper[4702]: I1203 12:39:10.262297 4702 patch_prober.go:28] interesting pod/perses-operator-5446b9c989-vpr8z container/perses-operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.14:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:10 crc kubenswrapper[4702]: I1203 12:39:10.262317 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" podUID="ec0726c3-58ef-4a22-8e00-bae32d7d66ca" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.14:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:10 crc kubenswrapper[4702]: I1203 12:39:10.527245 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ovn-northd-0" podUID="6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9" containerName="ovn-northd" probeResult="failure" output="command timed out" Dec 03 12:39:10 crc kubenswrapper[4702]: I1203 12:39:10.527249 4702 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9" containerName="ovn-northd" probeResult="failure" output="command timed out" Dec 03 12:39:10 crc kubenswrapper[4702]: I1203 12:39:10.592183 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" podUID="1d60d4ab-7bac-4fd1-9aad-c07ba1513d41" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:10 crc kubenswrapper[4702]: I1203 12:39:10.592327 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-m2bfb" podUID="5e7b4134-2b34-4b36-ad61-8e681df197df" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:10 crc kubenswrapper[4702]: I1203 12:39:10.739017 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv" podUID="5cecb29f-7ef9-4177-8e01-a776b70bbb03" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:10 crc kubenswrapper[4702]: I1203 12:39:10.779959 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf" podUID="84fc908a-9418-4e6e-ac17-9e725524f9ce" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.066019 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-wh75l" podUID="7643d370-6497-4a94-b0e7-2db66b56b687" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.107120 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n" podUID="5edf270b-74cb-42d2-82dc-7953f243c6dc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.107551 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-wh75l" podUID="7643d370-6497-4a94-b0e7-2db66b56b687" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.190995 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-wh75l" podUID="7643d370-6497-4a94-b0e7-2db66b56b687" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.191001 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-f8648f98b-2npsf" 
podUID="d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.95:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.191192 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-slqp5" podUID="b3a5cd30-f098-4e9c-bbb0-f45305893017" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.94:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.191288 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-slqp5" podUID="b3a5cd30-f098-4e9c-bbb0-f45305893017" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.94:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.191625 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-wh75l" Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.193378 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="frr" containerStatusID={"Type":"cri-o","ID":"2e9901bec8566076e75f91059a59d717d36b63bac1c4bbb77850591b24d67bb2"} pod="metallb-system/frr-k8s-wh75l" containerMessage="Container frr failed liveness probe, will be restarted" Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.193719 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-wh75l" podUID="7643d370-6497-4a94-b0e7-2db66b56b687" containerName="frr" containerID="cri-o://2e9901bec8566076e75f91059a59d717d36b63bac1c4bbb77850591b24d67bb2" gracePeriod=2 Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.232613 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn" podUID="afc37ae6-c944-4cb1-81b6-c810ea1c3b31" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.274202 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq" podUID="ae6dac10-29ba-4bb8-8a0c-68a2bad519af" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.274163 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-f8648f98b-2npsf" podUID="d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.95:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.339519 4702 patch_prober.go:28] interesting pod/logging-loki-distributor-76cc67bf56-xrnp2 container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 
12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.339619 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" podUID="508c1eef-dbbc-4c32-8d2e-dbb797c72461" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.525834 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="7ad9fb7a-5481-4bb8-9a9b-99fda2021704" containerName="prometheus" probeResult="failure" output="command timed out" Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.526118 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="7ad9fb7a-5481-4bb8-9a9b-99fda2021704" containerName="prometheus" probeResult="failure" output="command timed out" Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.575447 4702 patch_prober.go:28] interesting pod/logging-loki-querier-5895d59bb8-bhqrp container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.575737 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" podUID="042cc406-7960-493a-a19a-cb5590f8ff1f" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.702425 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="90e9786f-3e0d-4a23-b624-b49a3d386784" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.170:9090/-/healthy\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.702545 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="90e9786f-3e0d-4a23-b624-b49a3d386784" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.170:9090/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.789225 4702 patch_prober.go:28] interesting pod/logging-loki-query-frontend-84558f7c9f-dflgw container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": context deadline exceeded" start-of-body= Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.789307 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" podUID="dbea18cb-2e45-4d86-bc00-17a82f0a78ff" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": context deadline exceeded" Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.998273 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-68ktn container/gateway namespace/openshift-logging: Readiness probe 
status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.998365 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-68ktn container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:11 crc kubenswrapper[4702]: I1203 12:39:11.998385 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" podUID="6cbbba51-9166-42cb-917c-7c634351e5c9" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:12 crc kubenswrapper[4702]: I1203 12:39:11.998441 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" podUID="6cbbba51-9166-42cb-917c-7c634351e5c9" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:12 crc kubenswrapper[4702]: I1203 12:39:12.016816 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-mdlcz container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:12 crc kubenswrapper[4702]: I1203 12:39:12.016875 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" podUID="72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:12 crc kubenswrapper[4702]: I1203 12:39:12.016824 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-mdlcz container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:12 crc kubenswrapper[4702]: I1203 12:39:12.017021 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" podUID="72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:12 crc kubenswrapper[4702]: I1203 12:39:12.526996 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="92266ac3-f0a6-4e68-9e88-9aa2900e1fe3" containerName="galera" probeResult="failure" output="command timed out" Dec 03 12:39:12 crc kubenswrapper[4702]: I1203 12:39:12.527098 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-galera-0" Dec 03 12:39:12 crc kubenswrapper[4702]: I1203 12:39:12.527492 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" 
podUID="92266ac3-f0a6-4e68-9e88-9aa2900e1fe3" containerName="galera" probeResult="failure" output="command timed out" Dec 03 12:39:12 crc kubenswrapper[4702]: I1203 12:39:12.527578 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 03 12:39:12 crc kubenswrapper[4702]: I1203 12:39:12.528500 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out" Dec 03 12:39:12 crc kubenswrapper[4702]: I1203 12:39:12.531175 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Dec 03 12:39:12 crc kubenswrapper[4702]: I1203 12:39:12.850978 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-bv8pf" podUID="4be204bf-b480-4d77-9ced-34c6668afa14" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:12 crc kubenswrapper[4702]: I1203 12:39:12.851014 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-bv8pf" podUID="4be204bf-b480-4d77-9ced-34c6668afa14" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:12 crc kubenswrapper[4702]: I1203 12:39:12.851043 4702 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.75:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:12 crc kubenswrapper[4702]: I1203 12:39:12.851140 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="6ce47478-68cc-46a9-99c3-cb20947e63c5" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.75:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:12 crc kubenswrapper[4702]: I1203 12:39:12.960380 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="galera" containerStatusID={"Type":"cri-o","ID":"c7ed9f1ef289cc5aacad80a6cda5674b5b2d84db5935b0d3f1326bc5cd93e425"} pod="openstack/openstack-galera-0" containerMessage="Container galera failed liveness probe, will be restarted" Dec 03 12:39:13 crc kubenswrapper[4702]: I1203 12:39:13.062793 4702 trace.go:236] Trace[1475146372]: "Calculate volume metrics of operator-scripts for pod openstack/openstack-cell1-galera-0" (03-Dec-2025 12:39:11.941) (total time: 1101ms): Dec 03 12:39:13 crc kubenswrapper[4702]: Trace[1475146372]: [1.101556875s] [1.101556875s] END Dec 03 12:39:13 crc kubenswrapper[4702]: I1203 12:39:13.070809 4702 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.80:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:13 crc kubenswrapper[4702]: I1203 12:39:13.071124 4702 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="358cd791-ccf7-4655-b446-b800598e773c" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.80:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:13 crc kubenswrapper[4702]: I1203 12:39:13.085173 4702 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.81:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:13 crc kubenswrapper[4702]: I1203 12:39:13.085253 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="07e2709a-6aac-4b21-8fc8-bfc21992aae3" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.81:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:13 crc kubenswrapper[4702]: I1203 12:39:13.133907 4702 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-q4rbl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:13 crc kubenswrapper[4702]: I1203 12:39:13.133931 4702 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-q4rbl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:13 crc kubenswrapper[4702]: I1203 12:39:13.133980 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" podUID="44af00fd-b9f6-4e74-ad67-581e4ca7527c" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:13 crc kubenswrapper[4702]: I1203 12:39:13.134002 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" podUID="44af00fd-b9f6-4e74-ad67-581e4ca7527c" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:13 crc kubenswrapper[4702]: I1203 12:39:13.534804 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="c91e1dc8-ef80-407f-ac34-4c9ab29026f7" containerName="galera" probeResult="failure" output="command timed out" Dec 03 12:39:13 crc kubenswrapper[4702]: I1203 12:39:13.534804 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="c91e1dc8-ef80-407f-ac34-4c9ab29026f7" containerName="galera" probeResult="failure" output="command timed out" Dec 03 12:39:14 crc kubenswrapper[4702]: I1203 
12:39:14.028699 4702 patch_prober.go:28] interesting pod/nmstate-webhook-5f6d4c5ccb-r6jd6 container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.87:9443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:14 crc kubenswrapper[4702]: I1203 12:39:14.029485 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6" podUID="9370f81f-7868-4a16-9cec-7786257cdcbd" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.87:9443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:14 crc kubenswrapper[4702]: I1203 12:39:14.110502 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wh75l" event={"ID":"7643d370-6497-4a94-b0e7-2db66b56b687","Type":"ContainerDied","Data":"2e9901bec8566076e75f91059a59d717d36b63bac1c4bbb77850591b24d67bb2"} Dec 03 12:39:14 crc kubenswrapper[4702]: I1203 12:39:14.110960 4702 generic.go:334] "Generic (PLEG): container finished" podID="7643d370-6497-4a94-b0e7-2db66b56b687" containerID="2e9901bec8566076e75f91059a59d717d36b63bac1c4bbb77850591b24d67bb2" exitCode=143 Dec 03 12:39:14 crc kubenswrapper[4702]: I1203 12:39:14.541076 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-ds4ss" podUID="9432a2a8-8932-4734-a69d-8976764f1dab" containerName="registry-server" probeResult="failure" output="command timed out" Dec 03 12:39:14 crc kubenswrapper[4702]: I1203 12:39:14.541195 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-ds4ss" podUID="9432a2a8-8932-4734-a69d-8976764f1dab" containerName="registry-server" probeResult="failure" output="command timed out" Dec 03 12:39:14 crc kubenswrapper[4702]: I1203 12:39:14.835528 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-5655c58dd6-l8p84" podUID="40ccf765-6eb2-49e3-8f2c-635b1981639e" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.44:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:14 crc kubenswrapper[4702]: I1203 12:39:14.835542 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="cert-manager/cert-manager-webhook-5655c58dd6-l8p84" podUID="40ccf765-6eb2-49e3-8f2c-635b1981639e" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.44:6080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:15 crc kubenswrapper[4702]: I1203 12:39:15.019018 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-pscld" podUID="042c7f5b-da64-4f42-a2b2-58d04b73c12a" containerName="hostpath-provisioner" probeResult="failure" output="Get \"http://10.217.0.42:9898/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:15 crc kubenswrapper[4702]: I1203 12:39:15.534803 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-engine-79b68c69ff-kvztw" podUID="a30c0f33-d7bc-456c-be27-26e860ca8f28" containerName="heat-engine" probeResult="failure" output="command timed out" Dec 03 12:39:15 crc kubenswrapper[4702]: I1203 12:39:15.534879 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-zxwqw" 
podUID="9ec2138c-eb31-401f-b62d-d2823fe0523f" containerName="nmstate-handler" probeResult="failure" output="command timed out" Dec 03 12:39:15 crc kubenswrapper[4702]: I1203 12:39:15.534921 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-6sd9l" podUID="ea8c3262-d494-4427-8228-df9584c00ca1" containerName="registry-server" probeResult="failure" output="command timed out" Dec 03 12:39:15 crc kubenswrapper[4702]: I1203 12:39:15.535037 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ovn-northd-0" podUID="6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9" containerName="ovn-northd" probeResult="failure" output="command timed out" Dec 03 12:39:15 crc kubenswrapper[4702]: I1203 12:39:15.536015 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/heat-engine-79b68c69ff-kvztw" podUID="a30c0f33-d7bc-456c-be27-26e860ca8f28" containerName="heat-engine" probeResult="failure" output="command timed out" Dec 03 12:39:15 crc kubenswrapper[4702]: I1203 12:39:15.536146 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-6sd9l" podUID="ea8c3262-d494-4427-8228-df9584c00ca1" containerName="registry-server" probeResult="failure" output="command timed out" Dec 03 12:39:15 crc kubenswrapper[4702]: I1203 12:39:15.536154 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9" containerName="ovn-northd" probeResult="failure" output="command timed out" Dec 03 12:39:15 crc kubenswrapper[4702]: I1203 12:39:15.898005 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" podUID="a7faac4b-b558-4106-af27-4daf6a1db1af" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:15 crc kubenswrapper[4702]: I1203 12:39:15.939530 4702 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-v6p66 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": context deadline exceeded" start-of-body= Dec 03 12:39:15 crc kubenswrapper[4702]: I1203 12:39:15.949449 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" podUID="5bad766d-e524-4670-b353-56e92df2f744" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": context deadline exceeded" Dec 03 12:39:15 crc kubenswrapper[4702]: I1203 12:39:15.949489 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" podUID="a7faac4b-b558-4106-af27-4daf6a1db1af" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:15.957055 4702 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:15.957105 4702 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.009360 4702 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fvzsf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.009420 4702 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fvzsf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.009498 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" podUID="3895be1f-db04-45b3-bd8c-cf2ab8c2aa43" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.009418 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" podUID="3895be1f-db04-45b3-bd8c-cf2ab8c2aa43" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.292152 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.292527 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.292143 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.292648 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers)" Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.292161 4702 patch_prober.go:28] interesting pod/console-operator-58897d9998-nnrsp container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.292196 4702 patch_prober.go:28] interesting pod/console-operator-58897d9998-nnrsp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.292795 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" podUID="c6edf33c-728d-482f-ad5c-ceb85dae3b75" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.292210 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.292837 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.292224 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.292869 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.292755 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" podUID="c6edf33c-728d-482f-ad5c-ceb85dae3b75" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.527987 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-sr26m" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="registry-server" probeResult="failure" output="command timed out" Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.528389 4702 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="7ad9fb7a-5481-4bb8-9a9b-99fda2021704" containerName="prometheus" probeResult="failure" output="command timed out" Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.532216 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-qhss9" podUID="480aa817-7d43-4ea8-9099-06bcb431e578" containerName="registry-server" probeResult="failure" output="command timed out" Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.534463 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-qhss9" podUID="480aa817-7d43-4ea8-9099-06bcb431e578" containerName="registry-server" probeResult="failure" output="command timed out" Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.534594 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="7ad9fb7a-5481-4bb8-9a9b-99fda2021704" containerName="prometheus" probeResult="failure" output="command timed out" Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.564393 4702 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6kdsp container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.564490 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" podUID="3de04148-0009-427b-8055-a1c5dadb8274" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.564570 4702 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6kdsp container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.564647 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" podUID="3de04148-0009-427b-8055-a1c5dadb8274" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.669930 4702 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-vchd7 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.669930 4702 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-vchd7 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout 
exceeded while awaiting headers)" start-of-body= Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.670061 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" podUID="a49f8d97-9fa5-44b6-bd39-e35d4d70b33c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.669997 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" podUID="a49f8d97-9fa5-44b6-bd39-e35d4d70b33c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.701988 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="90e9786f-3e0d-4a23-b624-b49a3d386784" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.170:9090/-/ready\": context deadline exceeded" Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.702040 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="90e9786f-3e0d-4a23-b624-b49a3d386784" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.170:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.898248 4702 patch_prober.go:28] interesting pod/metrics-server-6975dd785d-5bvc2 container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.76:10250/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:16 crc kubenswrapper[4702]: I1203 12:39:16.898607 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" podUID="56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.76:10250/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.022036 4702 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-xxj8s container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.022087 4702 patch_prober.go:28] interesting pod/loki-operator-controller-manager-5688675f7c-q6w79 container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.48:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.022135 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" podUID="0276c6fb-ba7a-459f-9610-34a03593669b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.43:8443/healthz\": context deadline 
exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.022166 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" podUID="672e4a37-26c7-4378-a524-57fba88aec53" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.48:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.022235 4702 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-xxj8s container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.43:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.022330 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" podUID="0276c6fb-ba7a-459f-9610-34a03593669b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.43:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.022426 4702 patch_prober.go:28] interesting pod/loki-operator-controller-manager-5688675f7c-q6w79 container/manager namespace/openshift-operators-redhat: Liveness probe status=failure output="Get \"http://10.217.0.48:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.022450 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" podUID="672e4a37-26c7-4378-a524-57fba88aec53" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.48:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.022479 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-68ktn container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.022549 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-68ktn container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.022562 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" podUID="6cbbba51-9166-42cb-917c-7c634351e5c9" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.022589 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" podUID="6cbbba51-9166-42cb-917c-7c634351e5c9" containerName="opa" 
probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.022678 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-mdlcz container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.022705 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" podUID="72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.022775 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-mdlcz container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.022813 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" podUID="72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.041666 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-pscld" podUID="042c7f5b-da64-4f42-a2b2-58d04b73c12a" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.044866 4702 patch_prober.go:28] interesting pod/image-registry-66df7c8f76-8p7q4 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="Get \"https://10.217.0.61:5000/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.044752 4702 patch_prober.go:28] interesting pod/image-registry-66df7c8f76-8p7q4 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.61:5000/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.044967 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" podUID="11c38a7f-7709-4e96-b309-c987c2301610" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.61:5000/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.044977 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" podUID="11c38a7f-7709-4e96-b309-c987c2301610" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.61:5000/healthz\": 
net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.194666 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-wh75l" event={"ID":"7643d370-6497-4a94-b0e7-2db66b56b687","Type":"ContainerStarted","Data":"a1c37722e3fdebc0f2250b4a21a8fa6d3fdf26981fbc839b5e9162f71252884a"} Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.269671 4702 patch_prober.go:28] interesting pod/console-74bd8cbfc6-krjzs container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.136:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.270087 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-74bd8cbfc6-krjzs" podUID="0248a046-0fe5-47b8-a644-50dfc9a20a75" containerName="console" probeResult="failure" output="Get \"https://10.217.0.136:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.359738 4702 patch_prober.go:28] interesting pod/monitoring-plugin-74f4cdd6c8-9czrj container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.77:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.359903 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-74f4cdd6c8-9czrj" podUID="7ff9bb87-9ede-4d63-a2f5-5d1c49b30c29" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.77:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.628049 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" podUID="1d86df9d-86a7-4980-abd0-488d98f6b2fb" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.628049 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" podUID="1d86df9d-86a7-4980-abd0-488d98f6b2fb" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.943843 4702 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fv5c5 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.943866 4702 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fv5c5 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get 
\"https://10.217.0.22:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.943926 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" podUID="a456460b-47a3-48ef-a98b-4f67709d5939" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:17 crc kubenswrapper[4702]: I1203 12:39:17.943992 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" podUID="a456460b-47a3-48ef-a98b-4f67709d5939" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:18 crc kubenswrapper[4702]: I1203 12:39:18.237689 4702 trace.go:236] Trace[596787450]: "Calculate volume metrics of storage for pod openshift-logging/logging-loki-compactor-0" (03-Dec-2025 12:39:12.983) (total time: 5253ms): Dec 03 12:39:18 crc kubenswrapper[4702]: Trace[596787450]: [5.253220538s] [5.253220538s] END Dec 03 12:39:18 crc kubenswrapper[4702]: I1203 12:39:18.237694 4702 trace.go:236] Trace[341624251]: "Calculate volume metrics of registry-storage for pod openshift-image-registry/image-registry-66df7c8f76-8p7q4" (03-Dec-2025 12:39:08.890) (total time: 9346ms): Dec 03 12:39:18 crc kubenswrapper[4702]: Trace[341624251]: [9.346424654s] [9.346424654s] END Dec 03 12:39:18 crc kubenswrapper[4702]: I1203 12:39:18.427984 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-pscld" podUID="042c7f5b-da64-4f42-a2b2-58d04b73c12a" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:39:18 crc kubenswrapper[4702]: I1203 12:39:18.453270 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-7ntqc" podUID="38c7c63b-db59-4055-aee0-99ea082bd8f7" containerName="registry-server" probeResult="failure" output=< Dec 03 12:39:18 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:39:18 crc kubenswrapper[4702]: > Dec 03 12:39:18 crc kubenswrapper[4702]: I1203 12:39:18.526417 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-7ntqc" podUID="38c7c63b-db59-4055-aee0-99ea082bd8f7" containerName="registry-server" probeResult="failure" output="command timed out" Dec 03 12:39:18 crc kubenswrapper[4702]: I1203 12:39:18.526417 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-m6cqt" podUID="0f2dd872-6ac4-4527-9a91-218b1de5ed5e" containerName="registry-server" probeResult="failure" output="command timed out" Dec 03 12:39:18 crc kubenswrapper[4702]: I1203 12:39:18.526507 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-m6cqt" podUID="0f2dd872-6ac4-4527-9a91-218b1de5ed5e" containerName="registry-server" probeResult="failure" output="command timed out" Dec 03 12:39:18 crc kubenswrapper[4702]: I1203 12:39:18.530447 4702 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openstack/ceilometer-0" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Dec 03 12:39:18 crc kubenswrapper[4702]: I1203 12:39:18.670411 4702 patch_prober.go:28] interesting pod/thanos-querier-58bc8556f9-kwpsl container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.74:9091/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:18 crc kubenswrapper[4702]: I1203 12:39:18.670754 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" podUID="b2347d45-1235-4ed0-9f48-6e0eb4c781f8" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.74:9091/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 12:39:19.016338 4702 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 12:39:19.016481 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 12:39:19.253135 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg" podUID="8f6320ff-4661-46be-80e1-8d97f09fe789" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.100:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 12:39:19.336071 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-htxmz" podUID="530ef793-9485-4c45-86ba-531906f2085a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.102:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 12:39:19.336141 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg" podUID="8f6320ff-4661-46be-80e1-8d97f09fe789" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.100:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 12:39:19.418011 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w2vmt" podUID="182ca1cb-9499-4cf7-aeae-c35c7038814c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.101:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 
12:39:19.459185 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w2vmt" podUID="182ca1cb-9499-4cf7-aeae-c35c7038814c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.101:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 12:39:19.608470 4702 patch_prober.go:28] interesting pod/route-controller-manager-84cf75c7c5-96cmd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 12:39:19.608602 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" podUID="e3e08d4a-20c7-430a-a3d9-988d64e6a6b4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 12:39:19.608777 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92" podUID="7d79c3e6-cab8-4e77-84da-85ae2a1eb3b3" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.99:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 12:39:19.608938 4702 patch_prober.go:28] interesting pod/route-controller-manager-84cf75c7c5-96cmd container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 12:39:19.608956 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" podUID="e3e08d4a-20c7-430a-a3d9-988d64e6a6b4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 12:39:19.608989 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-htxmz" podUID="530ef793-9485-4c45-86ba-531906f2085a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.102:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 12:39:19.609035 4702 patch_prober.go:28] interesting pod/controller-manager-548478b8dd-9254p container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 12:39:19.609055 4702 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" podUID="fecafd1a-bd80-46ea-8839-dbfc2d364a96" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 12:39:19.609095 4702 patch_prober.go:28] interesting pod/controller-manager-548478b8dd-9254p container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 12:39:19.609134 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" podUID="fecafd1a-bd80-46ea-8839-dbfc2d364a96" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 12:39:19.609180 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" podUID="6e99cffd-b82e-46c9-8cbd-fe8c24507385" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.91:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 12:39:19.690979 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92" podUID="7d79c3e6-cab8-4e77-84da-85ae2a1eb3b3" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.99:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 12:39:19.691103 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c" podUID="62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.103:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 12:39:19.734078 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" podUID="2cb93136-1d69-4bc8-9c42-aee1f6638aa6" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.93:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 12:39:19.775017 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" podUID="2cb93136-1d69-4bc8-9c42-aee1f6638aa6" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.93:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:19 crc kubenswrapper[4702]: I1203 12:39:19.854315 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-wh75l" Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.031989 4702 patch_prober.go:28] interesting 
pod/marketplace-operator-79b997595-79n82 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.63:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.032319 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-79n82" podUID="f2c1609d-33a3-444f-9370-24495b15b3e0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.63:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.032121 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c" podUID="62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.103:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.278984 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-hxvr6" podUID="e3c1b694-60b8-4b5d-b8d5-40418e60aa4b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.361182 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" podUID="b877c7a7-0b88-4238-8a21-314ef1525996" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.361237 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw" podUID="1a7e4f08-8a48-44d5-944b-4eaf9d9518b5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.403326 4702 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-79n82 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.63:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.403417 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-79n82" podUID="f2c1609d-33a3-444f-9370-24495b15b3e0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.63:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.403892 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw" podUID="1a7e4f08-8a48-44d5-944b-4eaf9d9518b5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" 
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.404296 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr" podUID="224e5de0-3f58-4243-80e5-212cf016ea46" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.404456 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr" podUID="224e5de0-3f58-4243-80e5-212cf016ea46" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.104:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.405296 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t" podUID="9b295e92-630f-4544-b741-50ece5e79f4c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.406095 4702 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.406175 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.489947 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-hxvr6" podUID="e3c1b694-60b8-4b5d-b8d5-40418e60aa4b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.489973 4702 patch_prober.go:28] interesting pod/observability-operator-d8bb48f5d-55ql9 container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.6:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.490003 4702 patch_prober.go:28] interesting pod/oauth-openshift-6548f7c795-c9kwd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.62:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.490053 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-d8bb48f5d-55ql9" podUID="89d80ae9-23a4-4c91-a04d-7343d8a4df05" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.6:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.490078 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" podUID="0c75375b-08b0-4a81-adca-de576c8ff268" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.62:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.490093 4702 patch_prober.go:28] interesting pod/oauth-openshift-6548f7c795-c9kwd container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.62:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.490200 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" podUID="0c75375b-08b0-4a81-adca-de576c8ff268" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.62:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.490276 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t" podUID="9b295e92-630f-4544-b741-50ece5e79f4c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.532348 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7" podUID="4b90477f-d1b5-4f03-ab08-2476d44a9cff" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.532502 4702 patch_prober.go:28] interesting pod/perses-operator-5446b9c989-vpr8z container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.14:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.532531 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" podUID="ec0726c3-58ef-4a22-8e00-bae32d7d66ca" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.14:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.533400 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7" podUID="4b90477f-d1b5-4f03-ab08-2476d44a9cff" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.534370 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" podUID="b877c7a7-0b88-4238-8a21-314ef1525996" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.535020 4702 patch_prober.go:28] interesting pod/observability-operator-d8bb48f5d-55ql9 container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.6:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.535107 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-d8bb48f5d-55ql9" podUID="89d80ae9-23a4-4c91-a04d-7343d8a4df05" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.6:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.539349 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ovn-northd-0" podUID="6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9" containerName="ovn-northd" probeResult="failure" output="command timed out"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.539434 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ovn-northd-0"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.539903 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9" containerName="ovn-northd" probeResult="failure" output="command timed out"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.539973 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.543193 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ovn-northd" containerStatusID={"Type":"cri-o","ID":"95c785befd846128dc84efe58e8b50215dbf5d7e298f228a2d566c89f10ec038"} pod="openstack/ovn-northd-0" containerMessage="Container ovn-northd failed liveness probe, will be restarted"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.543966 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9" containerName="ovn-northd" containerID="cri-o://95c785befd846128dc84efe58e8b50215dbf5d7e298f228a2d566c89f10ec038" gracePeriod=30
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.701028 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" podUID="1d60d4ab-7bac-4fd1-9aad-c07ba1513d41" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.701592 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t" podUID="523c06cc-9816-4252-ac00-dc7928dae009" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.742232 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds" podUID="8de75640-5551-4d04-830d-64f0fbb7847a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.824085 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds" podUID="8de75640-5551-4d04-830d-64f0fbb7847a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.865016 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t" podUID="523c06cc-9816-4252-ac00-dc7928dae009" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.865144 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" podUID="1d60d4ab-7bac-4fd1-9aad-c07ba1513d41" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:20 crc kubenswrapper[4702]: I1203 12:39:20.948658 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv" podUID="5cecb29f-7ef9-4177-8e01-a776b70bbb03" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.032587 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf" podUID="84fc908a-9418-4e6e-ac17-9e725524f9ce" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.032742 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp" podUID="b6faaca6-f017-42ac-95e4-d73ae3e8e519" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.201212 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv" podUID="5cecb29f-7ef9-4177-8e01-a776b70bbb03" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.201212 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-slqp5" podUID="b3a5cd30-f098-4e9c-bbb0-f45305893017" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.94:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.366090 4702 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fv5c5 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.366199 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" podUID="a456460b-47a3-48ef-a98b-4f67709d5939" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.449070 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf" podUID="84fc908a-9418-4e6e-ac17-9e725524f9ce" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.449158 4702 prober.go:107] "Probe failed" probeType="Startup" pod="metallb-system/frr-k8s-wh75l" podUID="7643d370-6497-4a94-b0e7-2db66b56b687" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.449070 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-f8648f98b-2npsf" podUID="d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.95:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.528385 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="7ad9fb7a-5481-4bb8-9a9b-99fda2021704" containerName="prometheus" probeResult="failure" output="command timed out"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.528588 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.532228 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-wh75l" podUID="7643d370-6497-4a94-b0e7-2db66b56b687" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.532332 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn" podUID="afc37ae6-c944-4cb1-81b6-c810ea1c3b31" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.559484 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="7ad9fb7a-5481-4bb8-9a9b-99fda2021704" containerName="prometheus" probeResult="failure" output="command timed out"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.616032 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq" podUID="ae6dac10-29ba-4bb8-8a0c-68a2bad519af" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.616083 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-wh75l" podUID="7643d370-6497-4a94-b0e7-2db66b56b687" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.616208 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n" podUID="5edf270b-74cb-42d2-82dc-7953f243c6dc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.616400 4702 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fv5c5 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.616384 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n" podUID="5edf270b-74cb-42d2-82dc-7953f243c6dc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.616432 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" podUID="a456460b-47a3-48ef-a98b-4f67709d5939" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.616078 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-slqp5" podUID="b3a5cd30-f098-4e9c-bbb0-f45305893017" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.94:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.616831 4702 patch_prober.go:28] interesting pod/logging-loki-distributor-76cc67bf56-xrnp2 container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.616861 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" podUID="508c1eef-dbbc-4c32-8d2e-dbb797c72461" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.616856 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="30059ea4-152f-420c-b8cc-234ebab96b47" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.251:8080/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.616886 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="30059ea4-152f-420c-b8cc-234ebab96b47" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.251:8081/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.616944 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.616949 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-f8648f98b-2npsf" podUID="d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.95:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.617225 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn" podUID="afc37ae6-c944-4cb1-81b6-c810ea1c3b31" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.617309 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq" podUID="ae6dac10-29ba-4bb8-8a0c-68a2bad519af" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.617402 4702 patch_prober.go:28] interesting pod/logging-loki-querier-5895d59bb8-bhqrp container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.617428 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" podUID="042cc406-7960-493a-a19a-cb5590f8ff1f" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.617489 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.702279 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="90e9786f-3e0d-4a23-b624-b49a3d386784" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.170:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.702313 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="90e9786f-3e0d-4a23-b624-b49a3d386784" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.170:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.702511 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.789296 4702 patch_prober.go:28] interesting pod/logging-loki-query-frontend-84558f7c9f-dflgw container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.789395 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" podUID="dbea18cb-2e45-4d86-bc00-17a82f0a78ff" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.789485 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.963039 4702 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.963117 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.997355 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-68ktn container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.997452 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" podUID="6cbbba51-9166-42cb-917c-7c634351e5c9" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.997558 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-68ktn container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": context deadline exceeded" start-of-body=
Dec 03 12:39:21 crc kubenswrapper[4702]: I1203 12:39:21.997626 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" podUID="6cbbba51-9166-42cb-917c-7c634351e5c9" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": context deadline exceeded"
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.017294 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-mdlcz container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.017652 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" podUID="72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.017421 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-mdlcz container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.017901 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" podUID="72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.336932 4702 patch_prober.go:28] interesting pod/logging-loki-distributor-76cc67bf56-xrnp2 container/loki-distributor namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.53:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.337294 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" podUID="508c1eef-dbbc-4c32-8d2e-dbb797c72461" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.53:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.465619 4702 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.465772 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.527827 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="92266ac3-f0a6-4e68-9e88-9aa2900e1fe3" containerName="galera" probeResult="failure" output="command timed out"
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.575601 4702 patch_prober.go:28] interesting pod/logging-loki-querier-5895d59bb8-bhqrp container/loki-querier namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.575745 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" podUID="042cc406-7960-493a-a19a-cb5590f8ff1f" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.618360 4702 patch_prober.go:28] interesting pod/logging-loki-querier-5895d59bb8-bhqrp container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.618427 4702 patch_prober.go:28] interesting pod/logging-loki-distributor-76cc67bf56-xrnp2 container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.618526 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" podUID="508c1eef-dbbc-4c32-8d2e-dbb797c72461" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.618451 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" podUID="042cc406-7960-493a-a19a-cb5590f8ff1f" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.851061 4702 patch_prober.go:28] interesting pod/logging-loki-query-frontend-84558f7c9f-dflgw container/loki-query-frontend namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.851160 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" podUID="dbea18cb-2e45-4d86-bc00-17a82f0a78ff" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.851252 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-bv8pf" podUID="4be204bf-b480-4d77-9ced-34c6668afa14" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.851298 4702 patch_prober.go:28] interesting pod/logging-loki-query-frontend-84558f7c9f-dflgw container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.851316 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" podUID="dbea18cb-2e45-4d86-bc00-17a82f0a78ff" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.851354 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-bv8pf" podUID="4be204bf-b480-4d77-9ced-34c6668afa14" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.851402 4702 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.75:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.851420 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="6ce47478-68cc-46a9-99c3-cb20947e63c5" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.75:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.997845 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-68ktn container/gateway namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.56:8081/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.997943 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" podUID="6cbbba51-9166-42cb-917c-7c634351e5c9" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.997888 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-68ktn container/opa namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.56:8083/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:22 crc kubenswrapper[4702]: I1203 12:39:22.998095 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" podUID="6cbbba51-9166-42cb-917c-7c634351e5c9" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.017914 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-mdlcz container/opa namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.57:8083/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.018020 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" podUID="72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/live\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.018135 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-mdlcz container/gateway namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.57:8081/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.018248 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" podUID="72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.070111 4702 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.80:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.070221 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="358cd791-ccf7-4655-b446-b800598e773c" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.80:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.086351 4702 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.81:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.086437 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="07e2709a-6aac-4b21-8fc8-bfc21992aae3" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.81:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.136272 4702 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-q4rbl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.136317 4702 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-q4rbl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.136377 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" podUID="44af00fd-b9f6-4e74-ad67-581e4ca7527c" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.136391 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" podUID="44af00fd-b9f6-4e74-ad67-581e4ca7527c" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.544975 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" podUID="6ab9886b-f724-43b9-b365-4896d35349b9" containerName="sbdb" probeResult="failure" output="command timed out"
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.545095 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="c91e1dc8-ef80-407f-ac34-4c9ab29026f7" containerName="galera" probeResult="failure" output="command timed out"
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.545194 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.548242 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="c91e1dc8-ef80-407f-ac34-4c9ab29026f7" containerName="galera" probeResult="failure" output="command timed out"
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.548341 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.550905 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-54ljx" podUID="6ab9886b-f724-43b9-b365-4896d35349b9" containerName="nbdb" probeResult="failure" output="command timed out"
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.638201 4702 patch_prober.go:28] interesting pod/thanos-querier-58bc8556f9-kwpsl container/kube-rbac-proxy-web namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.74:9091/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.638280 4702 patch_prober.go:28] interesting pod/thanos-querier-58bc8556f9-kwpsl container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.74:9091/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.638380 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" podUID="b2347d45-1235-4ed0-9f48-6e0eb4c781f8" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.74:9091/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.638293 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" podUID="b2347d45-1235-4ed0-9f48-6e0eb4c781f8" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.74:9091/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.844956 4702 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.75:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.845036 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-ingester-0" podUID="6ce47478-68cc-46a9-99c3-cb20947e63c5" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.75:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.950330 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-ds4ss" podUID="9432a2a8-8932-4734-a69d-8976764f1dab" containerName="registry-server" probeResult="failure" output=<
Dec 03 12:39:23 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s
Dec 03 12:39:23 crc kubenswrapper[4702]: >
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.950424 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-ds4ss" podUID="9432a2a8-8932-4734-a69d-8976764f1dab" containerName="registry-server" probeResult="failure" output=<
Dec 03 12:39:23 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s
Dec 03 12:39:23 crc kubenswrapper[4702]: >
Dec 03 12:39:23 crc kubenswrapper[4702]: E1203 12:39:23.951231 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="95c785befd846128dc84efe58e8b50215dbf5d7e298f228a2d566c89f10ec038" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Dec 03 12:39:23 crc kubenswrapper[4702]: E1203 12:39:23.953578 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="95c785befd846128dc84efe58e8b50215dbf5d7e298f228a2d566c89f10ec038" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.953615 4702 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fv5c5 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.953655 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" podUID="a456460b-47a3-48ef-a98b-4f67709d5939" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.953693 4702 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fv5c5 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.953735 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5"
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.953741 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" podUID="a456460b-47a3-48ef-a98b-4f67709d5939" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.953806 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5"
Dec 03 12:39:23 crc kubenswrapper[4702]: E1203 12:39:23.955337 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="95c785befd846128dc84efe58e8b50215dbf5d7e298f228a2d566c89f10ec038" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Dec 03 12:39:23 crc kubenswrapper[4702]: E1203 12:39:23.955369 4702 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9" containerName="ovn-northd"
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.955447 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"56666e8a78a82b8a80949b132ba3edf357fbbc19b68d6b6c216a6290e3608589"} pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted"
Dec 03 12:39:23 crc kubenswrapper[4702]: I1203 12:39:23.956398 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" podUID="a456460b-47a3-48ef-a98b-4f67709d5939" containerName="openshift-config-operator" containerID="cri-o://56666e8a78a82b8a80949b132ba3edf357fbbc19b68d6b6c216a6290e3608589" gracePeriod=30
Dec 03 12:39:24 crc kubenswrapper[4702]: I1203 12:39:24.031057 4702 patch_prober.go:28] interesting pod/nmstate-webhook-5f6d4c5ccb-r6jd6 container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.87:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:24 crc kubenswrapper[4702]: I1203 12:39:24.031145 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6" podUID="9370f81f-7868-4a16-9cec-7786257cdcbd" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.87:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:24 crc kubenswrapper[4702]: I1203 12:39:24.070501 4702 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.80:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:24 crc kubenswrapper[4702]: I1203 12:39:24.070581 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-compactor-0" podUID="358cd791-ccf7-4655-b446-b800598e773c" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.80:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:24 crc kubenswrapper[4702]: I1203 12:39:24.085895 4702 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.81:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:24 crc kubenswrapper[4702]: I1203 12:39:24.085972 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="07e2709a-6aac-4b21-8fc8-bfc21992aae3" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.81:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:24 crc kubenswrapper[4702]: I1203 12:39:24.304049 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="galera" containerStatusID={"Type":"cri-o","ID":"9d8dd452e7ea9b777cedcc2818d1bb8e2ae923333f5e8d79d15da1edf56fb290"} pod="openstack/openstack-cell1-galera-0" containerMessage="Container galera failed liveness probe, will be restarted"
Dec 03 12:39:24 crc kubenswrapper[4702]: I1203 12:39:24.527001 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-zxwqw" podUID="9ec2138c-eb31-401f-b62d-d2823fe0523f" containerName="nmstate-handler" probeResult="failure" output="command timed out"
Dec 03 12:39:24 crc kubenswrapper[4702]: I1203 12:39:24.527001 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="c91e1dc8-ef80-407f-ac34-4c9ab29026f7" containerName="galera" probeResult="failure" output="command timed out"
Dec 03 12:39:24 crc kubenswrapper[4702]: I1203 12:39:24.532149 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out"
Dec 03 12:39:24 crc kubenswrapper[4702]: I1203 12:39:24.532377 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0"
Dec 03 12:39:24 crc kubenswrapper[4702]: I1203 12:39:24.542476 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"a3ebf64628d67a49aab3d7c8c37af7c6fbdf0e61098265182dbf34743899b6e5"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted"
Dec 03 12:39:24 crc kubenswrapper[4702]: I1203 12:39:24.542652 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-central-agent" containerID="cri-o://a3ebf64628d67a49aab3d7c8c37af7c6fbdf0e61098265182dbf34743899b6e5" gracePeriod=30
Dec 03 12:39:24 crc kubenswrapper[4702]: I1203 12:39:24.661851 4702 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-nx9wv container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:24 crc kubenswrapper[4702]: I1203 12:39:24.661982 4702 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-nx9wv container/oauth-apiserver namespace/openshift-oauth-apiserver: Liveness probe status=failure output="Get \"https://10.217.0.8:8443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:24 crc kubenswrapper[4702]: I1203 12:39:24.662347 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" podUID="d7be37fc-7374-46ae-a0e3-1cafab3430ec" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:24 crc kubenswrapper[4702]: I1203 12:39:24.662240 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" podUID="d7be37fc-7374-46ae-a0e3-1cafab3430ec" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:24 crc kubenswrapper[4702]: I1203 12:39:24.704444 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="90e9786f-3e0d-4a23-b624-b49a3d386784" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.170:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:24 crc kubenswrapper[4702]: I1203 12:39:24.954515 4702 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fv5c5 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:24 crc kubenswrapper[4702]: I1203 12:39:24.954596 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" podUID="a456460b-47a3-48ef-a98b-4f67709d5939" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:25 crc kubenswrapper[4702]: I1203 12:39:25.527169 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="c91e1dc8-ef80-407f-ac34-4c9ab29026f7" containerName="galera" probeResult="failure" output="command timed out"
Dec 03 12:39:25 crc kubenswrapper[4702]: I1203 12:39:25.531328 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-6sd9l" podUID="ea8c3262-d494-4427-8228-df9584c00ca1" containerName="registry-server" probeResult="failure" output="command timed out"
Dec 03 12:39:25 crc kubenswrapper[4702]: I1203 12:39:25.531489 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-6sd9l" podUID="ea8c3262-d494-4427-8228-df9584c00ca1" containerName="registry-server" probeResult="failure" output="command timed out"
Dec 03 12:39:25 crc kubenswrapper[4702]: I1203 12:39:25.535343 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="7ad9fb7a-5481-4bb8-9a9b-99fda2021704" containerName="prometheus" probeResult="failure" output="command timed out"
Dec 03 12:39:25 crc kubenswrapper[4702]: I1203 12:39:25.793561 4702 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:25 crc kubenswrapper[4702]: I1203 12:39:25.793681 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:25 crc kubenswrapper[4702]: I1203 12:39:25.819505 4702 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-v6p66 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:25 crc kubenswrapper[4702]: I1203 12:39:25.819589 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" podUID="5bad766d-e524-4670-b353-56e92df2f744" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:25 crc kubenswrapper[4702]: I1203 12:39:25.895089 4702 prober.go:107] "Probe failed" probeType="Startup" pod="metallb-system/frr-k8s-wh75l" podUID="7643d370-6497-4a94-b0e7-2db66b56b687" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:25 crc kubenswrapper[4702]: I1203 12:39:25.909210 4702 patch_prober.go:28] interesting pod/loki-operator-controller-manager-5688675f7c-q6w79 container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.48:8081/readyz\": dial tcp 10.217.0.48:8081: connect: connection refused" start-of-body=
Dec 03 12:39:25 crc kubenswrapper[4702]: I1203 12:39:25.909268 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" podUID="672e4a37-26c7-4378-a524-57fba88aec53" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.48:8081/readyz\": dial tcp 10.217.0.48:8081: connect: connection refused"
Dec 03 12:39:25 crc kubenswrapper[4702]: I1203 12:39:25.947101 4702 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fvzsf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:25 crc kubenswrapper[4702]: I1203 12:39:25.947520 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" podUID="3895be1f-db04-45b3-bd8c-cf2ab8c2aa43" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:25 crc kubenswrapper[4702]: I1203 12:39:25.951576 4702 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fvzsf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:25 crc kubenswrapper[4702]: I1203 12:39:25.951669 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" podUID="3895be1f-db04-45b3-bd8c-cf2ab8c2aa43" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.218110 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.218245 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.300028 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.300113 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.300178 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.300113 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.300115 4702 patch_prober.go:28] interesting pod/console-operator-58897d9998-nnrsp container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.300272 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" podUID="c6edf33c-728d-482f-ad5c-ceb85dae3b75" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.300143 4702 patch_prober.go:28] interesting pod/console-operator-58897d9998-nnrsp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.300315 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" podUID="c6edf33c-728d-482f-ad5c-ceb85dae3b75" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.300354 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.300371 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.528810 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="7ad9fb7a-5481-4bb8-9a9b-99fda2021704" containerName="prometheus" probeResult="failure" output="command timed out"
Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.530318 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-qhss9" podUID="480aa817-7d43-4ea8-9099-06bcb431e578" containerName="registry-server" probeResult="failure" output="command timed out"
Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.530485 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-qhss9" podUID="480aa817-7d43-4ea8-9099-06bcb431e578" containerName="registry-server" probeResult="failure" output="command timed out"
Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.530850 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-sr26m" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="registry-server" probeResult="failure" output="command timed out"
Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.565035 4702 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6kdsp container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.565047 4702 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6kdsp container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.565519 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp"
podUID="3de04148-0009-427b-8055-a1c5dadb8274" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.565445 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" podUID="3de04148-0009-427b-8055-a1c5dadb8274" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.669994 4702 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-vchd7 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.670356 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" podUID="a49f8d97-9fa5-44b6-bd39-e35d4d70b33c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.670020 4702 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-vchd7 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.670482 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" podUID="a49f8d97-9fa5-44b6-bd39-e35d4d70b33c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.702161 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="90e9786f-3e0d-4a23-b624-b49a3d386784" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.170:9090/-/healthy\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.867080 4702 patch_prober.go:28] interesting pod/metrics-server-6975dd785d-5bvc2 container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.76:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.867140 4702 patch_prober.go:28] interesting pod/metrics-server-6975dd785d-5bvc2 container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.76:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:26 crc 
kubenswrapper[4702]: I1203 12:39:26.867192 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" podUID="56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.76:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.867255 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" podUID="56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.76:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.905379 4702 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-xxj8s container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.905452 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" podUID="0276c6fb-ba7a-459f-9610-34a03593669b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.43:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.905920 4702 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-xxj8s container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.43:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:26 crc kubenswrapper[4702]: I1203 12:39:26.906023 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" podUID="0276c6fb-ba7a-459f-9610-34a03593669b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.43:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:26.984331 4702 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fv5c5 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:26.984408 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" podUID="a456460b-47a3-48ef-a98b-4f67709d5939" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:26.984607 4702 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver 
namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:6443/livez?exclude=etcd\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:26.984672 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez?exclude=etcd\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:26.996951 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-68ktn container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:26.997024 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" podUID="6cbbba51-9166-42cb-917c-7c634351e5c9" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:26.997086 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-68ktn container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:26.997112 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" podUID="6cbbba51-9166-42cb-917c-7c634351e5c9" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:27.017736 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-mdlcz container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:27.017864 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" podUID="72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:27.017869 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-mdlcz container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:27.017960 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" podUID="72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653" 
containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:27.027061 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-pscld" podUID="042c7f5b-da64-4f42-a2b2-58d04b73c12a" containerName="hostpath-provisioner" probeResult="failure" output="Get \"http://10.217.0.42:9898/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:27.270729 4702 patch_prober.go:28] interesting pod/console-74bd8cbfc6-krjzs container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.136:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:27.272336 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-74bd8cbfc6-krjzs" podUID="0248a046-0fe5-47b8-a644-50dfc9a20a75" containerName="console" probeResult="failure" output="Get \"https://10.217.0.136:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:27.360232 4702 patch_prober.go:28] interesting pod/monitoring-plugin-74f4cdd6c8-9czrj container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.77:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:27.360543 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-74f4cdd6c8-9czrj" podUID="7ff9bb87-9ede-4d63-a2f5-5d1c49b30c29" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.77:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:27.369416 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" event={"ID":"672e4a37-26c7-4378-a524-57fba88aec53","Type":"ContainerDied","Data":"32a69511f4925e8eb4a71c4bc1a97060fed7d4e0e1a027fbb60f458f02518a79"} Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:27.369974 4702 generic.go:334] "Generic (PLEG): container finished" podID="672e4a37-26c7-4378-a524-57fba88aec53" containerID="32a69511f4925e8eb4a71c4bc1a97060fed7d4e0e1a027fbb60f458f02518a79" exitCode=1 Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:27.371868 4702 scope.go:117] "RemoveContainer" containerID="32a69511f4925e8eb4a71c4bc1a97060fed7d4e0e1a027fbb60f458f02518a79" Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:27.374128 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9/ovn-northd/0.log" Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:27.374299 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9","Type":"ContainerDied","Data":"95c785befd846128dc84efe58e8b50215dbf5d7e298f228a2d566c89f10ec038"} Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:27.374178 
4702 generic.go:334] "Generic (PLEG): container finished" podID="6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9" containerID="95c785befd846128dc84efe58e8b50215dbf5d7e298f228a2d566c89f10ec038" exitCode=139 Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:27.627079 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" podUID="1d86df9d-86a7-4980-abd0-488d98f6b2fb" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:27 crc kubenswrapper[4702]: I1203 12:39:27.705228 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="90e9786f-3e0d-4a23-b624-b49a3d386784" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.170:9090/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:28 crc kubenswrapper[4702]: I1203 12:39:28.143873 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-m6cqt" podUID="0f2dd872-6ac4-4527-9a91-218b1de5ed5e" containerName="registry-server" probeResult="failure" output=< Dec 03 12:39:28 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:39:28 crc kubenswrapper[4702]: > Dec 03 12:39:28 crc kubenswrapper[4702]: I1203 12:39:28.144342 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-m6cqt" podUID="0f2dd872-6ac4-4527-9a91-218b1de5ed5e" containerName="registry-server" probeResult="failure" output=< Dec 03 12:39:28 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:39:28 crc kubenswrapper[4702]: > Dec 03 12:39:28 crc kubenswrapper[4702]: I1203 12:39:28.145152 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-7ntqc" podUID="38c7c63b-db59-4055-aee0-99ea082bd8f7" containerName="registry-server" probeResult="failure" output=< Dec 03 12:39:28 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:39:28 crc kubenswrapper[4702]: > Dec 03 12:39:28 crc kubenswrapper[4702]: I1203 12:39:28.148863 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-7ntqc" podUID="38c7c63b-db59-4055-aee0-99ea082bd8f7" containerName="registry-server" probeResult="failure" output=< Dec 03 12:39:28 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:39:28 crc kubenswrapper[4702]: > Dec 03 12:39:28 crc kubenswrapper[4702]: I1203 12:39:28.340877 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-pscld" podUID="042c7f5b-da64-4f42-a2b2-58d04b73c12a" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:39:28 crc kubenswrapper[4702]: I1203 12:39:28.665356 4702 patch_prober.go:28] interesting pod/thanos-querier-58bc8556f9-kwpsl container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.74:9091/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:28 crc kubenswrapper[4702]: I1203 12:39:28.665443 4702 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" podUID="b2347d45-1235-4ed0-9f48-6e0eb4c781f8" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.74:9091/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:28 crc kubenswrapper[4702]: I1203 12:39:28.942789 4702 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fv5c5 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Dec 03 12:39:28 crc kubenswrapper[4702]: I1203 12:39:28.943047 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" podUID="a456460b-47a3-48ef-a98b-4f67709d5939" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Dec 03 12:39:28 crc kubenswrapper[4702]: E1203 12:39:28.943945 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 95c785befd846128dc84efe58e8b50215dbf5d7e298f228a2d566c89f10ec038 is running failed: container process not found" containerID="95c785befd846128dc84efe58e8b50215dbf5d7e298f228a2d566c89f10ec038" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 03 12:39:28 crc kubenswrapper[4702]: E1203 12:39:28.944555 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 95c785befd846128dc84efe58e8b50215dbf5d7e298f228a2d566c89f10ec038 is running failed: container process not found" containerID="95c785befd846128dc84efe58e8b50215dbf5d7e298f228a2d566c89f10ec038" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 03 12:39:28 crc kubenswrapper[4702]: E1203 12:39:28.945150 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 95c785befd846128dc84efe58e8b50215dbf5d7e298f228a2d566c89f10ec038 is running failed: container process not found" containerID="95c785befd846128dc84efe58e8b50215dbf5d7e298f228a2d566c89f10ec038" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 03 12:39:28 crc kubenswrapper[4702]: E1203 12:39:28.945198 4702 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 95c785befd846128dc84efe58e8b50215dbf5d7e298f228a2d566c89f10ec038 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9" containerName="ovn-northd" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.059804 4702 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.059878 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" 
output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.213063 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg" podUID="8f6320ff-4661-46be-80e1-8d97f09fe789" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.100:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.213678 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.255073 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-htxmz" podUID="530ef793-9485-4c45-86ba-531906f2085a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.102:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.338011 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w2vmt" podUID="182ca1cb-9499-4cf7-aeae-c35c7038814c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.101:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.380989 4702 patch_prober.go:28] interesting pod/route-controller-manager-84cf75c7c5-96cmd container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.380995 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92" podUID="7d79c3e6-cab8-4e77-84da-85ae2a1eb3b3" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.99:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.381128 4702 patch_prober.go:28] interesting pod/route-controller-manager-84cf75c7c5-96cmd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.381183 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" podUID="6e99cffd-b82e-46c9-8cbd-fe8c24507385" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.91:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.381066 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" podUID="e3e08d4a-20c7-430a-a3d9-988d64e6a6b4" 
containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.381253 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.381257 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" podUID="e3e08d4a-20c7-430a-a3d9-988d64e6a6b4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.381291 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.381422 4702 patch_prober.go:28] interesting pod/controller-manager-548478b8dd-9254p container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.381450 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" podUID="fecafd1a-bd80-46ea-8839-dbfc2d364a96" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.381489 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.381644 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-pscld" podUID="042c7f5b-da64-4f42-a2b2-58d04b73c12a" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.382675 4702 patch_prober.go:28] interesting pod/controller-manager-548478b8dd-9254p container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.382715 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" podUID="fecafd1a-bd80-46ea-8839-dbfc2d364a96" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.383040 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="controller-manager" 
containerStatusID={"Type":"cri-o","ID":"97a0cf5f74bb303168d42c86dbc2d9abc4f6775d6c7ffd7df4510df0fb0a6fc2"} pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" containerMessage="Container controller-manager failed liveness probe, will be restarted" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.383420 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" podUID="fecafd1a-bd80-46ea-8839-dbfc2d364a96" containerName="controller-manager" containerID="cri-o://97a0cf5f74bb303168d42c86dbc2d9abc4f6775d6c7ffd7df4510df0fb0a6fc2" gracePeriod=30 Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.383194 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="route-controller-manager" containerStatusID={"Type":"cri-o","ID":"4263520eca4d7e37a135b540dd1b3cfa0adf6e70430cc456be62501d5508459b"} pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" containerMessage="Container route-controller-manager failed liveness probe, will be restarted" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.383595 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" podUID="e3e08d4a-20c7-430a-a3d9-988d64e6a6b4" containerName="route-controller-manager" containerID="cri-o://4263520eca4d7e37a135b540dd1b3cfa0adf6e70430cc456be62501d5508459b" gracePeriod=30 Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.504204 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c" podUID="62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.103:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.530125 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="7ad9fb7a-5481-4bb8-9a9b-99fda2021704" containerName="prometheus" probeResult="failure" output="command timed out" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.591104 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" podUID="2cb93136-1d69-4bc8-9c42-aee1f6638aa6" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.93:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.591261 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.675026 4702 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-79n82 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.63:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.675047 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" podUID="2cb93136-1d69-4bc8-9c42-aee1f6638aa6" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.93:7472/metrics\": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.675354 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-79n82" podUID="f2c1609d-33a3-444f-9370-24495b15b3e0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.63:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.675424 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.757154 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw" podUID="1a7e4f08-8a48-44d5-944b-4eaf9d9518b5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.757351 4702 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-79n82 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.63:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.757396 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-79n82" podUID="f2c1609d-33a3-444f-9370-24495b15b3e0" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.63:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.798133 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7" podUID="4b90477f-d1b5-4f03-ab08-2476d44a9cff" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.881148 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t" podUID="9b295e92-630f-4544-b741-50ece5e79f4c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.881554 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr" podUID="224e5de0-3f58-4243-80e5-212cf016ea46" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.922050 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" podUID="b877c7a7-0b88-4238-8a21-314ef1525996" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.922499 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.963562 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-5655c58dd6-l8p84" podUID="40ccf765-6eb2-49e3-8f2c-635b1981639e" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.44:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.963552 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-hxvr6" podUID="e3c1b694-60b8-4b5d-b8d5-40418e60aa4b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:29 crc kubenswrapper[4702]: I1203 12:39:29.963733 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-hxvr6" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.187000 4702 patch_prober.go:28] interesting pod/observability-operator-d8bb48f5d-55ql9 container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.6:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.187020 4702 patch_prober.go:28] interesting pod/oauth-openshift-6548f7c795-c9kwd container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.62:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.187070 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-d8bb48f5d-55ql9" podUID="89d80ae9-23a4-4c91-a04d-7343d8a4df05" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.6:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.187090 4702 patch_prober.go:28] interesting pod/oauth-openshift-6548f7c795-c9kwd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.62:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.187101 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" podUID="0c75375b-08b0-4a81-adca-de576c8ff268" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.62:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.187126 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" podUID="0c75375b-08b0-4a81-adca-de576c8ff268" containerName="oauth-openshift" probeResult="failure" 
output="Get \"https://10.217.0.62:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.187212 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.187245 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.187302 4702 patch_prober.go:28] interesting pod/observability-operator-d8bb48f5d-55ql9 container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.6:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.187329 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-d8bb48f5d-55ql9" podUID="89d80ae9-23a4-4c91-a04d-7343d8a4df05" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.6:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.188314 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="oauth-openshift" containerStatusID={"Type":"cri-o","ID":"ed23f0bb952245c24c0ac2a111b381a15a1284a0fcaa9a57735a4ef6e66d92d0"} pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" containerMessage="Container oauth-openshift failed liveness probe, will be restarted" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.269258 4702 patch_prober.go:28] interesting pod/perses-operator-5446b9c989-vpr8z container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.14:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.269340 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" podUID="ec0726c3-58ef-4a22-8e00-bae32d7d66ca" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.14:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.269486 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.310105 4702 patch_prober.go:28] interesting pod/perses-operator-5446b9c989-vpr8z container/perses-operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.14:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.310198 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" podUID="ec0726c3-58ef-4a22-8e00-bae32d7d66ca" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.14:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.310280 4702 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.310114 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg" podUID="8f6320ff-4661-46be-80e1-8d97f09fe789" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.100:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.415314 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="perses-operator" containerStatusID={"Type":"cri-o","ID":"57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca"} pod="openshift-operators/perses-operator-5446b9c989-vpr8z" containerMessage="Container perses-operator failed liveness probe, will be restarted" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.415451 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" podUID="ec0726c3-58ef-4a22-8e00-bae32d7d66ca" containerName="perses-operator" containerID="cri-o://57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca" gracePeriod=30 Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.416215 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="webhook-server" containerStatusID={"Type":"cri-o","ID":"958529641adcb16dde835961452da3fb825961812b39cf59f9126e93469f5643"} pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" containerMessage="Container webhook-server failed liveness probe, will be restarted" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.416249 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" podUID="2cb93136-1d69-4bc8-9c42-aee1f6638aa6" containerName="webhook-server" containerID="cri-o://958529641adcb16dde835961452da3fb825961812b39cf59f9126e93469f5643" gracePeriod=2 Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.483016 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92" podUID="7d79c3e6-cab8-4e77-84da-85ae2a1eb3b3" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.99:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.483073 4702 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.483153 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.630041 4702 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t" podUID="523c06cc-9816-4252-ac00-dc7928dae009" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.630041 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" podUID="1d60d4ab-7bac-4fd1-9aad-c07ba1513d41" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.630351 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.713010 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp" podUID="b6faaca6-f017-42ac-95e4-d73ae3e8e519" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.713341 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-m2bfb" podUID="5e7b4134-2b34-4b36-ad61-8e681df197df" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.754073 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds" podUID="8de75640-5551-4d04-830d-64f0fbb7847a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.754065 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" podUID="2cb93136-1d69-4bc8-9c42-aee1f6638aa6" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.93:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.804986 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf" podUID="84fc908a-9418-4e6e-ac17-9e725524f9ce" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.805170 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf" Dec 03 12:39:30 crc kubenswrapper[4702]: I1203 12:39:30.896052 4702 prober.go:107] "Probe failed" probeType="Startup" pod="metallb-system/frr-k8s-wh75l" podUID="7643d370-6497-4a94-b0e7-2db66b56b687" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers)" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.007967 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n" podUID="5edf270b-74cb-42d2-82dc-7953f243c6dc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.008160 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.009348 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" podUID="b877c7a7-0b88-4238-8a21-314ef1525996" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.050987 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-hxvr6" podUID="e3c1b694-60b8-4b5d-b8d5-40418e60aa4b" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:31 crc kubenswrapper[4702]: E1203 12:39:31.119063 4702 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.133010 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-pscld" podUID="042c7f5b-da64-4f42-a2b2-58d04b73c12a" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.133000 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-f8648f98b-2npsf" podUID="d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.95:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.133205 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-2npsf" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.174230 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-f8648f98b-2npsf" podUID="d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.95:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.174257 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn" podUID="afc37ae6-c944-4cb1-81b6-c810ea1c3b31" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.174343 4702 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/controller-f8648f98b-2npsf" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.174443 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.215163 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq" podUID="ae6dac10-29ba-4bb8-8a0c-68a2bad519af" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.215240 4702 patch_prober.go:28] interesting pod/oauth-openshift-6548f7c795-c9kwd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.62:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.215329 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.215327 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" podUID="0c75375b-08b0-4a81-adca-de576c8ff268" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.62:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.311058 4702 patch_prober.go:28] interesting pod/perses-operator-5446b9c989-vpr8z container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.14:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.311135 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" podUID="ec0726c3-58ef-4a22-8e00-bae32d7d66ca" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.14:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.335719 4702 patch_prober.go:28] interesting pod/logging-loki-distributor-76cc67bf56-xrnp2 container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.335991 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2" podUID="508c1eef-dbbc-4c32-8d2e-dbb797c72461" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.391105 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="30059ea4-152f-420c-b8cc-234ebab96b47" 
containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.251:8080/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.391212 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="30059ea4-152f-420c-b8cc-234ebab96b47" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.251:8081/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.454072 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" event={"ID":"672e4a37-26c7-4378-a524-57fba88aec53","Type":"ContainerStarted","Data":"146610c584f2d67ef0b1ecb7329e696a48ac2af618bf0263b4f3f525be09f808"} Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.456217 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="controller" containerStatusID={"Type":"cri-o","ID":"e1bc297d19a6929404aa86b33b8e2ccef3114dd77f28c5099de3bc066165fec0"} pod="metallb-system/controller-f8648f98b-2npsf" containerMessage="Container controller failed liveness probe, will be restarted" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.456302 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/controller-f8648f98b-2npsf" podUID="d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c" containerName="controller" containerID="cri-o://e1bc297d19a6929404aa86b33b8e2ccef3114dd77f28c5099de3bc066165fec0" gracePeriod=2 Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.526713 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="7ad9fb7a-5481-4bb8-9a9b-99fda2021704" containerName="prometheus" probeResult="failure" output="command timed out" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.575456 4702 patch_prober.go:28] interesting pod/logging-loki-querier-5895d59bb8-bhqrp container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.575528 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp" podUID="042cc406-7960-493a-a19a-cb5590f8ff1f" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.54:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.672007 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" podUID="1d60d4ab-7bac-4fd1-9aad-c07ba1513d41" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.703174 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="90e9786f-3e0d-4a23-b624-b49a3d386784" containerName="prometheus" probeResult="failure" output="Get 
\"https://10.217.0.170:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.703401 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="90e9786f-3e0d-4a23-b624-b49a3d386784" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.170:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.797030 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" podUID="2cb93136-1d69-4bc8-9c42-aee1f6638aa6" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.93:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.804835 4702 patch_prober.go:28] interesting pod/logging-loki-query-frontend-84558f7c9f-dflgw container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.804930 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw" podUID="dbea18cb-2e45-4d86-bc00-17a82f0a78ff" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.55:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.847990 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf" podUID="84fc908a-9418-4e6e-ac17-9e725524f9ce" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.940910 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.943067 4702 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fv5c5 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.943139 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" podUID="a456460b-47a3-48ef-a98b-4f67709d5939" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.963980 4702 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": context deadline 
exceeded" start-of-body= Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.964063 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": context deadline exceeded" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.998311 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-68ktn container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.998407 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" podUID="6cbbba51-9166-42cb-917c-7c634351e5c9" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.998485 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-68ktn container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:31 crc kubenswrapper[4702]: I1203 12:39:31.998574 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" podUID="6cbbba51-9166-42cb-917c-7c634351e5c9" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.016736 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-mdlcz container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.016820 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" podUID="72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.016959 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-mdlcz container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.017024 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" podUID="72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/ready\": net/http: request canceled while 
waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.174128 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-f8648f98b-2npsf" podUID="d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.95:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.258088 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq" podUID="ae6dac10-29ba-4bb8-8a0c-68a2bad519af" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.258111 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn" podUID="afc37ae6-c944-4cb1-81b6-c810ea1c3b31" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.444040 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-76f68f8c78-w8n8j" podUID="36f29f89-b01b-4656-ba86-a2f731d0c1e0" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.201:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.444054 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-76f68f8c78-w8n8j" podUID="36f29f89-b01b-4656-ba86-a2f731d0c1e0" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.201:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.466738 4702 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.466807 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.469504 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv" event={"ID":"5cecb29f-7ef9-4177-8e01-a776b70bbb03","Type":"ContainerDied","Data":"d4e57b0ce090fa77b9b12fade4f80ac494f3f9b623fb0b097b162890b3d80d10"} Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.469906 4702 generic.go:334] "Generic (PLEG): container finished" podID="5cecb29f-7ef9-4177-8e01-a776b70bbb03" containerID="d4e57b0ce090fa77b9b12fade4f80ac494f3f9b623fb0b097b162890b3d80d10" exitCode=1 Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 
12:39:32.470387 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.471629 4702 scope.go:117] "RemoveContainer" containerID="d4e57b0ce090fa77b9b12fade4f80ac494f3f9b623fb0b097b162890b3d80d10" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.484953 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-76f68f8c78-w8n8j" podUID="36f29f89-b01b-4656-ba86-a2f731d0c1e0" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.201:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.485027 4702 patch_prober.go:28] interesting pod/perses-operator-5446b9c989-vpr8z container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.14:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.485101 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-76f68f8c78-w8n8j" podUID="36f29f89-b01b-4656-ba86-a2f731d0c1e0" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.201:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.485110 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" podUID="ec0726c3-58ef-4a22-8e00-bae32d7d66ca" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.14:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.527638 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="92266ac3-f0a6-4e68-9e88-9aa2900e1fe3" containerName="galera" probeResult="failure" output="command timed out" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.527770 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="7ad9fb7a-5481-4bb8-9a9b-99fda2021704" containerName="prometheus" probeResult="failure" output="command timed out" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.760095 4702 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.760176 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.854251 4702 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.75:3101/ready\": net/http: 
request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.854346 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="6ce47478-68cc-46a9-99c3-cb20947e63c5" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.75:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.854509 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-bv8pf" podUID="4be204bf-b480-4d77-9ced-34c6668afa14" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.854540 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.854624 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/speaker-bv8pf" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.854917 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-bv8pf" podUID="4be204bf-b480-4d77-9ced-34c6668afa14" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.855231 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-bv8pf" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.857219 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="speaker" containerStatusID={"Type":"cri-o","ID":"9b6f385cb92b755ea3c8cd87f0ee9d9d2ee381047ee19125e24be8ca6e76a3d6"} pod="metallb-system/speaker-bv8pf" containerMessage="Container speaker failed liveness probe, will be restarted" Dec 03 12:39:32 crc kubenswrapper[4702]: I1203 12:39:32.857336 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/speaker-bv8pf" podUID="4be204bf-b480-4d77-9ced-34c6668afa14" containerName="speaker" containerID="cri-o://9b6f385cb92b755ea3c8cd87f0ee9d9d2ee381047ee19125e24be8ca6e76a3d6" gracePeriod=2 Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.069814 4702 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.80:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.070792 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="358cd791-ccf7-4655-b446-b800598e773c" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.80:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.070900 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.084807 4702 patch_prober.go:28] interesting 
pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.81:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.084895 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="07e2709a-6aac-4b21-8fc8-bfc21992aae3" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.81:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.085009 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.131049 4702 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-q4rbl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.131360 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" podUID="44af00fd-b9f6-4e74-ad67-581e4ca7527c" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.131066 4702 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-q4rbl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.131467 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.131508 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" podUID="44af00fd-b9f6-4e74-ad67-581e4ca7527c" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.131581 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.132516 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="prometheus-operator-admission-webhook" containerStatusID={"Type":"cri-o","ID":"8c83b33dbc7dd119d0b2ca7b2933d6a165caba4682749655a9e5a2534e77d2e0"} pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" containerMessage="Container prometheus-operator-admission-webhook 
failed liveness probe, will be restarted" Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.132557 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" podUID="44af00fd-b9f6-4e74-ad67-581e4ca7527c" containerName="prometheus-operator-admission-webhook" containerID="cri-o://8c83b33dbc7dd119d0b2ca7b2933d6a165caba4682749655a9e5a2534e77d2e0" gracePeriod=30 Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.215037 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-f8648f98b-2npsf" podUID="d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.95:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.483955 4702 generic.go:334] "Generic (PLEG): container finished" podID="2cb93136-1d69-4bc8-9c42-aee1f6638aa6" containerID="958529641adcb16dde835961452da3fb825961812b39cf59f9126e93469f5643" exitCode=137 Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.484047 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" event={"ID":"2cb93136-1d69-4bc8-9c42-aee1f6638aa6","Type":"ContainerDied","Data":"958529641adcb16dde835961452da3fb825961812b39cf59f9126e93469f5643"} Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.487657 4702 generic.go:334] "Generic (PLEG): container finished" podID="1d86df9d-86a7-4980-abd0-488d98f6b2fb" containerID="ff2e288d038ac5bd4e6dbbd98fd8278b38b12e1e35384bae5a54dcddc68654c1" exitCode=1 Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.487721 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" event={"ID":"1d86df9d-86a7-4980-abd0-488d98f6b2fb","Type":"ContainerDied","Data":"ff2e288d038ac5bd4e6dbbd98fd8278b38b12e1e35384bae5a54dcddc68654c1"} Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.489110 4702 scope.go:117] "RemoveContainer" containerID="ff2e288d038ac5bd4e6dbbd98fd8278b38b12e1e35384bae5a54dcddc68654c1" Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.492623 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9/ovn-northd/0.log" Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.492814 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9","Type":"ContainerStarted","Data":"88eabf5215338ea8ea6833c7051368364576ed69deda9ce9ded5c1c71e754ea8"} Dec 03 12:39:33 crc kubenswrapper[4702]: E1203 12:39:33.493784 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 95c785befd846128dc84efe58e8b50215dbf5d7e298f228a2d566c89f10ec038 is running failed: container process not found" containerID="95c785befd846128dc84efe58e8b50215dbf5d7e298f228a2d566c89f10ec038" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 03 12:39:33 crc kubenswrapper[4702]: E1203 12:39:33.494573 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 95c785befd846128dc84efe58e8b50215dbf5d7e298f228a2d566c89f10ec038 is running failed: container process not found" 
containerID="95c785befd846128dc84efe58e8b50215dbf5d7e298f228a2d566c89f10ec038" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 03 12:39:33 crc kubenswrapper[4702]: E1203 12:39:33.494869 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 95c785befd846128dc84efe58e8b50215dbf5d7e298f228a2d566c89f10ec038 is running failed: container process not found" containerID="95c785befd846128dc84efe58e8b50215dbf5d7e298f228a2d566c89f10ec038" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 03 12:39:33 crc kubenswrapper[4702]: E1203 12:39:33.494909 4702 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 95c785befd846128dc84efe58e8b50215dbf5d7e298f228a2d566c89f10ec038 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9" containerName="ovn-northd" Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.529296 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="c91e1dc8-ef80-407f-ac34-4c9ab29026f7" containerName="galera" probeResult="failure" output="command timed out" Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.638857 4702 patch_prober.go:28] interesting pod/thanos-querier-58bc8556f9-kwpsl container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.74:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.639007 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-58bc8556f9-kwpsl" podUID="b2347d45-1235-4ed0-9f48-6e0eb4c781f8" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.74:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.747079 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-pscld" podUID="042c7f5b-da64-4f42-a2b2-58d04b73c12a" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.747250 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="hostpath-provisioner/csi-hostpathplugin-pscld" Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.748839 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="hostpath-provisioner" containerStatusID={"Type":"cri-o","ID":"7d68b0986741379ab8deb19834da932ca36898d987fd53602fe7466e084f46aa"} pod="hostpath-provisioner/csi-hostpathplugin-pscld" containerMessage="Container hostpath-provisioner failed liveness probe, will be restarted" Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.749010 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="hostpath-provisioner/csi-hostpathplugin-pscld" podUID="042c7f5b-da64-4f42-a2b2-58d04b73c12a" containerName="hostpath-provisioner" containerID="cri-o://7d68b0986741379ab8deb19834da932ca36898d987fd53602fe7466e084f46aa" gracePeriod=30 Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.856378 4702 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester 
namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.75:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.856471 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="6ce47478-68cc-46a9-99c3-cb20947e63c5" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.75:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.942017 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.982002 4702 patch_prober.go:28] interesting pod/apiserver-76f77b778f-5xk7q container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:33 crc kubenswrapper[4702]: I1203 12:39:33.982085 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" podUID="adc43b14-86cb-4ff5-b7fb-a9ba32cde631" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz?exclude=etcd&exclude=etcd-readiness\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.026674 4702 patch_prober.go:28] interesting pod/nmstate-webhook-5f6d4c5ccb-r6jd6 container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.87:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.026769 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6" podUID="9370f81f-7868-4a16-9cec-7786257cdcbd" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.87:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.026867 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6" Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.072833 4702 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.80:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.072975 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="358cd791-ccf7-4655-b446-b800598e773c" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.80:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.086740 4702 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness 
probe status=failure output="Get \"https://10.217.0.81:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.086851 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="07e2709a-6aac-4b21-8fc8-bfc21992aae3" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.81:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.132694 4702 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-q4rbl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.132796 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" podUID="44af00fd-b9f6-4e74-ad67-581e4ca7527c" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.514288 4702 generic.go:334] "Generic (PLEG): container finished" podID="d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c" containerID="e1bc297d19a6929404aa86b33b8e2ccef3114dd77f28c5099de3bc066165fec0" exitCode=0 Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.514366 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-2npsf" event={"ID":"d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c","Type":"ContainerDied","Data":"e1bc297d19a6929404aa86b33b8e2ccef3114dd77f28c5099de3bc066165fec0"} Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.518633 4702 generic.go:334] "Generic (PLEG): container finished" podID="6e99cffd-b82e-46c9-8cbd-fe8c24507385" containerID="1a4aee4e3ee6bb158f231c543ab42ab038db6c887dc5b836c478eebbc236650b" exitCode=1 Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.518701 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" event={"ID":"6e99cffd-b82e-46c9-8cbd-fe8c24507385","Type":"ContainerDied","Data":"1a4aee4e3ee6bb158f231c543ab42ab038db6c887dc5b836c478eebbc236650b"} Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.519907 4702 scope.go:117] "RemoveContainer" containerID="1a4aee4e3ee6bb158f231c543ab42ab038db6c887dc5b836c478eebbc236650b" Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.522342 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv" event={"ID":"5cecb29f-7ef9-4177-8e01-a776b70bbb03","Type":"ContainerStarted","Data":"5133547a1d183c9316a16f1f263d00bdf3f5b802ed5ba1f6f360282660d59d10"} Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.523075 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv" Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.548968 4702 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/openstack-operator-index-ds4ss" podUID="9432a2a8-8932-4734-a69d-8976764f1dab" containerName="registry-server" probeResult="failure" output="command timed out" Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.549090 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-ds4ss" Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.549595 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-zxwqw" podUID="9ec2138c-eb31-401f-b62d-d2823fe0523f" containerName="nmstate-handler" probeResult="failure" output="command timed out" Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.549946 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-ds4ss" podUID="9432a2a8-8932-4734-a69d-8976764f1dab" containerName="registry-server" probeResult="failure" output="command timed out" Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.549977 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/openstack-operator-index-ds4ss" Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.550400 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"c61463db247d9a3ed644ddbb258dd0073cef8188651aafd572c73a0ead900b89"} pod="openstack-operators/openstack-operator-index-ds4ss" containerMessage="Container registry-server failed liveness probe, will be restarted" Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.550438 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-ds4ss" podUID="9432a2a8-8932-4734-a69d-8976764f1dab" containerName="registry-server" containerID="cri-o://c61463db247d9a3ed644ddbb258dd0073cef8188651aafd572c73a0ead900b89" gracePeriod=30 Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.550585 4702 generic.go:334] "Generic (PLEG): container finished" podID="a7faac4b-b558-4106-af27-4daf6a1db1af" containerID="2592b010473ade5473c405f0122b4c1439f84cc64cba400976159766ced15210" exitCode=1 Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.550627 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" event={"ID":"a7faac4b-b558-4106-af27-4daf6a1db1af","Type":"ContainerDied","Data":"2592b010473ade5473c405f0122b4c1439f84cc64cba400976159766ced15210"} Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.553729 4702 scope.go:117] "RemoveContainer" containerID="2592b010473ade5473c405f0122b4c1439f84cc64cba400976159766ced15210" Dec 03 12:39:34 crc kubenswrapper[4702]: E1203 12:39:34.554538 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c61463db247d9a3ed644ddbb258dd0073cef8188651aafd572c73a0ead900b89" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 12:39:34 crc kubenswrapper[4702]: E1203 12:39:34.559194 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c61463db247d9a3ed644ddbb258dd0073cef8188651aafd572c73a0ead900b89" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.561199 
4702 generic.go:334] "Generic (PLEG): container finished" podID="5e7b4134-2b34-4b36-ad61-8e681df197df" containerID="e5d93f7c2aeae05d5b5a0857770119288cc89f383174e584fbe8093a8a53fb6f" exitCode=1 Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.562069 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-m2bfb" event={"ID":"5e7b4134-2b34-4b36-ad61-8e681df197df","Type":"ContainerDied","Data":"e5d93f7c2aeae05d5b5a0857770119288cc89f383174e584fbe8093a8a53fb6f"} Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.562957 4702 scope.go:117] "RemoveContainer" containerID="e5d93f7c2aeae05d5b5a0857770119288cc89f383174e584fbe8093a8a53fb6f" Dec 03 12:39:34 crc kubenswrapper[4702]: E1203 12:39:34.569900 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c61463db247d9a3ed644ddbb258dd0073cef8188651aafd572c73a0ead900b89" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 12:39:34 crc kubenswrapper[4702]: E1203 12:39:34.569993 4702 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-operators/openstack-operator-index-ds4ss" podUID="9432a2a8-8932-4734-a69d-8976764f1dab" containerName="registry-server" Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.590834 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.590898 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.663043 4702 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-nx9wv container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.663102 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nx9wv" podUID="d7be37fc-7374-46ae-a0e3-1cafab3430ec" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.920956 4702 patch_prober.go:28] interesting pod/apiserver-76f77b778f-5xk7q container/openshift-apiserver namespace/openshift-apiserver: Liveness probe status=failure output="Get \"https://10.217.0.7:8443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.921035 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-apiserver/apiserver-76f77b778f-5xk7q" podUID="adc43b14-86cb-4ff5-b7fb-a9ba32cde631" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.944750 4702 
patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fv5c5 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Dec 03 12:39:34 crc kubenswrapper[4702]: I1203 12:39:34.944836 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" podUID="a456460b-47a3-48ef-a98b-4f67709d5939" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Dec 03 12:39:35 crc kubenswrapper[4702]: I1203 12:39:35.028175 4702 patch_prober.go:28] interesting pod/nmstate-webhook-5f6d4c5ccb-r6jd6 container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.87:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:35 crc kubenswrapper[4702]: I1203 12:39:35.028249 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6" podUID="9370f81f-7868-4a16-9cec-7786257cdcbd" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.87:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:35 crc kubenswrapper[4702]: I1203 12:39:35.059849 4702 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:36934->192.168.126.11:10257: read: connection reset by peer" start-of-body= Dec 03 12:39:35 crc kubenswrapper[4702]: I1203 12:39:35.060229 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": read tcp 192.168.126.11:36934->192.168.126.11:10257: read: connection reset by peer" Dec 03 12:39:35 crc kubenswrapper[4702]: I1203 12:39:35.060286 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 12:39:35 crc kubenswrapper[4702]: I1203 12:39:35.061349 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"340fda11da7dc96a53d9de3d8081f19331c382e2d43554b23914b4431b55cf34"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed liveness probe, will be restarted" Dec 03 12:39:35 crc kubenswrapper[4702]: I1203 12:39:35.061456 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://340fda11da7dc96a53d9de3d8081f19331c382e2d43554b23914b4431b55cf34" gracePeriod=30 Dec 03 12:39:35 crc kubenswrapper[4702]: I1203 12:39:35.529046 4702 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-marketplace/redhat-marketplace-6sd9l" podUID="ea8c3262-d494-4427-8228-df9584c00ca1" containerName="registry-server" probeResult="failure" output="command timed out" Dec 03 12:39:35 crc kubenswrapper[4702]: I1203 12:39:35.529136 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6sd9l" Dec 03 12:39:35 crc kubenswrapper[4702]: I1203 12:39:35.530367 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-6sd9l" podUID="ea8c3262-d494-4427-8228-df9584c00ca1" containerName="registry-server" probeResult="failure" output="command timed out" Dec 03 12:39:35 crc kubenswrapper[4702]: I1203 12:39:35.530455 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6sd9l" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.530553 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/heat-engine-79b68c69ff-kvztw" podUID="a30c0f33-d7bc-456c-be27-26e860ca8f28" containerName="heat-engine" probeResult="failure" output="command timed out" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.530605 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-engine-79b68c69ff-kvztw" podUID="a30c0f33-d7bc-456c-be27-26e860ca8f28" containerName="heat-engine" probeResult="failure" output="command timed out" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.576844 4702 generic.go:334] "Generic (PLEG): container finished" podID="4be204bf-b480-4d77-9ced-34c6668afa14" containerID="9b6f385cb92b755ea3c8cd87f0ee9d9d2ee381047ee19125e24be8ca6e76a3d6" exitCode=137 Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.576940 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bv8pf" event={"ID":"4be204bf-b480-4d77-9ced-34c6668afa14","Type":"ContainerDied","Data":"9b6f385cb92b755ea3c8cd87f0ee9d9d2ee381047ee19125e24be8ca6e76a3d6"} Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.581935 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" event={"ID":"2cb93136-1d69-4bc8-9c42-aee1f6638aa6","Type":"ContainerStarted","Data":"57da388b3f845caa3a423e970bc7fe3a15c0445d5192a5758c5f92e543c8396c"} Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.582021 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.584787 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" event={"ID":"1d86df9d-86a7-4980-abd0-488d98f6b2fb","Type":"ContainerStarted","Data":"8145cadc277f8fc425f78a81d96336891e2c018553e95043c5e1124ffa4fd2a0"} Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.585481 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.588230 4702 generic.go:334] "Generic (PLEG): container finished" podID="9b295e92-630f-4544-b741-50ece5e79f4c" containerID="f87a5729b99dd275d68e1d993ff34e33b89e55c2796b91208a6d8802a0d3eae1" exitCode=1 Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.588257 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t" event={"ID":"9b295e92-630f-4544-b741-50ece5e79f4c","Type":"ContainerDied","Data":"f87a5729b99dd275d68e1d993ff34e33b89e55c2796b91208a6d8802a0d3eae1"} Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.588786 4702 scope.go:117] "RemoveContainer" containerID="f87a5729b99dd275d68e1d993ff34e33b89e55c2796b91208a6d8802a0d3eae1" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.591348 4702 generic.go:334] "Generic (PLEG): container finished" podID="5edf270b-74cb-42d2-82dc-7953f243c6dc" containerID="7efca528e823b01b4fd40beeaaf722a407970beb3251c2dd80e2cd6d65ee167a" exitCode=1 Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.591403 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n" event={"ID":"5edf270b-74cb-42d2-82dc-7953f243c6dc","Type":"ContainerDied","Data":"7efca528e823b01b4fd40beeaaf722a407970beb3251c2dd80e2cd6d65ee167a"} Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.592292 4702 scope.go:117] "RemoveContainer" containerID="7efca528e823b01b4fd40beeaaf722a407970beb3251c2dd80e2cd6d65ee167a" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.594431 4702 generic.go:334] "Generic (PLEG): container finished" podID="224e5de0-3f58-4243-80e5-212cf016ea46" containerID="179dbf9944d12c5daada8458b3a3d69ede690ca68d5ca87448c664ec2c4efe99" exitCode=1 Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.594497 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr" event={"ID":"224e5de0-3f58-4243-80e5-212cf016ea46","Type":"ContainerDied","Data":"179dbf9944d12c5daada8458b3a3d69ede690ca68d5ca87448c664ec2c4efe99"} Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.596874 4702 generic.go:334] "Generic (PLEG): container finished" podID="4b90477f-d1b5-4f03-ab08-2476d44a9cff" containerID="5be622def27ef22f65bbc45ceb45e75aff6e6caa3a727b8330e5e381c345f18f" exitCode=1 Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.596925 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7" event={"ID":"4b90477f-d1b5-4f03-ab08-2476d44a9cff","Type":"ContainerDied","Data":"5be622def27ef22f65bbc45ceb45e75aff6e6caa3a727b8330e5e381c345f18f"} Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.600905 4702 generic.go:334] "Generic (PLEG): container finished" podID="afc37ae6-c944-4cb1-81b6-c810ea1c3b31" containerID="a5f052b45bce1deebc15a1f99f27c9cd83479bc87073cdc3ea74ce69db473dbc" exitCode=1 Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.600997 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn" event={"ID":"afc37ae6-c944-4cb1-81b6-c810ea1c3b31","Type":"ContainerDied","Data":"a5f052b45bce1deebc15a1f99f27c9cd83479bc87073cdc3ea74ce69db473dbc"} Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.602098 4702 scope.go:117] "RemoveContainer" containerID="a5f052b45bce1deebc15a1f99f27c9cd83479bc87073cdc3ea74ce69db473dbc" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.650178 4702 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 03 
12:39:36 crc kubenswrapper[4702]: I1203 12:39:35.650587 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.131383 4702 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fvzsf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.131464 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" podUID="3895be1f-db04-45b3-bd8c-cf2ab8c2aa43" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.131571 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.132075 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"c5520925fdcb74e51555de4760801d54227957a1ec428e730058e63e89436c28"} pod="openshift-marketplace/redhat-marketplace-6sd9l" containerMessage="Container registry-server failed liveness probe, will be restarted" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.132176 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6sd9l" podUID="ea8c3262-d494-4427-8228-df9584c00ca1" containerName="registry-server" containerID="cri-o://c5520925fdcb74e51555de4760801d54227957a1ec428e730058e63e89436c28" gracePeriod=30 Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.135026 4702 scope.go:117] "RemoveContainer" containerID="5be622def27ef22f65bbc45ceb45e75aff6e6caa3a727b8330e5e381c345f18f" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.135583 4702 scope.go:117] "RemoveContainer" containerID="179dbf9944d12c5daada8458b3a3d69ede690ca68d5ca87448c664ec2c4efe99" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.215075 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.215159 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.215266 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-hndf6" Dec 03 12:39:36 crc 
kubenswrapper[4702]: I1203 12:39:36.215775 4702 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-v6p66 container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.215838 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" podUID="5bad766d-e524-4670-b353-56e92df2f744" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.215889 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.217799 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="authentication-operator" containerStatusID={"Type":"cri-o","ID":"d3538527ec1c0eff1cf367faea77ee3efdefc14633dab42c7a824a6b5e81c792"} pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" containerMessage="Container authentication-operator failed liveness probe, will be restarted" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.217848 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" podUID="5bad766d-e524-4670-b353-56e92df2f744" containerName="authentication-operator" containerID="cri-o://d3538527ec1c0eff1cf367faea77ee3efdefc14633dab42c7a824a6b5e81c792" gracePeriod=30 Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.258937 4702 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fvzsf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.258997 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" podUID="3895be1f-db04-45b3-bd8c-cf2ab8c2aa43" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.259008 4702 prober.go:107] "Probe failed" probeType="Startup" pod="metallb-system/frr-k8s-wh75l" podUID="7643d370-6497-4a94-b0e7-2db66b56b687" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.259052 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.377985 4702 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe 
status=failure output="Get \"https://192.168.126.11:10259/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.378001 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.378051 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.378096 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.378109 4702 patch_prober.go:28] interesting pod/console-operator-58897d9998-nnrsp container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.378143 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.378068 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.378177 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" podUID="c6edf33c-728d-482f-ad5c-ceb85dae3b75" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.378225 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.378204 4702 patch_prober.go:28] interesting pod/console-operator-58897d9998-nnrsp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 
12:39:36.378281 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" podUID="c6edf33c-728d-482f-ad5c-ceb85dae3b75" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.378299 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.378325 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-hndf6" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.378340 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.378373 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.378240 4702 patch_prober.go:28] interesting pod/loki-operator-controller-manager-5688675f7c-q6w79 container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.48:8081/readyz\": dial tcp 10.217.0.48:8081: connect: connection refused" start-of-body= Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.378952 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" podUID="672e4a37-26c7-4378-a524-57fba88aec53" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.48:8081/readyz\": dial tcp 10.217.0.48:8081: connect: connection refused" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.378830 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.379003 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.379039 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-ingress/router-default-5444994796-x85q2" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.379740 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"a0c95ae1cc071401df4dbae2a9b4cb0862a47a5d82d5152774e1cd848052af53"} pod="openshift-ingress/router-default-5444994796-x85q2" containerMessage="Container router failed liveness probe, will be restarted" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.379799 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" 
containerID="cri-o://a0c95ae1cc071401df4dbae2a9b4cb0862a47a5d82d5152774e1cd848052af53" gracePeriod=10 Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.380203 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="console-operator" containerStatusID={"Type":"cri-o","ID":"a7bc8e70a7c69f53a453f3a91be933b6d9c561768bed1ec433370f217d06d90f"} pod="openshift-console-operator/console-operator-58897d9998-nnrsp" containerMessage="Container console-operator failed liveness probe, will be restarted" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.380258 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" podUID="c6edf33c-728d-482f-ad5c-ceb85dae3b75" containerName="console-operator" containerID="cri-o://a7bc8e70a7c69f53a453f3a91be933b6d9c561768bed1ec433370f217d06d90f" gracePeriod=30 Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.531215 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-qhss9" podUID="480aa817-7d43-4ea8-9099-06bcb431e578" containerName="registry-server" probeResult="failure" output="command timed out" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.531769 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/redhat-operators-qhss9" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.538807 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"18ced4db46fb40fb222275c19c0f3d89fe0a63205b4c74d2c87e8c6f83308bbc"} pod="openshift-marketplace/redhat-operators-qhss9" containerMessage="Container registry-server failed liveness probe, will be restarted" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.539078 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qhss9" podUID="480aa817-7d43-4ea8-9099-06bcb431e578" containerName="registry-server" containerID="cri-o://18ced4db46fb40fb222275c19c0f3d89fe0a63205b4c74d2c87e8c6f83308bbc" gracePeriod=30 Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.540346 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="7ad9fb7a-5481-4bb8-9a9b-99fda2021704" containerName="prometheus" probeResult="failure" output="command timed out" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.539503 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="7ad9fb7a-5481-4bb8-9a9b-99fda2021704" containerName="prometheus" probeResult="failure" output="command timed out" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.542538 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.545303 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="prometheus" containerStatusID={"Type":"cri-o","ID":"57eed73f6a1602742b53d92bb463e26afa819648023f2c4b19c401aeaeb93e4d"} pod="openshift-monitoring/prometheus-k8s-0" containerMessage="Container prometheus failed liveness probe, will be restarted" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.545480 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="7ad9fb7a-5481-4bb8-9a9b-99fda2021704" 
containerName="prometheus" containerID="cri-o://57eed73f6a1602742b53d92bb463e26afa819648023f2c4b19c401aeaeb93e4d" gracePeriod=600 Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.553208 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-sr26m" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="registry-server" probeResult="failure" output="command timed out" Dec 03 12:39:36 crc kubenswrapper[4702]: E1203 12:39:36.553325 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="57eed73f6a1602742b53d92bb463e26afa819648023f2c4b19c401aeaeb93e4d" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"] Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.555200 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-qhss9" podUID="480aa817-7d43-4ea8-9099-06bcb431e578" containerName="registry-server" probeResult="failure" output="command timed out" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.555317 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qhss9" Dec 03 12:39:36 crc kubenswrapper[4702]: E1203 12:39:36.556188 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="57eed73f6a1602742b53d92bb463e26afa819648023f2c4b19c401aeaeb93e4d" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"] Dec 03 12:39:36 crc kubenswrapper[4702]: E1203 12:39:36.559349 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="57eed73f6a1602742b53d92bb463e26afa819648023f2c4b19c401aeaeb93e4d" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"] Dec 03 12:39:36 crc kubenswrapper[4702]: E1203 12:39:36.559390 4702 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="7ad9fb7a-5481-4bb8-9a9b-99fda2021704" containerName="prometheus" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.565383 4702 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6kdsp container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.565480 4702 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" podUID="3de04148-0009-427b-8055-a1c5dadb8274" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.565589 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.567455 4702 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6kdsp container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.567488 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" podUID="3de04148-0009-427b-8055-a1c5dadb8274" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.567520 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.646057 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" event={"ID":"a7faac4b-b558-4106-af27-4daf6a1db1af","Type":"ContainerStarted","Data":"5ad566832e973d311e5ad5858631affa4ee0d6d4c0a00943461699722570d1af"} Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.646280 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.658709 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-m2bfb" event={"ID":"5e7b4134-2b34-4b36-ad61-8e681df197df","Type":"ContainerStarted","Data":"d0006de3564e4ae4b856c047b1f6148176367f2090176580ec014080e1457b58"} Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.659121 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-m2bfb" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.662620 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.672160 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.672234 4702 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="340fda11da7dc96a53d9de3d8081f19331c382e2d43554b23914b4431b55cf34" exitCode=1 Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.672342 4702 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"340fda11da7dc96a53d9de3d8081f19331c382e2d43554b23914b4431b55cf34"} Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.672422 4702 scope.go:117] "RemoveContainer" containerID="012f2e5bc677c401e1f769f8fc6f51b21d02c7e5586f3ac12c074ebf4dbd1132" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.676111 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-2npsf" event={"ID":"d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c","Type":"ContainerStarted","Data":"57f84f67f3ac83aee0dddff7433b61a345a1a1ea70ec7ac92bc944b2e1d62646"} Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.676210 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-2npsf" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.678637 4702 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-vchd7 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.678680 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" podUID="a49f8d97-9fa5-44b6-bd39-e35d4d70b33c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.678724 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.678860 4702 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-vchd7 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.678921 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" podUID="a49f8d97-9fa5-44b6-bd39-e35d4d70b33c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.679013 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.680066 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="package-server-manager" containerStatusID={"Type":"cri-o","ID":"1626abe3ed15a198a4e566a115de6e6d881278876d92ba929f1591bfe5f92455"} pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" containerMessage="Container package-server-manager failed liveness probe, will be restarted" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.680107 4702 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" podUID="a49f8d97-9fa5-44b6-bd39-e35d4d70b33c" containerName="package-server-manager" containerID="cri-o://1626abe3ed15a198a4e566a115de6e6d881278876d92ba929f1591bfe5f92455" gracePeriod=30 Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.680228 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" event={"ID":"6e99cffd-b82e-46c9-8cbd-fe8c24507385","Type":"ContainerStarted","Data":"1596e4d8cefb465e0a75dd3e1e60642220efc5e5f201451c4606dd22956f86e5"} Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.682978 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="packageserver" containerStatusID={"Type":"cri-o","ID":"1fa17a686806b34438984f5b5ca80dc2a3bcdaf1f0c62c12a6ee5d914c661e83"} pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" containerMessage="Container packageserver failed liveness probe, will be restarted" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.683222 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" podUID="3de04148-0009-427b-8055-a1c5dadb8274" containerName="packageserver" containerID="cri-o://1fa17a686806b34438984f5b5ca80dc2a3bcdaf1f0c62c12a6ee5d914c661e83" gracePeriod=30 Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.684376 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="catalog-operator" containerStatusID={"Type":"cri-o","ID":"b13e93cc06e972149bc9f92319fb6d6d474c7a18829e3e1ce0afbfd02122c0c2"} pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" containerMessage="Container catalog-operator failed liveness probe, will be restarted" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.684436 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" podUID="3895be1f-db04-45b3-bd8c-cf2ab8c2aa43" containerName="catalog-operator" containerID="cri-o://b13e93cc06e972149bc9f92319fb6d6d474c7a18829e3e1ce0afbfd02122c0c2" gracePeriod=30 Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.684736 4702 status_manager.go:317] "Container readiness changed for unknown container" pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" containerID="cri-o://1a4aee4e3ee6bb158f231c543ab42ab038db6c887dc5b836c478eebbc236650b" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.684779 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.685337 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"ccb878a03690ecae62364ca11922e2d74ca8a555b1b3a95db9a68827c0d60357"} pod="openshift-console/downloads-7954f5f757-hndf6" containerMessage="Container download-server failed liveness probe, will be restarted" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.685381 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" 
containerID="cri-o://ccb878a03690ecae62364ca11922e2d74ca8a555b1b3a95db9a68827c0d60357" gracePeriod=2 Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.702229 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="90e9786f-3e0d-4a23-b624-b49a3d386784" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.170:9090/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.702358 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="90e9786f-3e0d-4a23-b624-b49a3d386784" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.170:9090/-/healthy\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.702396 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.705284 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="prometheus" containerStatusID={"Type":"cri-o","ID":"c8a5edbe076c7fae36e6d71c1ba0fa39cc004e86bc1bc5274c67c0d804da550d"} pod="openstack/prometheus-metric-storage-0" containerMessage="Container prometheus failed liveness probe, will be restarted" Dec 03 12:39:36 crc kubenswrapper[4702]: I1203 12:39:36.705493 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="90e9786f-3e0d-4a23-b624-b49a3d386784" containerName="prometheus" containerID="cri-o://c8a5edbe076c7fae36e6d71c1ba0fa39cc004e86bc1bc5274c67c0d804da550d" gracePeriod=600 Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.044956 4702 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:6443/livez?exclude=etcd\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.045422 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez?exclude=etcd\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.045865 4702 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.045931 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.045963 4702 patch_prober.go:28] interesting pod/kube-apiserver-crc 
container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.046025 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.046133 4702 patch_prober.go:28] interesting pod/image-registry-66df7c8f76-8p7q4 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="Get \"https://10.217.0.61:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.046161 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" podUID="11c38a7f-7709-4e96-b309-c987c2301610" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.61:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.045972 4702 patch_prober.go:28] interesting pod/image-registry-66df7c8f76-8p7q4 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.61:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.046550 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-66df7c8f76-8p7q4" podUID="11c38a7f-7709-4e96-b309-c987c2301610" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.61:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.088086 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="90e9786f-3e0d-4a23-b624-b49a3d386784" containerName="prometheus" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.142318 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-68ktn container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.142416 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" podUID="6cbbba51-9166-42cb-917c-7c634351e5c9" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.142459 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-68ktn container/gateway 
namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": context deadline exceeded" start-of-body= Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.142572 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-68ktn" podUID="6cbbba51-9166-42cb-917c-7c634351e5c9" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": context deadline exceeded" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.144386 4702 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-xxj8s container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.144458 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" podUID="0276c6fb-ba7a-459f-9610-34a03593669b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.43:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.144556 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.146519 4702 patch_prober.go:28] interesting pod/metrics-server-6975dd785d-5bvc2 container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.76:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.146584 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" podUID="56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.76:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.146635 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.146907 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-mdlcz container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8081/ready\": context deadline exceeded" start-of-body= Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.146955 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" podUID="72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.57:8081/ready\": context deadline exceeded" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.147030 4702 patch_prober.go:28] interesting pod/logging-loki-gateway-76dff8487c-mdlcz container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled (Client.Timeout 
exceeded while awaiting headers)" start-of-body= Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.147045 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-76dff8487c-mdlcz" podUID="72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.57:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.147994 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="metrics-server" containerStatusID={"Type":"cri-o","ID":"e36df22670a6d359cb2ddc47ec0abc22c2b0001877dbfb6610c77f76e758779e"} pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" containerMessage="Container metrics-server failed liveness probe, will be restarted" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.148045 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" podUID="56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff" containerName="metrics-server" containerID="cri-o://e36df22670a6d359cb2ddc47ec0abc22c2b0001877dbfb6610c77f76e758779e" gracePeriod=170 Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.152151 4702 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-xxj8s container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.43:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.152217 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" podUID="0276c6fb-ba7a-459f-9610-34a03593669b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.43:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.152265 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.155074 4702 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fvzsf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.155144 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" podUID="3895be1f-db04-45b3-bd8c-cf2ab8c2aa43" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.260216 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:37 crc kubenswrapper[4702]: 
I1203 12:39:37.261850 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.361531 4702 patch_prober.go:28] interesting pod/monitoring-plugin-74f4cdd6c8-9czrj container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.77:9443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.361943 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-74f4cdd6c8-9czrj" podUID="7ff9bb87-9ede-4d63-a2f5-5d1c49b30c29" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.77:9443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.362121 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-74f4cdd6c8-9czrj" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.380973 4702 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.381053 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.423041 4702 patch_prober.go:28] interesting pod/console-operator-58897d9998-nnrsp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.423138 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" podUID="c6edf33c-728d-482f-ad5c-ceb85dae3b75" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.423221 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.423242 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-x85q2" 
podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.423742 4702 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6kdsp container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body= Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.423827 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" podUID="3de04148-0009-427b-8055-a1c5dadb8274" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.424365 4702 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6kdsp container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body= Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.424400 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" podUID="3de04148-0009-427b-8055-a1c5dadb8274" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.546463 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-74f4cdd6c8-9czrj" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.688283 4702 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fvzsf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": EOF" start-of-body= Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.688732 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" podUID="3895be1f-db04-45b3-bd8c-cf2ab8c2aa43" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": EOF" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.704393 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7" event={"ID":"4b90477f-d1b5-4f03-ab08-2476d44a9cff","Type":"ContainerStarted","Data":"4b3a6ab60007f004cb208f10fbc444b49bc018e5791d2af26faf3da152c5f5f7"} Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.713981 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn" event={"ID":"afc37ae6-c944-4cb1-81b6-c810ea1c3b31","Type":"ContainerStarted","Data":"d6af96b1489b3fc7678a70e2a5cdb8588fcdb6d6e7ba073a5efaca6214e1956c"} Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.718167 4702 generic.go:334] "Generic (PLEG): container finished" podID="b877c7a7-0b88-4238-8a21-314ef1525996" containerID="0e840e9da35d051022fe6af1a772dec9473e404d679a1bd2ed6b9aad3fa41562" exitCode=1 Dec 03 
12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.718236 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" event={"ID":"b877c7a7-0b88-4238-8a21-314ef1525996","Type":"ContainerDied","Data":"0e840e9da35d051022fe6af1a772dec9473e404d679a1bd2ed6b9aad3fa41562"} Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.719438 4702 scope.go:117] "RemoveContainer" containerID="0e840e9da35d051022fe6af1a772dec9473e404d679a1bd2ed6b9aad3fa41562" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.742033 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n" event={"ID":"5edf270b-74cb-42d2-82dc-7953f243c6dc","Type":"ContainerStarted","Data":"a2d62d78caf27f9a7d41aebfcce9c3896f7855049817100db353a0f36d7c8545"} Dec 03 12:39:37 crc kubenswrapper[4702]: E1203 12:39:37.744241 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="57eed73f6a1602742b53d92bb463e26afa819648023f2c4b19c401aeaeb93e4d" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"] Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.747739 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr" event={"ID":"224e5de0-3f58-4243-80e5-212cf016ea46","Type":"ContainerStarted","Data":"85536994e2324c9039a03959eaa46b0c425c772eba1d13fb1dba2231eda574f7"} Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.752212 4702 generic.go:334] "Generic (PLEG): container finished" podID="7d79c3e6-cab8-4e77-84da-85ae2a1eb3b3" containerID="36284dcfeefe99a9bfddd6167ac936637f43293c085fecce5a3fb65e4f6c9a6d" exitCode=1 Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.752349 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92" event={"ID":"7d79c3e6-cab8-4e77-84da-85ae2a1eb3b3","Type":"ContainerDied","Data":"36284dcfeefe99a9bfddd6167ac936637f43293c085fecce5a3fb65e4f6c9a6d"} Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.753393 4702 scope.go:117] "RemoveContainer" containerID="36284dcfeefe99a9bfddd6167ac936637f43293c085fecce5a3fb65e4f6c9a6d" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.761650 4702 generic.go:334] "Generic (PLEG): container finished" podID="ec0726c3-58ef-4a22-8e00-bae32d7d66ca" containerID="57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca" exitCode=0 Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.761712 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" event={"ID":"ec0726c3-58ef-4a22-8e00-bae32d7d66ca","Type":"ContainerDied","Data":"57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca"} Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.765698 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.772068 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.772068 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t" event={"ID":"9b295e92-630f-4544-b741-50ece5e79f4c","Type":"ContainerStarted","Data":"b205d0c58caf689eb6393678f9624fae9ca23c93597e7b8c9212f03abf298931"}
Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.773085 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t"
Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.773681 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="olm-operator" containerStatusID={"Type":"cri-o","ID":"9a3f99c658871b2d5b7276c4079ae2bd142c2137caf9aba99bf227ed580bf8ea"} pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" containerMessage="Container olm-operator failed liveness probe, will be restarted"
Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.773735 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" podUID="0276c6fb-ba7a-459f-9610-34a03593669b" containerName="olm-operator" containerID="cri-o://9a3f99c658871b2d5b7276c4079ae2bd142c2137caf9aba99bf227ed580bf8ea" gracePeriod=30
Dec 03 12:39:37 crc kubenswrapper[4702]: E1203 12:39:37.794716 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="57eed73f6a1602742b53d92bb463e26afa819648023f2c4b19c401aeaeb93e4d" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"]
Dec 03 12:39:37 crc kubenswrapper[4702]: E1203 12:39:37.797230 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="57eed73f6a1602742b53d92bb463e26afa819648023f2c4b19c401aeaeb93e4d" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"]
Dec 03 12:39:37 crc kubenswrapper[4702]: E1203 12:39:37.797317 4702 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="7ad9fb7a-5481-4bb8-9a9b-99fda2021704" containerName="prometheus"
Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.942996 4702 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fv5c5 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body=
Dec 03 12:39:37 crc kubenswrapper[4702]: I1203 12:39:37.943496 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" podUID="a456460b-47a3-48ef-a98b-4f67709d5939" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.144152 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.276733 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92"
Dec 03 12:39:38 crc kubenswrapper[4702]: E1203 12:39:38.290640 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84fc908a_9418_4e6e_ac17_9e725524f9ce.slice/crio-d8c4e0cc738e522968136e6f8f489ad82fb59f039b20f2c64f7e9b0f049cd07c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6faaca6_f017_42ac_95e4_d73ae3e8e519.slice/crio-conmon-9fc58cd58b7eec4e8d1c0d978efb544179b4db4a16029cba91aa2efa1b5d14dc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6edf33c_728d_482f_ad5c_ceb85dae3b75.slice/crio-conmon-a7bc8e70a7c69f53a453f3a91be933b6d9c561768bed1ec433370f217d06d90f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-conmon-57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca.scope\": RecentStats: unable to find data in memory cache]"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.303172 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.303235 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.303554 4702 patch_prober.go:28] interesting pod/route-controller-manager-84cf75c7c5-96cmd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body=
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.303690 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" podUID="e3e08d4a-20c7-430a-a3d9-988d64e6a6b4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.424784 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg"
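
The pair of entries at 12:39:37.773681/.773735 show the liveness-failure path end to end: the kubelet records that olm-operator failed its liveness probe and then kills the container with gracePeriod=30. At the runtime level that kill is a CRI StopContainer call whose Timeout field carries the grace period. A rough sketch of that call against CRI-O, for orientation only; the socket path and timeouts are assumptions, and this is not kubelet source.

```go
// Sketch: what "Killing container with a grace period ... gracePeriod=30"
// amounts to at the CRI level -- a StopContainer RPC whose Timeout carries
// the grace period. Assumes a CRI-O socket at the usual path.
package main

import (
	"context"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	conn, err := grpc.Dial("unix:///var/run/crio/crio.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	rt := runtimeapi.NewRuntimeServiceClient(conn)
	ctx, cancel := context.WithTimeout(context.Background(), 35*time.Second)
	defer cancel()

	// Container ID as logged (cri-o:// prefix stripped); Timeout is the
	// gracePeriod=30 from the log line.
	_, err = rt.StopContainer(ctx, &runtimeapi.StopContainerRequest{
		ContainerId: "9a3f99c658871b2d5b7276c4079ae2bd142c2137caf9aba99bf227ed580bf8ea",
		Timeout:     30,
	})
	if err != nil {
		log.Fatal(err)
	}
}
```
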
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.462162 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c" podUID="62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.103:8081/healthz\": dial tcp 10.217.0.103:8081: connect: connection refused"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.462341 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c" podUID="62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.103:8081/readyz\": dial tcp 10.217.0.103:8081: connect: connection refused"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.462709 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.463438 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c" podUID="62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.103:8081/readyz\": dial tcp 10.217.0.103:8081: connect: connection refused"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.530493 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-m6cqt" podUID="0f2dd872-6ac4-4527-9a91-218b1de5ed5e" containerName="registry-server" probeResult="failure" output="command timed out"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.530516 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-7ntqc" podUID="38c7c63b-db59-4055-aee0-99ea082bd8f7" containerName="registry-server" probeResult="failure" output="command timed out"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.530896 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m6cqt"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.531036 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/certified-operators-7ntqc"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.532027 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-m6cqt" podUID="0f2dd872-6ac4-4527-9a91-218b1de5ed5e" containerName="registry-server" probeResult="failure" output="command timed out"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.532095 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-7ntqc" podUID="38c7c63b-db59-4055-aee0-99ea082bd8f7" containerName="registry-server" probeResult="failure" output="command timed out"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.532290 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/community-operators-m6cqt"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.532443 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7ntqc"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.532746 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"a00791228ce4ce424da843a1d85119c9bddd3c05a4f70b9dc4fd37074d28ff10"} pod="openshift-marketplace/certified-operators-7ntqc" containerMessage="Container registry-server failed liveness probe, will be restarted"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.532854 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7ntqc" podUID="38c7c63b-db59-4055-aee0-99ea082bd8f7" containerName="registry-server" containerID="cri-o://a00791228ce4ce424da843a1d85119c9bddd3c05a4f70b9dc4fd37074d28ff10" gracePeriod=30
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.534115 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"fb463bed56004e399bdf91e3fb92de23dca363e6d64d930042f8f75beb45ec4d"} pod="openshift-marketplace/community-operators-m6cqt" containerMessage="Container registry-server failed liveness probe, will be restarted"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.534263 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m6cqt" podUID="0f2dd872-6ac4-4527-9a91-218b1de5ed5e" containerName="registry-server" containerID="cri-o://fb463bed56004e399bdf91e3fb92de23dca363e6d64d930042f8f75beb45ec4d" gracePeriod=30
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.576783 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw" podUID="1a7e4f08-8a48-44d5-944b-4eaf9d9518b5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/healthz\": dial tcp 10.217.0.105:8081: connect: connection refused"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.576969 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw" podUID="1a7e4f08-8a48-44d5-944b-4eaf9d9518b5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": dial tcp 10.217.0.105:8081: connect: connection refused"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.577039 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.578458 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.578510 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.580668 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw" podUID="1a7e4f08-8a48-44d5-944b-4eaf9d9518b5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": dial tcp 10.217.0.105:8081: connect: connection refused"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.636478 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7"
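
The manager containers failing probes above (glance, ironic, and the rest of the openstack-operators fleet) all expose /healthz and /readyz on :8081, which is the stock controller-runtime health-probe endpoint; "connection refused" simply means that listener was not up at probe time. A minimal sketch of that scaffolding, assuming these operators follow the usual operator-sdk layout:

```go
// Minimal controller-runtime manager exposing /healthz and /readyz on :8081,
// the port probed in the log. A sketch of the standard scaffolding, not the
// source of any particular operator above.
package main

import (
	ctrl "sigs.k8s.io/controller-runtime"
	"sigs.k8s.io/controller-runtime/pkg/healthz"
)

func main() {
	mgr, err := ctrl.NewManager(ctrl.GetConfigOrDie(), ctrl.Options{
		HealthProbeBindAddress: ":8081",
	})
	if err != nil {
		panic(err)
	}
	// Register the trivial Ping checker behind /healthz and /readyz.
	if err := mgr.AddHealthzCheck("healthz", healthz.Ping); err != nil {
		panic(err)
	}
	if err := mgr.AddReadyzCheck("readyz", healthz.Ping); err != nil {
		panic(err)
	}
	if err := mgr.Start(ctrl.SetupSignalHandler()); err != nil {
		panic(err)
	}
}
```
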
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.636560 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.644915 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.644980 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.682614 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-hxvr6"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.731633 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="90e9786f-3e0d-4a23-b624-b49a3d386784" containerName="prometheus" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.735133 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.790975 4702 generic.go:334] "Generic (PLEG): container finished" podID="3895be1f-db04-45b3-bd8c-cf2ab8c2aa43" containerID="b13e93cc06e972149bc9f92319fb6d6d474c7a18829e3e1ce0afbfd02122c0c2" exitCode=0
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.791090 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" event={"ID":"3895be1f-db04-45b3-bd8c-cf2ab8c2aa43","Type":"ContainerDied","Data":"b13e93cc06e972149bc9f92319fb6d6d474c7a18829e3e1ce0afbfd02122c0c2"}
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.851136 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-nnrsp_c6edf33c-728d-482f-ad5c-ceb85dae3b75/console-operator/0.log"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.851810 4702 generic.go:334] "Generic (PLEG): container finished" podID="c6edf33c-728d-482f-ad5c-ceb85dae3b75" containerID="a7bc8e70a7c69f53a453f3a91be933b6d9c561768bed1ec433370f217d06d90f" exitCode=1
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.852079 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" event={"ID":"c6edf33c-728d-482f-ad5c-ceb85dae3b75","Type":"ContainerDied","Data":"a7bc8e70a7c69f53a453f3a91be933b6d9c561768bed1ec433370f217d06d90f"}
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.866316 4702 generic.go:334] "Generic (PLEG): container finished" podID="84fc908a-9418-4e6e-ac17-9e725524f9ce" containerID="d8c4e0cc738e522968136e6f8f489ad82fb59f039b20f2c64f7e9b0f049cd07c" exitCode=1
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.866669 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf" event={"ID":"84fc908a-9418-4e6e-ac17-9e725524f9ce","Type":"ContainerDied","Data":"d8c4e0cc738e522968136e6f8f489ad82fb59f039b20f2c64f7e9b0f049cd07c"}
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.867938 4702 scope.go:117] "RemoveContainer" containerID="d8c4e0cc738e522968136e6f8f489ad82fb59f039b20f2c64f7e9b0f049cd07c"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.878327 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-vpr8z" event={"ID":"ec0726c3-58ef-4a22-8e00-bae32d7d66ca","Type":"ContainerStarted","Data":"4ba3074af13a68d705ed4738cca52423521406e398260644f109e7bcc711b70a"}
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.878469 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-vpr8z"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.886088 4702 generic.go:334] "Generic (PLEG): container finished" podID="b6faaca6-f017-42ac-95e4-d73ae3e8e519" containerID="9fc58cd58b7eec4e8d1c0d978efb544179b4db4a16029cba91aa2efa1b5d14dc" exitCode=1
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.886152 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp" event={"ID":"b6faaca6-f017-42ac-95e4-d73ae3e8e519","Type":"ContainerDied","Data":"9fc58cd58b7eec4e8d1c0d978efb544179b4db4a16029cba91aa2efa1b5d14dc"}
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.886639 4702 scope.go:117] "RemoveContainer" containerID="9fc58cd58b7eec4e8d1c0d978efb544179b4db4a16029cba91aa2efa1b5d14dc"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.897007 4702 generic.go:334] "Generic (PLEG): container finished" podID="1a7e4f08-8a48-44d5-944b-4eaf9d9518b5" containerID="9b12470bed66eccb99c82655a1ddea59de204202edd291476d12f76d6b0aec2d" exitCode=1
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.897122 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw" event={"ID":"1a7e4f08-8a48-44d5-944b-4eaf9d9518b5","Type":"ContainerDied","Data":"9b12470bed66eccb99c82655a1ddea59de204202edd291476d12f76d6b0aec2d"}
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.898225 4702 scope.go:117] "RemoveContainer" containerID="9b12470bed66eccb99c82655a1ddea59de204202edd291476d12f76d6b0aec2d"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.901878 4702 generic.go:334] "Generic (PLEG): container finished" podID="44af00fd-b9f6-4e74-ad67-581e4ca7527c" containerID="8c83b33dbc7dd119d0b2ca7b2933d6a165caba4682749655a9e5a2534e77d2e0" exitCode=0
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.903063 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" event={"ID":"44af00fd-b9f6-4e74-ad67-581e4ca7527c","Type":"ContainerDied","Data":"8c83b33dbc7dd119d0b2ca7b2933d6a165caba4682749655a9e5a2534e77d2e0"}
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.903099 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.903779 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.904853 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.905299 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn"
Dec 03 12:39:38 crc kubenswrapper[4702]: I1203 12:39:38.906063 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr"
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.054203 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-m6cqt" podUID="0f2dd872-6ac4-4527-9a91-218b1de5ed5e" containerName="registry-server" probeResult="failure" output=""
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.085281 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gjmph"]
Dec 03 12:39:39 crc kubenswrapper[4702]: E1203 12:39:39.086122 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c61463db247d9a3ed644ddbb258dd0073cef8188651aafd572c73a0ead900b89" cmd=["grpc_health_probe","-addr=:50051"]
Dec 03 12:39:39 crc kubenswrapper[4702]: E1203 12:39:39.120104 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c61463db247d9a3ed644ddbb258dd0073cef8188651aafd572c73a0ead900b89" cmd=["grpc_health_probe","-addr=:50051"]
Dec 03 12:39:39 crc kubenswrapper[4702]: E1203 12:39:39.129228 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c61463db247d9a3ed644ddbb258dd0073cef8188651aafd572c73a0ead900b89" cmd=["grpc_health_probe","-addr=:50051"]
Dec 03 12:39:39 crc kubenswrapper[4702]: E1203 12:39:39.129297 4702 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack-operators/openstack-operator-index-ds4ss" podUID="9432a2a8-8932-4734-a69d-8976764f1dab" containerName="registry-server"
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.508550 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t" podUID="523c06cc-9816-4252-ac00-dc7928dae009" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": dial tcp 10.217.0.113:8081: connect: connection refused"
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.508938 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t"
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.508974 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" podUID="1d60d4ab-7bac-4fd1-9aad-c07ba1513d41" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": dial tcp 10.217.0.115:8081: connect: connection refused"
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.508567 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t" podUID="523c06cc-9816-4252-ac00-dc7928dae009" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/healthz\": dial tcp 10.217.0.113:8081: connect: connection refused"
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.509125 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" podUID="1d60d4ab-7bac-4fd1-9aad-c07ba1513d41" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/healthz\": dial tcp 10.217.0.115:8081: connect: connection refused"
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.509451 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t" podUID="523c06cc-9816-4252-ac00-dc7928dae009" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": dial tcp 10.217.0.113:8081: connect: connection refused"
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.558279 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds" podUID="8de75640-5551-4d04-830d-64f0fbb7847a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": dial tcp 10.217.0.116:8081: connect: connection refused"
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.558335 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds" podUID="8de75640-5551-4d04-830d-64f0fbb7847a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/healthz\": dial tcp 10.217.0.116:8081: connect: connection refused"
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.558418 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds"
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.560042 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds" podUID="8de75640-5551-4d04-830d-64f0fbb7847a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": dial tcp 10.217.0.116:8081: connect: connection refused"
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.573518 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp"
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.573571 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp"
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.600276 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd"
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.701135 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv" podUID="5cecb29f-7ef9-4177-8e01-a776b70bbb03" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": dial tcp 10.217.0.110:8081: connect: connection refused"
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.727650 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf"
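
The cmd=["grpc_health_probe","-addr=:50051"] probes that keep failing above are the standard gRPC health check used by the catalog/registry pods. What the probe binary does can be re-expressed as a short client of the grpc.health.v1 service; a sketch under that assumption, not the probe's actual source:

```go
// Equivalent of `grpc_health_probe -addr=:50051`: call Check on the
// standard gRPC health service and require SERVING.
package main

import (
	"context"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	healthpb "google.golang.org/grpc/health/grpc_health_v1"
)

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 2*time.Second)
	defer cancel()

	conn, err := grpc.DialContext(ctx, "localhost:50051",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	// Empty Service means "overall server health", the probe's default.
	resp, err := healthpb.NewHealthClient(conn).Check(ctx,
		&healthpb.HealthCheckRequest{Service: ""})
	if err != nil || resp.GetStatus() != healthpb.HealthCheckResponse_SERVING {
		log.Fatalf("not serving: status=%v err=%v", resp.GetStatus(), err)
	}
}
```
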
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.924523 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bv8pf" event={"ID":"4be204bf-b480-4d77-9ced-34c6668afa14","Type":"ContainerStarted","Data":"eec789a45cf63bc55c57a73a838951dd8d979839098dac5e3a2b8897531a3247"}
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.927879 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.929231 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6fdcf94290d3865dc746e2dbc4431812263eaef09704c7424601bf38e7de9684"}
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.932293 4702 generic.go:334] "Generic (PLEG): container finished" podID="ae6dac10-29ba-4bb8-8a0c-68a2bad519af" containerID="17945cb0a41c87bbba4777c9e2febdbd11b845c886b7607fc60a0bbf1d204637" exitCode=1
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.932408 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq" event={"ID":"ae6dac10-29ba-4bb8-8a0c-68a2bad519af","Type":"ContainerDied","Data":"17945cb0a41c87bbba4777c9e2febdbd11b845c886b7607fc60a0bbf1d204637"}
Dec 03 12:39:39 crc kubenswrapper[4702]: I1203 12:39:39.933312 4702 scope.go:117] "RemoveContainer" containerID="17945cb0a41c87bbba4777c9e2febdbd11b845c886b7607fc60a0bbf1d204637"
Dec 03 12:39:40 crc kubenswrapper[4702]: I1203 12:39:40.128340 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq"
Dec 03 12:39:40 crc kubenswrapper[4702]: I1203 12:39:40.143153 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="c91e1dc8-ef80-407f-ac34-4c9ab29026f7" containerName="galera" containerID="cri-o://9d8dd452e7ea9b777cedcc2818d1bb8e2ae923333f5e8d79d15da1edf56fb290" gracePeriod=15
Dec 03 12:39:40 crc kubenswrapper[4702]: I1203 12:39:40.375091 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="92266ac3-f0a6-4e68-9e88-9aa2900e1fe3" containerName="galera" containerID="cri-o://c7ed9f1ef289cc5aacad80a6cda5674b5b2d84db5935b0d3f1326bc5cd93e425" gracePeriod=3
Dec 03 12:39:40 crc kubenswrapper[4702]: I1203 12:39:40.415008 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-wh75l"
Dec 03 12:39:40 crc kubenswrapper[4702]: I1203 12:39:40.417674 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-76cc67bf56-xrnp2"
Dec 03 12:39:40 crc kubenswrapper[4702]: E1203 12:39:40.569164 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5520925fdcb74e51555de4760801d54227957a1ec428e730058e63e89436c28 is running failed: container process not found" containerID="c5520925fdcb74e51555de4760801d54227957a1ec428e730058e63e89436c28" cmd=["grpc_health_probe","-addr=:50051"]
Dec 03 12:39:40 crc kubenswrapper[4702]: E1203 12:39:40.570213 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5520925fdcb74e51555de4760801d54227957a1ec428e730058e63e89436c28 is running failed: container process not found" containerID="c5520925fdcb74e51555de4760801d54227957a1ec428e730058e63e89436c28" cmd=["grpc_health_probe","-addr=:50051"]
Dec 03 12:39:40 crc kubenswrapper[4702]: E1203 12:39:40.571087 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5520925fdcb74e51555de4760801d54227957a1ec428e730058e63e89436c28 is running failed: container process not found" containerID="c5520925fdcb74e51555de4760801d54227957a1ec428e730058e63e89436c28" cmd=["grpc_health_probe","-addr=:50051"]
Dec 03 12:39:40 crc kubenswrapper[4702]: E1203 12:39:40.571180 4702 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of c5520925fdcb74e51555de4760801d54227957a1ec428e730058e63e89436c28 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-6sd9l" podUID="ea8c3262-d494-4427-8228-df9584c00ca1" containerName="registry-server"
Dec 03 12:39:40 crc kubenswrapper[4702]: I1203 12:39:40.583311 4702 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 03 12:39:40 crc kubenswrapper[4702]: I1203 12:39:40.648273 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-5895d59bb8-bhqrp"
Dec 03 12:39:40 crc kubenswrapper[4702]: E1203 12:39:40.659126 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f6320ff_4661_46be_80e1_8d97f09fe789.slice/crio-conmon-04dfa9ccbc805c8c141db2c3940506e37b3216be0ec5cac7fc187c84a3ff08a6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84fc908a_9418_4e6e_ac17_9e725524f9ce.slice/crio-d8c4e0cc738e522968136e6f8f489ad82fb59f039b20f2c64f7e9b0f049cd07c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-conmon-57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca.scope\": RecentStats: unable to find data in memory cache]"
Dec 03 12:39:40 crc kubenswrapper[4702]: E1203 12:39:40.758316 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 18ced4db46fb40fb222275c19c0f3d89fe0a63205b4c74d2c87e8c6f83308bbc is running failed: container process not found" containerID="18ced4db46fb40fb222275c19c0f3d89fe0a63205b4c74d2c87e8c6f83308bbc" cmd=["grpc_health_probe","-addr=:50051"]
Dec 03 12:39:40 crc kubenswrapper[4702]: E1203 12:39:40.758655 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 18ced4db46fb40fb222275c19c0f3d89fe0a63205b4c74d2c87e8c6f83308bbc is running failed: container process not found" containerID="18ced4db46fb40fb222275c19c0f3d89fe0a63205b4c74d2c87e8c6f83308bbc" cmd=["grpc_health_probe","-addr=:50051"]
Dec 03 12:39:40 crc kubenswrapper[4702]: E1203 12:39:40.764502 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 18ced4db46fb40fb222275c19c0f3d89fe0a63205b4c74d2c87e8c6f83308bbc is running failed: container process not found" containerID="18ced4db46fb40fb222275c19c0f3d89fe0a63205b4c74d2c87e8c6f83308bbc" cmd=["grpc_health_probe","-addr=:50051"]
Dec 03 12:39:40 crc kubenswrapper[4702]: E1203 12:39:40.764595 4702 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 18ced4db46fb40fb222275c19c0f3d89fe0a63205b4c74d2c87e8c6f83308bbc is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-qhss9" podUID="480aa817-7d43-4ea8-9099-06bcb431e578" containerName="registry-server"
Dec 03 12:39:40 crc kubenswrapper[4702]: I1203 12:39:40.822003 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-84558f7c9f-dflgw"
Dec 03 12:39:40 crc kubenswrapper[4702]: I1203 12:39:40.943202 4702 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fv5c5 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body=
Dec 03 12:39:40 crc kubenswrapper[4702]: I1203 12:39:40.943333 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" podUID="a456460b-47a3-48ef-a98b-4f67709d5939" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused"
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.015627 4702 generic.go:334] "Generic (PLEG): container finished" podID="530ef793-9485-4c45-86ba-531906f2085a" containerID="473957d51e70605a3e125b6884cd25b6b52ad82ad8d2552f0a8232dd8adc2563" exitCode=1
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.015679 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-htxmz" event={"ID":"530ef793-9485-4c45-86ba-531906f2085a","Type":"ContainerDied","Data":"473957d51e70605a3e125b6884cd25b6b52ad82ad8d2552f0a8232dd8adc2563"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.016967 4702 scope.go:117] "RemoveContainer" containerID="473957d51e70605a3e125b6884cd25b6b52ad82ad8d2552f0a8232dd8adc2563"
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.038118 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf"
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.038150 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf" event={"ID":"84fc908a-9418-4e6e-ac17-9e725524f9ce","Type":"ContainerStarted","Data":"9806996d50b3b867fdae9b76dcf2eb0578c2b1d00a70a2a5ce5d6d981a29d936"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.044514 4702 generic.go:334] "Generic (PLEG): container finished" podID="a456460b-47a3-48ef-a98b-4f67709d5939" containerID="56666e8a78a82b8a80949b132ba3edf357fbbc19b68d6b6c216a6290e3608589" exitCode=0
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.044581 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" event={"ID":"a456460b-47a3-48ef-a98b-4f67709d5939","Type":"ContainerDied","Data":"56666e8a78a82b8a80949b132ba3edf357fbbc19b68d6b6c216a6290e3608589"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.056710 4702 generic.go:334] "Generic (PLEG): container finished" podID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerID="ccb878a03690ecae62364ca11922e2d74ca8a555b1b3a95db9a68827c0d60357" exitCode=0
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.056898 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hndf6" event={"ID":"05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8","Type":"ContainerDied","Data":"ccb878a03690ecae62364ca11922e2d74ca8a555b1b3a95db9a68827c0d60357"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.056944 4702 scope.go:117] "RemoveContainer" containerID="b19709c46ee9bdb2df0ae0064f1db24583ec1036f7312805bc2da46e7287368d"
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.064652 4702 generic.go:334] "Generic (PLEG): container finished" podID="0276c6fb-ba7a-459f-9610-34a03593669b" containerID="9a3f99c658871b2d5b7276c4079ae2bd142c2137caf9aba99bf227ed580bf8ea" exitCode=0
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.064743 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" event={"ID":"0276c6fb-ba7a-459f-9610-34a03593669b","Type":"ContainerDied","Data":"9a3f99c658871b2d5b7276c4079ae2bd142c2137caf9aba99bf227ed580bf8ea"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.075785 4702 generic.go:334] "Generic (PLEG): container finished" podID="9432a2a8-8932-4734-a69d-8976764f1dab" containerID="c61463db247d9a3ed644ddbb258dd0073cef8188651aafd572c73a0ead900b89" exitCode=0
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.075946 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ds4ss" event={"ID":"9432a2a8-8932-4734-a69d-8976764f1dab","Type":"ContainerDied","Data":"c61463db247d9a3ed644ddbb258dd0073cef8188651aafd572c73a0ead900b89"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.084423 4702 generic.go:334] "Generic (PLEG): container finished" podID="8f6320ff-4661-46be-80e1-8d97f09fe789" containerID="04dfa9ccbc805c8c141db2c3940506e37b3216be0ec5cac7fc187c84a3ff08a6" exitCode=1
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.084935 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg" event={"ID":"8f6320ff-4661-46be-80e1-8d97f09fe789","Type":"ContainerDied","Data":"04dfa9ccbc805c8c141db2c3940506e37b3216be0ec5cac7fc187c84a3ff08a6"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.087343 4702 generic.go:334] "Generic (PLEG): container finished" podID="1d60d4ab-7bac-4fd1-9aad-c07ba1513d41" containerID="570ee7d47eaeeec5374d71e46a9426a01ca42ce6c943930d213b357636ee1310" exitCode=1
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.087413 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" event={"ID":"1d60d4ab-7bac-4fd1-9aad-c07ba1513d41","Type":"ContainerDied","Data":"570ee7d47eaeeec5374d71e46a9426a01ca42ce6c943930d213b357636ee1310"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.090980 4702 generic.go:334] "Generic (PLEG): container finished" podID="c43c86a0-692f-406f-871a-24a14f24ed77" containerID="3de49cac20fba2b5d89e36e8a1febef0771c4a2f6530b0b8118fd1204b4b3467" exitCode=1
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.091064 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ckjgv" event={"ID":"c43c86a0-692f-406f-871a-24a14f24ed77","Type":"ContainerDied","Data":"3de49cac20fba2b5d89e36e8a1febef0771c4a2f6530b0b8118fd1204b4b3467"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.093868 4702 scope.go:117] "RemoveContainer" containerID="570ee7d47eaeeec5374d71e46a9426a01ca42ce6c943930d213b357636ee1310"
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.094000 4702 scope.go:117] "RemoveContainer" containerID="04dfa9ccbc805c8c141db2c3940506e37b3216be0ec5cac7fc187c84a3ff08a6"
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.094070 4702 scope.go:117] "RemoveContainer" containerID="3de49cac20fba2b5d89e36e8a1febef0771c4a2f6530b0b8118fd1204b4b3467"
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.094142 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw" event={"ID":"1a7e4f08-8a48-44d5-944b-4eaf9d9518b5","Type":"ContainerStarted","Data":"d8c60948246b842dbe4b5497de2c07dc83181edb004226dcefb6894a1b50416c"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.094449 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw"
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.098963 4702 generic.go:334] "Generic (PLEG): container finished" podID="5bad766d-e524-4670-b353-56e92df2f744" containerID="d3538527ec1c0eff1cf367faea77ee3efdefc14633dab42c7a824a6b5e81c792" exitCode=0
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.099132 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" event={"ID":"5bad766d-e524-4670-b353-56e92df2f744","Type":"ContainerDied","Data":"d3538527ec1c0eff1cf367faea77ee3efdefc14633dab42c7a824a6b5e81c792"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.131054 4702 generic.go:334] "Generic (PLEG): container finished" podID="90e9786f-3e0d-4a23-b624-b49a3d386784" containerID="c8a5edbe076c7fae36e6d71c1ba0fa39cc004e86bc1bc5274c67c0d804da550d" exitCode=0
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.131355 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"90e9786f-3e0d-4a23-b624-b49a3d386784","Type":"ContainerDied","Data":"c8a5edbe076c7fae36e6d71c1ba0fa39cc004e86bc1bc5274c67c0d804da550d"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.139371 4702 generic.go:334] "Generic (PLEG): container finished" podID="e3e08d4a-20c7-430a-a3d9-988d64e6a6b4" containerID="4263520eca4d7e37a135b540dd1b3cfa0adf6e70430cc456be62501d5508459b" exitCode=0
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.139457 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" event={"ID":"e3e08d4a-20c7-430a-a3d9-988d64e6a6b4","Type":"ContainerDied","Data":"4263520eca4d7e37a135b540dd1b3cfa0adf6e70430cc456be62501d5508459b"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.153870 4702 generic.go:334] "Generic (PLEG): container finished" podID="fecafd1a-bd80-46ea-8839-dbfc2d364a96" containerID="97a0cf5f74bb303168d42c86dbc2d9abc4f6775d6c7ffd7df4510df0fb0a6fc2" exitCode=0
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.153972 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" event={"ID":"fecafd1a-bd80-46ea-8839-dbfc2d364a96","Type":"ContainerDied","Data":"97a0cf5f74bb303168d42c86dbc2d9abc4f6775d6c7ffd7df4510df0fb0a6fc2"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.162085 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp" event={"ID":"b6faaca6-f017-42ac-95e4-d73ae3e8e519","Type":"ContainerStarted","Data":"b7fbfe7c0bb6ea9caa27c45bdd4d27d52db9ecf2552c33ee3816f9e591537aa5"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.162224 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp"
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.174718 4702 generic.go:334] "Generic (PLEG): container finished" podID="523c06cc-9816-4252-ac00-dc7928dae009" containerID="55c2f2583dccdbef6194c35e1440f482b93fc94f3162da2f718bb964d13463b9" exitCode=1
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.174873 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t" event={"ID":"523c06cc-9816-4252-ac00-dc7928dae009","Type":"ContainerDied","Data":"55c2f2583dccdbef6194c35e1440f482b93fc94f3162da2f718bb964d13463b9"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.176615 4702 scope.go:117] "RemoveContainer" containerID="55c2f2583dccdbef6194c35e1440f482b93fc94f3162da2f718bb964d13463b9"
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.181119 4702 generic.go:334] "Generic (PLEG): container finished" podID="ea8c3262-d494-4427-8228-df9584c00ca1" containerID="c5520925fdcb74e51555de4760801d54227957a1ec428e730058e63e89436c28" exitCode=0
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.181210 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sd9l" event={"ID":"ea8c3262-d494-4427-8228-df9584c00ca1","Type":"ContainerDied","Data":"c5520925fdcb74e51555de4760801d54227957a1ec428e730058e63e89436c28"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.185036 4702 generic.go:334] "Generic (PLEG): container finished" podID="62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d" containerID="1e38392069abb63a138f8151a298d7c9f511dc25106660eba9fd42ce806c269b" exitCode=1
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.185070 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c" event={"ID":"62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d","Type":"ContainerDied","Data":"1e38392069abb63a138f8151a298d7c9f511dc25106660eba9fd42ce806c269b"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.186057 4702 scope.go:117] "RemoveContainer" containerID="1e38392069abb63a138f8151a298d7c9f511dc25106660eba9fd42ce806c269b"
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.187720 4702 generic.go:334] "Generic (PLEG): container finished" podID="38c7c63b-db59-4055-aee0-99ea082bd8f7" containerID="a00791228ce4ce424da843a1d85119c9bddd3c05a4f70b9dc4fd37074d28ff10" exitCode=0
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.187810 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ntqc" event={"ID":"38c7c63b-db59-4055-aee0-99ea082bd8f7","Type":"ContainerDied","Data":"a00791228ce4ce424da843a1d85119c9bddd3c05a4f70b9dc4fd37074d28ff10"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.193508 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c" event={"ID":"b877c7a7-0b88-4238-8a21-314ef1525996","Type":"ContainerStarted","Data":"2c49eb71351ea9f233ca5adfa84631a42299cb630c2447917240886e172d27b7"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.194074 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c"
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.201143 4702 generic.go:334] "Generic (PLEG): container finished" podID="480aa817-7d43-4ea8-9099-06bcb431e578" containerID="18ced4db46fb40fb222275c19c0f3d89fe0a63205b4c74d2c87e8c6f83308bbc" exitCode=0
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.201244 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhss9" event={"ID":"480aa817-7d43-4ea8-9099-06bcb431e578","Type":"ContainerDied","Data":"18ced4db46fb40fb222275c19c0f3d89fe0a63205b4c74d2c87e8c6f83308bbc"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.204590 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92" event={"ID":"7d79c3e6-cab8-4e77-84da-85ae2a1eb3b3","Type":"ContainerStarted","Data":"00c914215fbe87e91ff5fe953cf82941e3202e9d7644e85b744407b020a35e9e"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.204781 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92"
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.221974 4702 generic.go:334] "Generic (PLEG): container finished" podID="0f2dd872-6ac4-4527-9a91-218b1de5ed5e" containerID="fb463bed56004e399bdf91e3fb92de23dca363e6d64d930042f8f75beb45ec4d" exitCode=0
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.222047 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6cqt" event={"ID":"0f2dd872-6ac4-4527-9a91-218b1de5ed5e","Type":"ContainerDied","Data":"fb463bed56004e399bdf91e3fb92de23dca363e6d64d930042f8f75beb45ec4d"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.225465 4702 generic.go:334] "Generic (PLEG): container finished" podID="3de04148-0009-427b-8055-a1c5dadb8274" containerID="1fa17a686806b34438984f5b5ca80dc2a3bcdaf1f0c62c12a6ee5d914c661e83" exitCode=0
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.225512 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" event={"ID":"3de04148-0009-427b-8055-a1c5dadb8274","Type":"ContainerDied","Data":"1fa17a686806b34438984f5b5ca80dc2a3bcdaf1f0c62c12a6ee5d914c661e83"}
Dec 03 12:39:41 crc kubenswrapper[4702]: E1203 12:39:41.228481 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c7ed9f1ef289cc5aacad80a6cda5674b5b2d84db5935b0d3f1326bc5cd93e425" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.228781 4702 generic.go:334] "Generic (PLEG): container finished" podID="182ca1cb-9499-4cf7-aeae-c35c7038814c" containerID="6ef1184c4c68f7dc5e1d367ac85a42328ff44e1699b9a00747f3de549e96858b" exitCode=1
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.228835 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w2vmt" event={"ID":"182ca1cb-9499-4cf7-aeae-c35c7038814c","Type":"ContainerDied","Data":"6ef1184c4c68f7dc5e1d367ac85a42328ff44e1699b9a00747f3de549e96858b"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.229381 4702 scope.go:117] "RemoveContainer" containerID="6ef1184c4c68f7dc5e1d367ac85a42328ff44e1699b9a00747f3de549e96858b"
Dec 03 12:39:41 crc kubenswrapper[4702]: E1203 12:39:41.237300 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c7ed9f1ef289cc5aacad80a6cda5674b5b2d84db5935b0d3f1326bc5cd93e425" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.238786 4702 generic.go:334] "Generic (PLEG): container finished" podID="8de75640-5551-4d04-830d-64f0fbb7847a" containerID="90043828fd23987e4d8ab1daec86921e7428498330c4d66bbe0c66f1386100c2" exitCode=1
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.239147 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds" event={"ID":"8de75640-5551-4d04-830d-64f0fbb7847a","Type":"ContainerDied","Data":"90043828fd23987e4d8ab1daec86921e7428498330c4d66bbe0c66f1386100c2"}
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.241877 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-bv8pf" podUID="4be204bf-b480-4d77-9ced-34c6668afa14" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": dial tcp [::1]:29150: connect: connection refused"
Dec 03 12:39:41 crc kubenswrapper[4702]: E1203 12:39:41.244413 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c7ed9f1ef289cc5aacad80a6cda5674b5b2d84db5935b0d3f1326bc5cd93e425" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Dec 03 12:39:41 crc kubenswrapper[4702]: E1203 12:39:41.244482 4702 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="92266ac3-f0a6-4e68-9e88-9aa2900e1fe3" containerName="galera"
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.244634 4702 scope.go:117] "RemoveContainer" containerID="90043828fd23987e4d8ab1daec86921e7428498330c4d66bbe0c66f1386100c2"
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.438125 4702 trace.go:236] Trace[350514804]: "Calculate volume metrics of storage for pod openshift-logging/logging-loki-ingester-0" (03-Dec-2025 12:39:39.842) (total time: 1594ms):
Dec 03 12:39:41 crc kubenswrapper[4702]: Trace[350514804]: [1.594524412s] [1.594524412s] END
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.767920 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-bv8pf"
Dec 03 12:39:41 crc kubenswrapper[4702]: E1203 12:39:41.786214 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9d8dd452e7ea9b777cedcc2818d1bb8e2ae923333f5e8d79d15da1edf56fb290" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Dec 03 12:39:41 crc kubenswrapper[4702]: E1203 12:39:41.790636 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9d8dd452e7ea9b777cedcc2818d1bb8e2ae923333f5e8d79d15da1edf56fb290" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Dec 03 12:39:41 crc kubenswrapper[4702]: E1203 12:39:41.793636 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9d8dd452e7ea9b777cedcc2818d1bb8e2ae923333f5e8d79d15da1edf56fb290" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Dec 03 12:39:41 crc kubenswrapper[4702]: E1203 12:39:41.793738 4702 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="c91e1dc8-ef80-407f-ac34-4c9ab29026f7" containerName="galera"
Dec 03 12:39:41 crc kubenswrapper[4702]: I1203 12:39:41.848154 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0"
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.049687 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.050253 4702 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.050300 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.131021 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0"
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.137035 4702 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-q4rbl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": dial tcp 10.217.0.69:8443: connect: connection refused" start-of-body=
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.137113 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" podUID="44af00fd-b9f6-4e74-ad67-581e4ca7527c" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": dial tcp 10.217.0.69:8443: connect: connection refused"
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.141935 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0"
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.254547 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-v6p66" event={"ID":"5bad766d-e524-4670-b353-56e92df2f744","Type":"ContainerStarted","Data":"55bbb22554ab89778547988014875afb18c4cfed31fae66a1c28c1bcfcc0c90f"}
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.258407 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" event={"ID":"e3e08d4a-20c7-430a-a3d9-988d64e6a6b4","Type":"ContainerStarted","Data":"dd7b9779eeaf9e2adeb100d3a20225857cb6dfb2f99357a9714b43c0ecd69e59"}
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.263172 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hndf6" event={"ID":"05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8","Type":"ContainerStarted","Data":"d037a18646648da24b75a71e6eb41a1377d7f3bd430210146438cc7b4e80fe31"}
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.268324 4702 generic.go:334] "Generic (PLEG): container finished" podID="a49f8d97-9fa5-44b6-bd39-e35d4d70b33c" containerID="1626abe3ed15a198a4e566a115de6e6d881278876d92ba929f1591bfe5f92455" exitCode=0
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.268425 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" event={"ID":"a49f8d97-9fa5-44b6-bd39-e35d4d70b33c","Type":"ContainerDied","Data":"1626abe3ed15a198a4e566a115de6e6d881278876d92ba929f1591bfe5f92455"}
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.273123 4702 generic.go:334] "Generic (PLEG): container finished" podID="7ad9fb7a-5481-4bb8-9a9b-99fda2021704" containerID="57eed73f6a1602742b53d92bb463e26afa819648023f2c4b19c401aeaeb93e4d" exitCode=0
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.273183 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7ad9fb7a-5481-4bb8-9a9b-99fda2021704","Type":"ContainerDied","Data":"57eed73f6a1602742b53d92bb463e26afa819648023f2c4b19c401aeaeb93e4d"}
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.276223 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" event={"ID":"44af00fd-b9f6-4e74-ad67-581e4ca7527c","Type":"ContainerStarted","Data":"b85bc593509389c0af557d23f0a1829392f9f91d940037c1fafec4493710e236"}
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.278985 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" event={"ID":"3de04148-0009-427b-8055-a1c5dadb8274","Type":"ContainerStarted","Data":"70694f13f2aa859b2f579c1bc9009fda82fd5b9b3ab96348baa5a15b27d988a0"}
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.280407 4702 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6kdsp container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body=
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.280454 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" podUID="3de04148-0009-427b-8055-a1c5dadb8274" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused"
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.280303 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp"
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.292639 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" event={"ID":"0276c6fb-ba7a-459f-9610-34a03593669b","Type":"ContainerStarted","Data":"cf088c1942458a7bf7a3e252357a1f87cb313452d4694ae18587dc2c9d631990"}
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.294749 4702 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-xxj8s container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" start-of-body=
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.294901 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" podUID="0276c6fb-ba7a-459f-9610-34a03593669b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused"
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.295528 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s"
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.300130 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-nnrsp_c6edf33c-728d-482f-ad5c-ceb85dae3b75/console-operator/0.log"
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.300460 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" event={"ID":"c6edf33c-728d-482f-ad5c-ceb85dae3b75","Type":"ContainerStarted","Data":"31bf85ff7a79c77549f1b9d6e7129e30c704cda976ab083973fdef31ec931cea"}
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.307605 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq" event={"ID":"ae6dac10-29ba-4bb8-8a0c-68a2bad519af","Type":"ContainerStarted","Data":"a280071f58aabf9e1700053b19747b51f5eb348e675b1190e9fee6898517be22"}
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.323433 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" event={"ID":"fecafd1a-bd80-46ea-8839-dbfc2d364a96","Type":"ContainerStarted","Data":"b078bf021bbed235b16201bbe8822812d4225ee0854e065765083a417e169833"}
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.339039 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-ckjgv" event={"ID":"c43c86a0-692f-406f-871a-24a14f24ed77","Type":"ContainerStarted","Data":"de0e4d779ebd00ff4ccfaf2a5b308935cf01894a680f2d32829f694d6da52733"}
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.352228 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" event={"ID":"3895be1f-db04-45b3-bd8c-cf2ab8c2aa43","Type":"ContainerStarted","Data":"d97409f134800a30c0814b0a58d7a1b4afbcd85fdbe61635fd4d8da34340b5f5"}
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.355834 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf"
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.356986 4702 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fvzsf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body=
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.357111 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" podUID="3895be1f-db04-45b3-bd8c-cf2ab8c2aa43" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused"
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.372444 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"90e9786f-3e0d-4a23-b624-b49a3d386784","Type":"ContainerStarted","Data":"478b3adc3ff302a88c339afebe69dec8bb14c9c0b1ed7b647ac4d009eac2a947"}
Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.538392 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out"
Dec 03 12:39:42 crc kubenswrapper[4702]: E1203 12:39:42.726578 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 57eed73f6a1602742b53d92bb463e26afa819648023f2c4b19c401aeaeb93e4d is running failed: container process not found" containerID="57eed73f6a1602742b53d92bb463e26afa819648023f2c4b19c401aeaeb93e4d" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"]
Dec 03 12:39:42 crc kubenswrapper[4702]: E1203 12:39:42.726831 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 57eed73f6a1602742b53d92bb463e26afa819648023f2c4b19c401aeaeb93e4d is running failed: container process not found" containerID="57eed73f6a1602742b53d92bb463e26afa819648023f2c4b19c401aeaeb93e4d" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"]
Dec 03 12:39:42 crc kubenswrapper[4702]: E1203 12:39:42.727271 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of
57eed73f6a1602742b53d92bb463e26afa819648023f2c4b19c401aeaeb93e4d is running failed: container process not found" containerID="57eed73f6a1602742b53d92bb463e26afa819648023f2c4b19c401aeaeb93e4d" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"] Dec 03 12:39:42 crc kubenswrapper[4702]: E1203 12:39:42.727366 4702 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 57eed73f6a1602742b53d92bb463e26afa819648023f2c4b19c401aeaeb93e4d is running failed: container process not found" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="7ad9fb7a-5481-4bb8-9a9b-99fda2021704" containerName="prometheus" Dec 03 12:39:42 crc kubenswrapper[4702]: E1203 12:39:42.894918 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a00791228ce4ce424da843a1d85119c9bddd3c05a4f70b9dc4fd37074d28ff10 is running failed: container process not found" containerID="a00791228ce4ce424da843a1d85119c9bddd3c05a4f70b9dc4fd37074d28ff10" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 12:39:42 crc kubenswrapper[4702]: E1203 12:39:42.896684 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a00791228ce4ce424da843a1d85119c9bddd3c05a4f70b9dc4fd37074d28ff10 is running failed: container process not found" containerID="a00791228ce4ce424da843a1d85119c9bddd3c05a4f70b9dc4fd37074d28ff10" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 12:39:42 crc kubenswrapper[4702]: I1203 12:39:42.897251 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-sr26m" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="registry-server" probeResult="failure" output=< Dec 03 12:39:42 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:39:42 crc kubenswrapper[4702]: > Dec 03 12:39:42 crc kubenswrapper[4702]: E1203 12:39:42.897457 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a00791228ce4ce424da843a1d85119c9bddd3c05a4f70b9dc4fd37074d28ff10 is running failed: container process not found" containerID="a00791228ce4ce424da843a1d85119c9bddd3c05a4f70b9dc4fd37074d28ff10" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 12:39:42 crc kubenswrapper[4702]: E1203 12:39:42.897528 4702 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a00791228ce4ce424da843a1d85119c9bddd3c05a4f70b9dc4fd37074d28ff10 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-7ntqc" podUID="38c7c63b-db59-4055-aee0-99ea082bd8f7" containerName="registry-server" Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.111358 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-r6jd6" Dec 03 12:39:43 crc kubenswrapper[4702]: E1203 12:39:43.216009 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
fb463bed56004e399bdf91e3fb92de23dca363e6d64d930042f8f75beb45ec4d is running failed: container process not found" containerID="fb463bed56004e399bdf91e3fb92de23dca363e6d64d930042f8f75beb45ec4d" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 12:39:43 crc kubenswrapper[4702]: E1203 12:39:43.216561 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fb463bed56004e399bdf91e3fb92de23dca363e6d64d930042f8f75beb45ec4d is running failed: container process not found" containerID="fb463bed56004e399bdf91e3fb92de23dca363e6d64d930042f8f75beb45ec4d" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 12:39:43 crc kubenswrapper[4702]: E1203 12:39:43.216943 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fb463bed56004e399bdf91e3fb92de23dca363e6d64d930042f8f75beb45ec4d is running failed: container process not found" containerID="fb463bed56004e399bdf91e3fb92de23dca363e6d64d930042f8f75beb45ec4d" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 12:39:43 crc kubenswrapper[4702]: E1203 12:39:43.216996 4702 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fb463bed56004e399bdf91e3fb92de23dca363e6d64d930042f8f75beb45ec4d is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-m6cqt" podUID="0f2dd872-6ac4-4527-9a91-218b1de5ed5e" containerName="registry-server" Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.396394 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-htxmz" event={"ID":"530ef793-9485-4c45-86ba-531906f2085a","Type":"ContainerStarted","Data":"fc4e5e26eb79d9dca5cacffc2883f4e399bb4250d4ab8e64ee1ca95c85c7af7d"} Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.397329 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-htxmz" Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.405286 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" event={"ID":"1d60d4ab-7bac-4fd1-9aad-c07ba1513d41","Type":"ContainerStarted","Data":"28f6c55c5354e72dc1fd306bb20120063ed482fea310e9520bace76f21c492df"} Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.406837 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4" Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.421211 4702 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6kdsp container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" start-of-body= Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.421285 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" podUID="3de04148-0009-427b-8055-a1c5dadb8274" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.421211 4702 patch_prober.go:28] 
interesting pod/prometheus-operator-admission-webhook-f54c54754-q4rbl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": dial tcp 10.217.0.69:8443: connect: connection refused" start-of-body= Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.421338 4702 patch_prober.go:28] interesting pod/route-controller-manager-84cf75c7c5-96cmd container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.421353 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" podUID="44af00fd-b9f6-4e74-ad67-581e4ca7527c" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": dial tcp 10.217.0.69:8443: connect: connection refused" Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.421365 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd" podUID="e3e08d4a-20c7-430a-a3d9-988d64e6a6b4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.421425 4702 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fvzsf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.421434 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.421455 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.421439 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" podUID="3895be1f-db04-45b3-bd8c-cf2ab8c2aa43" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.421886 4702 patch_prober.go:28] interesting pod/controller-manager-548478b8dd-9254p container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body= Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.421907 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-548478b8dd-9254p" 
podUID="fecafd1a-bd80-46ea-8839-dbfc2d364a96" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.421887 4702 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-xxj8s container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" start-of-body= Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.421945 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" podUID="0276c6fb-ba7a-459f-9610-34a03593669b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.43:8443/healthz\": dial tcp 10.217.0.43:8443: connect: connection refused" Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.422514 4702 patch_prober.go:28] interesting pod/console-operator-58897d9998-nnrsp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.422566 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" podUID="c6edf33c-728d-482f-ad5c-ceb85dae3b75" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.463349 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gjmph"] Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.703413 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.703794 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.703865 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/prometheus-metric-storage-0" podUID="90e9786f-3e0d-4a23-b624-b49a3d386784" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.170:9090/-/ready\": dial tcp 10.217.0.170:9090: connect: connection refused" Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.945613 4702 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-fv5c5 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Dec 03 12:39:43 crc kubenswrapper[4702]: I1203 12:39:43.945670 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" podUID="a456460b-47a3-48ef-a98b-4f67709d5939" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.106594 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" 
Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.123788 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="58148f49-2721-4a0a-a5e0-38a2aa23522b" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.432289 4702 generic.go:334] "Generic (PLEG): container finished" podID="92266ac3-f0a6-4e68-9e88-9aa2900e1fe3" containerID="c7ed9f1ef289cc5aacad80a6cda5674b5b2d84db5935b0d3f1326bc5cd93e425" exitCode=137
Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.432365 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3","Type":"ContainerDied","Data":"c7ed9f1ef289cc5aacad80a6cda5674b5b2d84db5935b0d3f1326bc5cd93e425"}
Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.440834 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" event={"ID":"a49f8d97-9fa5-44b6-bd39-e35d4d70b33c","Type":"ContainerStarted","Data":"9ba039d136979219a8472b1622884caac471247a63bd70dc70e091d37e3a4239"}
Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.441012 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7"
Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.444806 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t" event={"ID":"523c06cc-9816-4252-ac00-dc7928dae009","Type":"ContainerStarted","Data":"f227e98598aade75875ded5999d4af117410f2e9e364f8d126b90563344021a2"}
Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.445171 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t"
Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.452232 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w2vmt" event={"ID":"182ca1cb-9499-4cf7-aeae-c35c7038814c","Type":"ContainerStarted","Data":"71fe0068102c19da8102923772218b09f6e381dc837fd4624e599572d0a209fb"}
Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.452516 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w2vmt"
Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.455850 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds" event={"ID":"8de75640-5551-4d04-830d-64f0fbb7847a","Type":"ContainerStarted","Data":"6d1dbcc95cff0182c9bd7104128f6b211cd05ea741b13629e8a2bde03476e8c1"}
Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.456113 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds"
Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.458443 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjmph" event={"ID":"c55aad18-42f1-4b14-a5fb-686c7a669d40","Type":"ContainerStarted","Data":"fc94de6e575e171fb0d79127f5fab5326fc0fd933cfd2e6ee12f3ea209fdcbb5"}
Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.467047 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c" event={"ID":"62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d","Type":"ContainerStarted","Data":"f127d4eb73153506893e01d475ec8b8aa78291af1925e35c8dc47dbd7263c348"}
Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.467298 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c"
Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.470579 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg" event={"ID":"8f6320ff-4661-46be-80e1-8d97f09fe789","Type":"ContainerStarted","Data":"87a7eae69dd9e13f801341270a8c72578dffe683cdb48f214cb0ddbf71b3942e"}
Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.471341 4702 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fvzsf container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body=
Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.471405 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf" podUID="3895be1f-db04-45b3-bd8c-cf2ab8c2aa43" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused"
Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.471480 4702 status_manager.go:317] "Container readiness changed for unknown container" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg" containerID="cri-o://04dfa9ccbc805c8c141db2c3940506e37b3216be0ec5cac7fc187c84a3ff08a6"
Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.471584 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg"
Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.483866 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s"
Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.606526 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-98mxd"
Dec 03 12:39:44 crc kubenswrapper[4702]: I1203 12:39:44.809327 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.203438 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-hndf6"
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.204353 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.204423 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.204769 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.204785 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.205174 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.205194 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.205439 4702 patch_prober.go:28] interesting pod/console-operator-58897d9998-nnrsp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body=
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.205457 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" podUID="c6edf33c-728d-482f-ad5c-ceb85dae3b75" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused"
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.205505 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-nnrsp"
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.205616 4702 patch_prober.go:28] interesting pod/console-operator-58897d9998-nnrsp container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body=
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.205663 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" podUID="c6edf33c-728d-482f-ad5c-ceb85dae3b75" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused"
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.205982 4702 patch_prober.go:28] interesting pod/console-operator-58897d9998-nnrsp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body=
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.206002 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-nnrsp" podUID="c6edf33c-728d-482f-ad5c-ceb85dae3b75" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/readyz\": dial tcp 10.217.0.18:8443: connect: connection refused"
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.214335 4702 patch_prober.go:28] interesting pod/router-default-5444994796-x85q2 container/router namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]backend-http ok
Dec 03 12:39:45 crc kubenswrapper[4702]: [+]has-synced ok
Dec 03 12:39:45 crc kubenswrapper[4702]: [-]process-running failed: reason withheld
Dec 03 12:39:45 crc kubenswrapper[4702]: healthz check failed
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.214405 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-x85q2" podUID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.238635 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fvzsf"
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.384349 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="58148f49-2721-4a0a-a5e0-38a2aa23522b" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.492624 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"7ad9fb7a-5481-4bb8-9a9b-99fda2021704","Type":"ContainerStarted","Data":"5e3717f30c0cf9a5718930758a400bf249ed04cb9cc42453e39b2fe417e7cede"}
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.498711 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5" event={"ID":"a456460b-47a3-48ef-a98b-4f67709d5939","Type":"ContainerStarted","Data":"3339843fab33633b752bb22b44b6c74dce8b3a5518d74c91cd42d9e1ca39e53a"}
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.498777 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5"
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.502321 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg"
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.649798 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.777381 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp"
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.910018 4702 patch_prober.go:28] interesting pod/loki-operator-controller-manager-5688675f7c-q6w79 container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.48:8081/readyz\": dial tcp 10.217.0.48:8081: connect: connection refused" start-of-body=
Dec 03 12:39:45 crc kubenswrapper[4702]: I1203 12:39:45.910338 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" podUID="672e4a37-26c7-4378-a524-57fba88aec53" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.48:8081/readyz\": dial tcp 10.217.0.48:8081: connect: connection refused"
Dec 03 12:39:46 crc kubenswrapper[4702]: I1203 12:39:46.517645 4702 generic.go:334] "Generic (PLEG): container finished" podID="c55aad18-42f1-4b14-a5fb-686c7a669d40" containerID="0ec081121d39447a5f7fd14b4f4cb1e337db237510efd4b77a50cd26be2bb998" exitCode=0
Dec 03 12:39:46 crc kubenswrapper[4702]: I1203 12:39:46.517820 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjmph" event={"ID":"c55aad18-42f1-4b14-a5fb-686c7a669d40","Type":"ContainerDied","Data":"0ec081121d39447a5f7fd14b4f4cb1e337db237510efd4b77a50cd26be2bb998"}
Dec 03 12:39:46 crc kubenswrapper[4702]: I1203 12:39:46.527410 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ds4ss" event={"ID":"9432a2a8-8932-4734-a69d-8976764f1dab","Type":"ContainerStarted","Data":"095d6292d56e847cb6b7763679a4a663fff7bd1a7a58fe7a7789eb0372dc94c0"}
Dec 03 12:39:46 crc kubenswrapper[4702]: I1203 12:39:46.534828 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ntqc" event={"ID":"38c7c63b-db59-4055-aee0-99ea082bd8f7","Type":"ContainerStarted","Data":"eecc1518a781701da9bef3d1b6fa69e588ea8a1a4c289ea9b5d7e139ec1e1dc8"}
Dec 03 12:39:46 crc kubenswrapper[4702]: I1203 12:39:46.596984 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9"
Dec 03 12:39:47 crc kubenswrapper[4702]: I1203 12:39:47.638308 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5444994796-x85q2_2b38e6ac-e12a-4798-b0e9-6321dc926487/router/0.log"
Dec 03 12:39:47 crc kubenswrapper[4702]: I1203 12:39:47.638942 4702 generic.go:334] "Generic (PLEG): container finished" podID="2b38e6ac-e12a-4798-b0e9-6321dc926487" containerID="a0c95ae1cc071401df4dbae2a9b4cb0862a47a5d82d5152774e1cd848052af53" exitCode=137
Dec 03 12:39:47 crc kubenswrapper[4702]: I1203 12:39:47.639047 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-x85q2" event={"ID":"2b38e6ac-e12a-4798-b0e9-6321dc926487","Type":"ContainerDied","Data":"a0c95ae1cc071401df4dbae2a9b4cb0862a47a5d82d5152774e1cd848052af53"}
Dec 03 12:39:47 crc kubenswrapper[4702]: I1203 12:39:47.642871 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6cqt" event={"ID":"0f2dd872-6ac4-4527-9a91-218b1de5ed5e","Type":"ContainerStarted","Data":"6900e3c7b7554c511c538bb23c681933368419cbf8befd3b5b21a116bb89d330"}
Dec 03 12:39:47 crc kubenswrapper[4702]: I1203 12:39:47.646578 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"92266ac3-f0a6-4e68-9e88-9aa2900e1fe3","Type":"ContainerStarted","Data":"57600ba67ad100e5b50c50e5134a48d0371cfd41053f757c3999d43f4fa31f77"}
Dec 03 12:39:47 crc kubenswrapper[4702]: I1203 12:39:47.653136 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6sd9l" event={"ID":"ea8c3262-d494-4427-8228-df9584c00ca1","Type":"ContainerStarted","Data":"fbd2a59fa3382c66260cced14af1c5f0f713e07d537326ec9352b01dd54ae147"}
Dec 03 12:39:47 crc kubenswrapper[4702]: I1203 12:39:47.658318 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qhss9" event={"ID":"480aa817-7d43-4ea8-9099-06bcb431e578","Type":"ContainerStarted","Data":"f5e41c0b4fa5fc3ce5840858e4ff4e8ac2d3b9b461ff91c2cd01d7b9d52c8b52"}
Dec 03 12:39:47 crc kubenswrapper[4702]: I1203 12:39:47.731087 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0"
Dec 03 12:39:48 crc kubenswrapper[4702]: I1203 12:39:48.185888 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-m5trg"
Dec 03 12:39:48 crc kubenswrapper[4702]: I1203 12:39:48.192562 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-htxmz"
Dec 03 12:39:48 crc kubenswrapper[4702]: I1203 12:39:48.248437 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-w2vmt"
Dec 03 12:39:48 crc kubenswrapper[4702]: I1203 12:39:48.290712 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd"
Dec 03 12:39:48 crc kubenswrapper[4702]: I1203 12:39:48.303188 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-548478b8dd-9254p"
Dec 03 12:39:48 crc kubenswrapper[4702]: I1203 12:39:48.312250 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-548478b8dd-9254p"
Dec 03 12:39:48 crc kubenswrapper[4702]: I1203 12:39:48.312577 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-84cf75c7c5-96cmd"
Dec 03 12:39:48 crc kubenswrapper[4702]: I1203 12:39:48.385624 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="58148f49-2721-4a0a-a5e0-38a2aa23522b" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 03 12:39:48 crc kubenswrapper[4702]: I1203 12:39:48.385713 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/cinder-scheduler-0"
Dec 03 12:39:48 crc kubenswrapper[4702]: I1203 12:39:48.464661 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-lp88c"
Dec 03 12:39:48 crc kubenswrapper[4702]: E1203 12:39:48.513030 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-conmon-57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84fc908a_9418_4e6e_ac17_9e725524f9ce.slice/crio-d8c4e0cc738e522968136e6f8f489ad82fb59f039b20f2c64f7e9b0f049cd07c.scope\": RecentStats: unable to find data in memory cache]"
Dec 03 12:39:48 crc kubenswrapper[4702]: I1203 12:39:48.546592 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq"
Dec 03 12:39:48 crc kubenswrapper[4702]: I1203 12:39:48.580381 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-gqqgw"
Dec 03 12:39:48 crc kubenswrapper[4702]: I1203 12:39:48.672528 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5444994796-x85q2_2b38e6ac-e12a-4798-b0e9-6321dc926487/router/0.log"
Dec 03 12:39:48 crc kubenswrapper[4702]: I1203 12:39:48.672688 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-x85q2" event={"ID":"2b38e6ac-e12a-4798-b0e9-6321dc926487","Type":"ContainerStarted","Data":"c151a90c17a215b77d88ff83ffa0d2a040f0664ba74f7286567bbec3298397de"}
Dec 03 12:39:49 crc kubenswrapper[4702]: I1203 12:39:49.003969 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-ds4ss"
Dec 03 12:39:49 crc kubenswrapper[4702]: I1203 12:39:49.004062 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-ds4ss"
Dec 03 12:39:49 crc kubenswrapper[4702]: I1203 12:39:49.083723 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cinder-scheduler" containerStatusID={"Type":"cri-o","ID":"5f108ef4356c88932b0eba73bb53c53d7024bb8c067d721f3a95f847709f1121"} pod="openstack/cinder-scheduler-0" containerMessage="Container cinder-scheduler failed liveness probe, will be restarted"
Dec 03 12:39:49 crc kubenswrapper[4702]: I1203 12:39:49.083978 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="58148f49-2721-4a0a-a5e0-38a2aa23522b" containerName="cinder-scheduler" containerID="cri-o://5f108ef4356c88932b0eba73bb53c53d7024bb8c067d721f3a95f847709f1121" gracePeriod=30
Dec 03 12:39:49 crc kubenswrapper[4702]: I1203 12:39:49.084890 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-75b4565ff4-4pl92"
Dec 03 12:39:49 crc kubenswrapper[4702]: I1203 12:39:49.087714 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-4pkkr"
Dec 03 12:39:49 crc kubenswrapper[4702]: I1203 12:39:49.089264 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-hpf6t"
Dec 03 12:39:49 crc kubenswrapper[4702]: I1203 12:39:49.090083 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-c98f8bd-8mv9c"
Dec 03 12:39:49 crc kubenswrapper[4702]: I1203 12:39:49.101356 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-kg6p7"
Dec 03 12:39:49 crc kubenswrapper[4702]: I1203 12:39:49.183449 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-vpr8z"
Dec 03 12:39:49 crc kubenswrapper[4702]: I1203 12:39:49.475130 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-ds4ss"
Dec 03 12:39:49 crc kubenswrapper[4702]: I1203 12:39:49.516584 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-7xg4t"
Dec 03 12:39:49 crc kubenswrapper[4702]: I1203 12:39:49.516846 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-t27c4"
Dec 03 12:39:49 crc kubenswrapper[4702]: I1203 12:39:49.520647 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-m2bfb"
Dec 03 12:39:49 crc kubenswrapper[4702]: I1203 12:39:49.566748 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-ntzds"
Dec 03 12:39:49 crc kubenswrapper[4702]: I1203 12:39:49.581033 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp"
Dec 03 12:39:49 crc kubenswrapper[4702]: I1203 12:39:49.707555 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-2pcqv"
Dec 03 12:39:49 crc kubenswrapper[4702]: I1203 12:39:49.756634 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-vz7gf"
Dec 03 12:39:51 crc kubenswrapper[4702]: E1203 12:39:51.742641 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84fc908a_9418_4e6e_ac17_9e725524f9ce.slice/crio-d8c4e0cc738e522968136e6f8f489ad82fb59f039b20f2c64f7e9b0f049cd07c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-conmon-57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca.scope\": RecentStats: unable to find data in memory cache]"
Dec 03 12:39:51 crc kubenswrapper[4702]: E1203 12:39:51.788600 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9d8dd452e7ea9b777cedcc2818d1bb8e2ae923333f5e8d79d15da1edf56fb290" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Dec 03 12:39:51 crc kubenswrapper[4702]: E1203 12:39:51.795063 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9d8dd452e7ea9b777cedcc2818d1bb8e2ae923333f5e8d79d15da1edf56fb290" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Dec 03 12:39:51 crc kubenswrapper[4702]: E1203 12:39:51.798011 4702 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9d8dd452e7ea9b777cedcc2818d1bb8e2ae923333f5e8d79d15da1edf56fb290" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Dec 03 12:39:51 crc kubenswrapper[4702]: E1203 12:39:51.798152 4702 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="c91e1dc8-ef80-407f-ac34-4c9ab29026f7" containerName="galera"
Dec 03 12:39:52 crc kubenswrapper[4702]: I1203 12:39:52.323075 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6sd9l"
Dec 03 12:39:52 crc kubenswrapper[4702]: I1203 12:39:52.323382 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6sd9l"
Dec 03 12:39:52 crc kubenswrapper[4702]: I1203 12:39:52.323398 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qhss9"
Dec 03 12:39:52 crc kubenswrapper[4702]: I1203 12:39:52.323408 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qhss9"
Dec 03 12:39:52 crc kubenswrapper[4702]: I1203 12:39:52.323418 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Dec 03 12:39:52 crc kubenswrapper[4702]: I1203 12:39:52.323427 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Dec 03 12:39:52 crc kubenswrapper[4702]: I1203 12:39:52.323466 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-f8bdcbf7f-4tp6n"
Dec 03 12:39:52 crc kubenswrapper[4702]: I1203 12:39:52.323477 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq"
Dec 03 12:39:52 crc kubenswrapper[4702]: I1203 12:39:52.323500 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fv5c5"
Dec 03 12:39:52 crc kubenswrapper[4702]: I1203 12:39:52.323520 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-2npsf"
Dec 03 12:39:52 crc kubenswrapper[4702]: I1203 12:39:52.323532 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl"
Dec 03 12:39:52 crc kubenswrapper[4702]: I1203 12:39:52.323560 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-nj4tn"
Dec 03 12:39:52 crc kubenswrapper[4702]: I1203 12:39:52.348359 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl"
Dec 03 12:39:52 crc kubenswrapper[4702]: I1203 12:39:52.350424 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-xlpkq"
Dec 03 12:39:52 crc kubenswrapper[4702]: I1203 12:39:52.353589 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-bv8pf"
Dec 03 12:39:52 crc kubenswrapper[4702]: I1203 12:39:52.360089 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 12:39:52 crc kubenswrapper[4702]: I1203 12:39:52.374646 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 12:39:52 crc kubenswrapper[4702]: I1203 12:39:52.412417 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-ds4ss"
Dec 03 12:39:52 crc kubenswrapper[4702]: I1203 12:39:52.892196 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7ntqc"
Dec 03 12:39:52 crc kubenswrapper[4702]: I1203 12:39:52.892614 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7ntqc"
Dec 03 12:39:53 crc kubenswrapper[4702]: I1203 12:39:53.214576 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m6cqt"
Dec 03 12:39:53 crc kubenswrapper[4702]: I1203 12:39:53.214642 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m6cqt"
Dec 03 12:39:53 crc kubenswrapper[4702]: I1203 12:39:53.369484 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-sr26m" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="registry-server" probeResult="failure" output=<
Dec 03 12:39:53 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s
Dec 03 12:39:53 crc kubenswrapper[4702]: >
Dec 03 12:39:53 crc kubenswrapper[4702]: I1203 12:39:53.379052 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qhss9" podUID="480aa817-7d43-4ea8-9099-06bcb431e578" containerName="registry-server" probeResult="failure" output=<
Dec 03 12:39:53 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s
Dec 03 12:39:53 crc kubenswrapper[4702]: >
Dec 03 12:39:53 crc kubenswrapper[4702]: I1203 12:39:53.402378 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-6sd9l" podUID="ea8c3262-d494-4427-8228-df9584c00ca1" containerName="registry-server" probeResult="failure" output=<
Dec 03 12:39:53 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s
Dec 03 12:39:53 crc kubenswrapper[4702]: >
Dec 03 12:39:53 crc kubenswrapper[4702]: I1203 12:39:53.972682 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7ntqc" podUID="38c7c63b-db59-4055-aee0-99ea082bd8f7" containerName="registry-server" probeResult="failure" output=<
Dec 03 12:39:53 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s
Dec 03 12:39:53 crc kubenswrapper[4702]: >
Dec 03 12:39:54 crc kubenswrapper[4702]: I1203 12:39:54.310693 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-m6cqt" podUID="0f2dd872-6ac4-4527-9a91-218b1de5ed5e" containerName="registry-server" probeResult="failure" output=<
Dec 03 12:39:54 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s
Dec 03 12:39:54 crc kubenswrapper[4702]: >
Dec 03 12:39:55 crc kubenswrapper[4702]: I1203 12:39:55.128074 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Dec 03 12:39:55 crc kubenswrapper[4702]: I1203 12:39:55.128379 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Dec 03 12:39:55 crc kubenswrapper[4702]: I1203 12:39:55.128223 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Dec 03 12:39:55 crc kubenswrapper[4702]: I1203 12:39:55.128470 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Dec 03 12:39:55 crc kubenswrapper[4702]: I1203 12:39:55.174237 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-x85q2"
Dec 03 12:39:55 crc kubenswrapper[4702]: I1203 12:39:55.198631 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-nnrsp"
Dec 03 12:39:55 crc kubenswrapper[4702]: I1203 12:39:55.242615 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" podUID="0c75375b-08b0-4a81-adca-de576c8ff268" containerName="oauth-openshift" containerID="cri-o://ed23f0bb952245c24c0ac2a111b381a15a1284a0fcaa9a57735a4ef6e66d92d0" gracePeriod=15
Dec 03 12:39:55 crc kubenswrapper[4702]: I1203 12:39:55.912823 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 12:39:55 crc kubenswrapper[4702]: I1203 12:39:55.912904 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 12:39:55 crc kubenswrapper[4702]: I1203 12:39:55.919796 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79"
Dec 03 12:39:56 crc kubenswrapper[4702]: I1203 12:39:56.167647 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-x85q2"
Dec 03 12:39:56 crc kubenswrapper[4702]: I1203 12:39:56.171161 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-x85q2"
Dec 03 12:39:56 crc kubenswrapper[4702]: I1203 12:39:56.960152 4702 generic.go:334] "Generic (PLEG): container finished" podID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerID="a3ebf64628d67a49aab3d7c8c37af7c6fbdf0e61098265182dbf34743899b6e5" exitCode=137
Dec 03 12:39:56 crc kubenswrapper[4702]: I1203 12:39:56.960537 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dd6e551-e7d9-4f55-a878-bd36db9707e8","Type":"ContainerDied","Data":"a3ebf64628d67a49aab3d7c8c37af7c6fbdf0e61098265182dbf34743899b6e5"}
Dec 03 12:39:56 crc kubenswrapper[4702]: I1203 12:39:56.960718 4702 scope.go:117] "RemoveContainer" containerID="159ee3d241a321aed89bcfa644d4fed939819784a2a494203acf7b31c0a66347"
Dec 03 12:39:56 crc kubenswrapper[4702]: I1203 12:39:56.963870 4702 generic.go:334] "Generic (PLEG): container finished" podID="c91e1dc8-ef80-407f-ac34-4c9ab29026f7" containerID="9d8dd452e7ea9b777cedcc2818d1bb8e2ae923333f5e8d79d15da1edf56fb290" exitCode=137
Dec 03 12:39:56 crc kubenswrapper[4702]: I1203 12:39:56.964405 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c91e1dc8-ef80-407f-ac34-4c9ab29026f7","Type":"ContainerDied","Data":"9d8dd452e7ea9b777cedcc2818d1bb8e2ae923333f5e8d79d15da1edf56fb290"}
Dec 03 12:39:56 crc kubenswrapper[4702]: I1203 12:39:56.968796 4702 generic.go:334] "Generic (PLEG): container finished" podID="0c75375b-08b0-4a81-adca-de576c8ff268" containerID="ed23f0bb952245c24c0ac2a111b381a15a1284a0fcaa9a57735a4ef6e66d92d0" exitCode=0
Dec 03 12:39:56 crc kubenswrapper[4702]: I1203 12:39:56.969035 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" event={"ID":"0c75375b-08b0-4a81-adca-de576c8ff268","Type":"ContainerDied","Data":"ed23f0bb952245c24c0ac2a111b381a15a1284a0fcaa9a57735a4ef6e66d92d0"}
Dec 03 12:39:56 crc kubenswrapper[4702]: I1203 12:39:56.969517 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-x85q2"
Dec 03 12:39:56 crc kubenswrapper[4702]: I1203 12:39:56.975331 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-x85q2"
Dec 03 12:39:57 crc kubenswrapper[4702]: I1203 12:39:57.983300 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"c91e1dc8-ef80-407f-ac34-4c9ab29026f7","Type":"ContainerStarted","Data":"fe20d9d96d305ee9dd6f3b14c9d70dba8a0bbd49644786b3ec475d89c9e85f99"}
Dec 03 12:39:58 crc kubenswrapper[4702]: I1203 12:39:58.710673 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Dec 03 12:39:58 crc kubenswrapper[4702]: I1203 12:39:58.718067 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Dec 03 12:39:59 crc kubenswrapper[4702]: I1203 12:39:59.114951 4702 patch_prober.go:28] interesting pod/oauth-openshift-6548f7c795-c9kwd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.62:6443/healthz\": dial tcp 10.217.0.62:6443: connect: connection refused" start-of-body=
Dec 03 12:39:59 crc kubenswrapper[4702]: I1203 12:39:59.115022 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" podUID="0c75375b-08b0-4a81-adca-de576c8ff268" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.62:6443/healthz\": dial tcp 10.217.0.62:6443: connect: connection refused"
Dec 03 12:40:00 crc kubenswrapper[4702]: I1203 12:40:00.015888 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjmph" event={"ID":"c55aad18-42f1-4b14-a5fb-686c7a669d40","Type":"ContainerStarted","Data":"e1c884863c69adffab91e04bd0b0bc6496ff63c33e8430a6407ecb1e716839e0"}
Dec 03 12:40:00 crc kubenswrapper[4702]: I1203 12:40:00.019208 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" event={"ID":"0c75375b-08b0-4a81-adca-de576c8ff268","Type":"ContainerStarted","Data":"5f96aafdbf386a6726c3edbd5e8e9d7a609ec773631d488d054106ebb3375875"}
Dec 03 12:40:00 crc kubenswrapper[4702]: I1203 12:40:00.019532 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd"
Dec 03 12:40:00 crc kubenswrapper[4702]: I1203 12:40:00.019781 4702 patch_prober.go:28] interesting pod/oauth-openshift-6548f7c795-c9kwd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.62:6443/healthz\": dial tcp 10.217.0.62:6443: connect: connection refused" start-of-body=
Dec 03 12:40:00 crc kubenswrapper[4702]: I1203 12:40:00.019825 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd" podUID="0c75375b-08b0-4a81-adca-de576c8ff268" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.62:6443/healthz\": dial tcp 10.217.0.62:6443: connect: connection refused"
Dec 03 12:40:01 crc kubenswrapper[4702]: I1203 12:40:01.597336 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6548f7c795-c9kwd"
Dec 03 12:40:01 crc kubenswrapper[4702]: I1203 12:40:01.619895 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-6sd9l" podUID="ea8c3262-d494-4427-8228-df9584c00ca1" containerName="registry-server" probeResult="failure" output=<
Dec 03 12:40:01 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s
Dec 03 12:40:01 crc kubenswrapper[4702]: >
Dec 03 12:40:01 crc kubenswrapper[4702]: I1203 12:40:01.781409 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Dec 03 12:40:01 crc kubenswrapper[4702]: I1203 12:40:01.781476 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Dec 03 12:40:01 crc kubenswrapper[4702]: I1203 12:40:01.820462 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qhss9" podUID="480aa817-7d43-4ea8-9099-06bcb431e578" containerName="registry-server" probeResult="failure" output=<
Dec 03 12:40:01 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s
Dec 03 12:40:01 crc kubenswrapper[4702]: >
Dec 03 12:40:02 crc kubenswrapper[4702]: I1203 12:40:02.618380 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-sr26m" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="registry-server" probeResult="failure" output=<
Dec 03 12:40:02 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s
Dec 03 12:40:02 crc kubenswrapper[4702]: >
Dec 03 12:40:02 crc kubenswrapper[4702]: E1203 12:40:02.682002 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-conmon-57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84fc908a_9418_4e6e_ac17_9e725524f9ce.slice/crio-d8c4e0cc738e522968136e6f8f489ad82fb59f039b20f2c64f7e9b0f049cd07c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca.scope\": RecentStats: unable to find data in memory cache]" Dec 03 12:40:02 crc kubenswrapper[4702]: E1203 12:40:02.899826 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-conmon-57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84fc908a_9418_4e6e_ac17_9e725524f9ce.slice/crio-d8c4e0cc738e522968136e6f8f489ad82fb59f039b20f2c64f7e9b0f049cd07c.scope\": RecentStats: unable to find data in memory cache]" Dec 03 12:40:04 crc kubenswrapper[4702]: I1203 12:40:04.745941 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v8x6x"] Dec 03 12:40:04 crc kubenswrapper[4702]: I1203 12:40:04.754686 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v8x6x" Dec 03 12:40:04 crc kubenswrapper[4702]: I1203 12:40:04.793624 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v8x6x"] Dec 03 12:40:04 crc kubenswrapper[4702]: I1203 12:40:04.870155 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f228995-3447-4e6c-bff8-81d1a1a2f8d2-catalog-content\") pod \"redhat-marketplace-v8x6x\" (UID: \"9f228995-3447-4e6c-bff8-81d1a1a2f8d2\") " pod="openshift-marketplace/redhat-marketplace-v8x6x" Dec 03 12:40:04 crc kubenswrapper[4702]: I1203 12:40:04.870245 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f228995-3447-4e6c-bff8-81d1a1a2f8d2-utilities\") pod \"redhat-marketplace-v8x6x\" (UID: \"9f228995-3447-4e6c-bff8-81d1a1a2f8d2\") " pod="openshift-marketplace/redhat-marketplace-v8x6x" Dec 03 12:40:04 crc kubenswrapper[4702]: I1203 12:40:04.870409 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wxfm\" (UniqueName: \"kubernetes.io/projected/9f228995-3447-4e6c-bff8-81d1a1a2f8d2-kube-api-access-5wxfm\") pod \"redhat-marketplace-v8x6x\" (UID: \"9f228995-3447-4e6c-bff8-81d1a1a2f8d2\") " pod="openshift-marketplace/redhat-marketplace-v8x6x" Dec 03 12:40:04 crc kubenswrapper[4702]: I1203 12:40:04.975081 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f228995-3447-4e6c-bff8-81d1a1a2f8d2-catalog-content\") pod \"redhat-marketplace-v8x6x\" (UID: \"9f228995-3447-4e6c-bff8-81d1a1a2f8d2\") " pod="openshift-marketplace/redhat-marketplace-v8x6x" Dec 03 12:40:04 crc kubenswrapper[4702]: I1203 12:40:04.975151 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f228995-3447-4e6c-bff8-81d1a1a2f8d2-utilities\") pod \"redhat-marketplace-v8x6x\" (UID: \"9f228995-3447-4e6c-bff8-81d1a1a2f8d2\") " pod="openshift-marketplace/redhat-marketplace-v8x6x" Dec 03 12:40:04 crc kubenswrapper[4702]: I1203 12:40:04.975445 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wxfm\" (UniqueName: \"kubernetes.io/projected/9f228995-3447-4e6c-bff8-81d1a1a2f8d2-kube-api-access-5wxfm\") pod \"redhat-marketplace-v8x6x\" (UID: \"9f228995-3447-4e6c-bff8-81d1a1a2f8d2\") " pod="openshift-marketplace/redhat-marketplace-v8x6x" Dec 03 12:40:04 crc kubenswrapper[4702]: I1203 12:40:04.975796 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f228995-3447-4e6c-bff8-81d1a1a2f8d2-utilities\") pod \"redhat-marketplace-v8x6x\" (UID: \"9f228995-3447-4e6c-bff8-81d1a1a2f8d2\") " pod="openshift-marketplace/redhat-marketplace-v8x6x" Dec 03 12:40:04 crc kubenswrapper[4702]: I1203 12:40:04.975981 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f228995-3447-4e6c-bff8-81d1a1a2f8d2-catalog-content\") pod \"redhat-marketplace-v8x6x\" (UID: \"9f228995-3447-4e6c-bff8-81d1a1a2f8d2\") " pod="openshift-marketplace/redhat-marketplace-v8x6x" Dec 03 12:40:05 crc kubenswrapper[4702]: I1203 12:40:05.015278 4702 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5wxfm\" (UniqueName: \"kubernetes.io/projected/9f228995-3447-4e6c-bff8-81d1a1a2f8d2-kube-api-access-5wxfm\") pod \"redhat-marketplace-v8x6x\" (UID: \"9f228995-3447-4e6c-bff8-81d1a1a2f8d2\") " pod="openshift-marketplace/redhat-marketplace-v8x6x" Dec 03 12:40:05 crc kubenswrapper[4702]: I1203 12:40:05.082574 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-m6cqt" podUID="0f2dd872-6ac4-4527-9a91-218b1de5ed5e" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:05 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:05 crc kubenswrapper[4702]: > Dec 03 12:40:05 crc kubenswrapper[4702]: I1203 12:40:05.087471 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v8x6x" Dec 03 12:40:05 crc kubenswrapper[4702]: I1203 12:40:05.127607 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 12:40:05 crc kubenswrapper[4702]: I1203 12:40:05.127669 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 12:40:05 crc kubenswrapper[4702]: I1203 12:40:05.127615 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 12:40:05 crc kubenswrapper[4702]: I1203 12:40:05.128035 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 12:40:05 crc kubenswrapper[4702]: I1203 12:40:05.128074 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-hndf6" Dec 03 12:40:05 crc kubenswrapper[4702]: I1203 12:40:05.129143 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"d037a18646648da24b75a71e6eb41a1377d7f3bd430210146438cc7b4e80fe31"} pod="openshift-console/downloads-7954f5f757-hndf6" containerMessage="Container download-server failed liveness probe, will be restarted" Dec 03 12:40:05 crc kubenswrapper[4702]: I1203 12:40:05.129196 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" containerID="cri-o://d037a18646648da24b75a71e6eb41a1377d7f3bd430210146438cc7b4e80fe31" gracePeriod=2 Dec 03 12:40:05 crc kubenswrapper[4702]: I1203 12:40:05.129854 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: 
connection refused" start-of-body= Dec 03 12:40:05 crc kubenswrapper[4702]: I1203 12:40:05.129946 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 12:40:05 crc kubenswrapper[4702]: I1203 12:40:05.163849 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7ntqc" podUID="38c7c63b-db59-4055-aee0-99ea082bd8f7" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:05 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:05 crc kubenswrapper[4702]: > Dec 03 12:40:05 crc kubenswrapper[4702]: I1203 12:40:05.168788 4702 generic.go:334] "Generic (PLEG): container finished" podID="a35fd719-e341-49b9-b12f-f39f2402868b" containerID="771fa941e4419f217c65a63b8471ce0a7afc720670abfe4560184430bbd27a7f" exitCode=1 Dec 03 12:40:05 crc kubenswrapper[4702]: I1203 12:40:05.168892 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a35fd719-e341-49b9-b12f-f39f2402868b","Type":"ContainerDied","Data":"771fa941e4419f217c65a63b8471ce0a7afc720670abfe4560184430bbd27a7f"} Dec 03 12:40:05 crc kubenswrapper[4702]: I1203 12:40:05.173201 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dd6e551-e7d9-4f55-a878-bd36db9707e8","Type":"ContainerStarted","Data":"4dbbd4353a8fb940122df2154e6b5e8a889c0e3c37890c611cd2d9effb933a4e"} Dec 03 12:40:05 crc kubenswrapper[4702]: I1203 12:40:05.176324 4702 generic.go:334] "Generic (PLEG): container finished" podID="042c7f5b-da64-4f42-a2b2-58d04b73c12a" containerID="7d68b0986741379ab8deb19834da932ca36898d987fd53602fe7466e084f46aa" exitCode=137 Dec 03 12:40:05 crc kubenswrapper[4702]: I1203 12:40:05.176372 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pscld" event={"ID":"042c7f5b-da64-4f42-a2b2-58d04b73c12a","Type":"ContainerDied","Data":"7d68b0986741379ab8deb19834da932ca36898d987fd53602fe7466e084f46aa"} Dec 03 12:40:06 crc kubenswrapper[4702]: I1203 12:40:06.948984 4702 patch_prober.go:28] interesting pod/loki-operator-controller-manager-5688675f7c-q6w79 container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.48:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:40:06 crc kubenswrapper[4702]: I1203 12:40:06.949424 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-5688675f7c-q6w79" podUID="672e4a37-26c7-4378-a524-57fba88aec53" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.48:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:40:08 crc kubenswrapper[4702]: I1203 12:40:08.178014 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v8x6x"] Dec 03 12:40:08 crc kubenswrapper[4702]: I1203 12:40:08.238745 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-85d7874b49-jvs5t" Dec 03 12:40:08 crc kubenswrapper[4702]: I1203 12:40:08.267110 4702 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_downloads-7954f5f757-hndf6_05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8/download-server/2.log" Dec 03 12:40:08 crc kubenswrapper[4702]: I1203 12:40:08.267574 4702 generic.go:334] "Generic (PLEG): container finished" podID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerID="d037a18646648da24b75a71e6eb41a1377d7f3bd430210146438cc7b4e80fe31" exitCode=137 Dec 03 12:40:08 crc kubenswrapper[4702]: I1203 12:40:08.267621 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hndf6" event={"ID":"05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8","Type":"ContainerDied","Data":"d037a18646648da24b75a71e6eb41a1377d7f3bd430210146438cc7b4e80fe31"} Dec 03 12:40:08 crc kubenswrapper[4702]: I1203 12:40:08.267663 4702 scope.go:117] "RemoveContainer" containerID="ccb878a03690ecae62364ca11922e2d74ca8a555b1b3a95db9a68827c0d60357" Dec 03 12:40:08 crc kubenswrapper[4702]: I1203 12:40:08.434553 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 12:40:09 crc kubenswrapper[4702]: I1203 12:40:09.282685 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8x6x" event={"ID":"9f228995-3447-4e6c-bff8-81d1a1a2f8d2","Type":"ContainerStarted","Data":"0247eb22b45514a6f5d1c3d0f47954337e3d461f862240cb5a6ef38eecb3d08e"} Dec 03 12:40:09 crc kubenswrapper[4702]: I1203 12:40:09.285391 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"a35fd719-e341-49b9-b12f-f39f2402868b","Type":"ContainerDied","Data":"2e2be505b314356dd3c8c6970b5ad97cd73f3cd9ce4af62d2455a0598d70d78e"} Dec 03 12:40:09 crc kubenswrapper[4702]: I1203 12:40:09.285458 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 03 12:40:09 crc kubenswrapper[4702]: I1203 12:40:09.819977 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e2be505b314356dd3c8c6970b5ad97cd73f3cd9ce4af62d2455a0598d70d78e" Dec 03 12:40:09 crc kubenswrapper[4702]: I1203 12:40:09.820165 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cflnc\" (UniqueName: \"kubernetes.io/projected/a35fd719-e341-49b9-b12f-f39f2402868b-kube-api-access-cflnc\") pod \"a35fd719-e341-49b9-b12f-f39f2402868b\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " Dec 03 12:40:09 crc kubenswrapper[4702]: I1203 12:40:09.820341 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a35fd719-e341-49b9-b12f-f39f2402868b-test-operator-ephemeral-temporary\") pod \"a35fd719-e341-49b9-b12f-f39f2402868b\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " Dec 03 12:40:09 crc kubenswrapper[4702]: I1203 12:40:09.820399 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a35fd719-e341-49b9-b12f-f39f2402868b-config-data\") pod \"a35fd719-e341-49b9-b12f-f39f2402868b\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " Dec 03 12:40:09 crc kubenswrapper[4702]: I1203 12:40:09.820500 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"a35fd719-e341-49b9-b12f-f39f2402868b\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " Dec 03 12:40:09 crc kubenswrapper[4702]: I1203 12:40:09.820585 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a35fd719-e341-49b9-b12f-f39f2402868b-test-operator-ephemeral-workdir\") pod \"a35fd719-e341-49b9-b12f-f39f2402868b\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " Dec 03 12:40:09 crc kubenswrapper[4702]: I1203 12:40:09.820657 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a35fd719-e341-49b9-b12f-f39f2402868b-ca-certs\") pod \"a35fd719-e341-49b9-b12f-f39f2402868b\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " Dec 03 12:40:09 crc kubenswrapper[4702]: I1203 12:40:09.820733 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a35fd719-e341-49b9-b12f-f39f2402868b-openstack-config-secret\") pod \"a35fd719-e341-49b9-b12f-f39f2402868b\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " Dec 03 12:40:09 crc kubenswrapper[4702]: I1203 12:40:09.820795 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a35fd719-e341-49b9-b12f-f39f2402868b-ssh-key\") pod \"a35fd719-e341-49b9-b12f-f39f2402868b\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " Dec 03 12:40:09 crc kubenswrapper[4702]: I1203 12:40:09.820872 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a35fd719-e341-49b9-b12f-f39f2402868b-openstack-config\") pod \"a35fd719-e341-49b9-b12f-f39f2402868b\" (UID: \"a35fd719-e341-49b9-b12f-f39f2402868b\") " Dec 03 12:40:09 crc kubenswrapper[4702]: 
I1203 12:40:09.821670 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a35fd719-e341-49b9-b12f-f39f2402868b-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "a35fd719-e341-49b9-b12f-f39f2402868b" (UID: "a35fd719-e341-49b9-b12f-f39f2402868b"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:40:09 crc kubenswrapper[4702]: I1203 12:40:09.825570 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a35fd719-e341-49b9-b12f-f39f2402868b-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "a35fd719-e341-49b9-b12f-f39f2402868b" (UID: "a35fd719-e341-49b9-b12f-f39f2402868b"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:40:09 crc kubenswrapper[4702]: I1203 12:40:09.825988 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a35fd719-e341-49b9-b12f-f39f2402868b-config-data" (OuterVolumeSpecName: "config-data") pod "a35fd719-e341-49b9-b12f-f39f2402868b" (UID: "a35fd719-e341-49b9-b12f-f39f2402868b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:40:09 crc kubenswrapper[4702]: I1203 12:40:09.874563 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a35fd719-e341-49b9-b12f-f39f2402868b-kube-api-access-cflnc" (OuterVolumeSpecName: "kube-api-access-cflnc") pod "a35fd719-e341-49b9-b12f-f39f2402868b" (UID: "a35fd719-e341-49b9-b12f-f39f2402868b"). InnerVolumeSpecName "kube-api-access-cflnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:40:09 crc kubenswrapper[4702]: I1203 12:40:09.879886 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "test-operator-logs") pod "a35fd719-e341-49b9-b12f-f39f2402868b" (UID: "a35fd719-e341-49b9-b12f-f39f2402868b"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 12:40:09 crc kubenswrapper[4702]: I1203 12:40:09.944859 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cflnc\" (UniqueName: \"kubernetes.io/projected/a35fd719-e341-49b9-b12f-f39f2402868b-kube-api-access-cflnc\") on node \"crc\" DevicePath \"\"" Dec 03 12:40:09 crc kubenswrapper[4702]: I1203 12:40:09.944942 4702 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/a35fd719-e341-49b9-b12f-f39f2402868b-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 03 12:40:09 crc kubenswrapper[4702]: I1203 12:40:09.944963 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a35fd719-e341-49b9-b12f-f39f2402868b-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:40:09 crc kubenswrapper[4702]: I1203 12:40:09.945004 4702 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Dec 03 12:40:09 crc kubenswrapper[4702]: I1203 12:40:09.945021 4702 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/a35fd719-e341-49b9-b12f-f39f2402868b-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 03 12:40:10 crc kubenswrapper[4702]: I1203 12:40:10.040487 4702 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Dec 03 12:40:10 crc kubenswrapper[4702]: I1203 12:40:10.057469 4702 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Dec 03 12:40:10 crc kubenswrapper[4702]: I1203 12:40:10.076955 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a35fd719-e341-49b9-b12f-f39f2402868b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a35fd719-e341-49b9-b12f-f39f2402868b" (UID: "a35fd719-e341-49b9-b12f-f39f2402868b"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:40:10 crc kubenswrapper[4702]: I1203 12:40:10.160415 4702 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a35fd719-e341-49b9-b12f-f39f2402868b-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 12:40:10 crc kubenswrapper[4702]: I1203 12:40:10.178624 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a35fd719-e341-49b9-b12f-f39f2402868b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a35fd719-e341-49b9-b12f-f39f2402868b" (UID: "a35fd719-e341-49b9-b12f-f39f2402868b"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:40:10 crc kubenswrapper[4702]: I1203 12:40:10.199028 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a35fd719-e341-49b9-b12f-f39f2402868b-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "a35fd719-e341-49b9-b12f-f39f2402868b" (UID: "a35fd719-e341-49b9-b12f-f39f2402868b"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:40:10 crc kubenswrapper[4702]: I1203 12:40:10.205537 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a35fd719-e341-49b9-b12f-f39f2402868b-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a35fd719-e341-49b9-b12f-f39f2402868b" (UID: "a35fd719-e341-49b9-b12f-f39f2402868b"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:40:10 crc kubenswrapper[4702]: I1203 12:40:10.264684 4702 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/a35fd719-e341-49b9-b12f-f39f2402868b-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 03 12:40:10 crc kubenswrapper[4702]: I1203 12:40:10.265190 4702 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a35fd719-e341-49b9-b12f-f39f2402868b-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 03 12:40:10 crc kubenswrapper[4702]: I1203 12:40:10.265291 4702 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a35fd719-e341-49b9-b12f-f39f2402868b-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 03 12:40:11 crc kubenswrapper[4702]: I1203 12:40:11.746513 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-6sd9l" podUID="ea8c3262-d494-4427-8228-df9584c00ca1" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:11 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:11 crc kubenswrapper[4702]: > Dec 03 12:40:11 crc kubenswrapper[4702]: I1203 12:40:11.927138 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qhss9" podUID="480aa817-7d43-4ea8-9099-06bcb431e578" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:11 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:11 crc kubenswrapper[4702]: > Dec 03 12:40:12 crc kubenswrapper[4702]: I1203 12:40:12.326307 4702 generic.go:334] "Generic (PLEG): container finished" podID="58148f49-2721-4a0a-a5e0-38a2aa23522b" containerID="5f108ef4356c88932b0eba73bb53c53d7024bb8c067d721f3a95f847709f1121" exitCode=0 Dec 03 12:40:12 crc kubenswrapper[4702]: I1203 12:40:12.326419 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"58148f49-2721-4a0a-a5e0-38a2aa23522b","Type":"ContainerDied","Data":"5f108ef4356c88932b0eba73bb53c53d7024bb8c067d721f3a95f847709f1121"} Dec 03 12:40:12 crc kubenswrapper[4702]: I1203 12:40:12.331331 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-7954f5f757-hndf6_05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8/download-server/2.log" Dec 03 12:40:12 crc kubenswrapper[4702]: I1203 12:40:12.601215 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-sr26m" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:12 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:12 crc kubenswrapper[4702]: > Dec 03 12:40:14 crc kubenswrapper[4702]: I1203 12:40:13.394008 4702 generic.go:334] "Generic (PLEG): container finished" podID="9f228995-3447-4e6c-bff8-81d1a1a2f8d2" 
containerID="71f327af9b8095b6439e4115c079ac62dc55b95142d0e849d99bd43cacaf767b" exitCode=0 Dec 03 12:40:14 crc kubenswrapper[4702]: I1203 12:40:14.059802 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7ntqc" podUID="38c7c63b-db59-4055-aee0-99ea082bd8f7" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:14 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:14 crc kubenswrapper[4702]: > Dec 03 12:40:14 crc kubenswrapper[4702]: E1203 12:40:14.381628 4702 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.445s" Dec 03 12:40:14 crc kubenswrapper[4702]: I1203 12:40:14.381678 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8x6x" event={"ID":"9f228995-3447-4e6c-bff8-81d1a1a2f8d2","Type":"ContainerDied","Data":"71f327af9b8095b6439e4115c079ac62dc55b95142d0e849d99bd43cacaf767b"} Dec 03 12:40:14 crc kubenswrapper[4702]: I1203 12:40:14.381796 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 12:40:14 crc kubenswrapper[4702]: E1203 12:40:14.403852 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a35fd719-e341-49b9-b12f-f39f2402868b" containerName="tempest-tests-tempest-tests-runner" Dec 03 12:40:14 crc kubenswrapper[4702]: I1203 12:40:14.403919 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="a35fd719-e341-49b9-b12f-f39f2402868b" containerName="tempest-tests-tempest-tests-runner" Dec 03 12:40:14 crc kubenswrapper[4702]: I1203 12:40:14.404691 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="a35fd719-e341-49b9-b12f-f39f2402868b" containerName="tempest-tests-tempest-tests-runner" Dec 03 12:40:14 crc kubenswrapper[4702]: I1203 12:40:14.405667 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 12:40:14 crc kubenswrapper[4702]: I1203 12:40:14.405795 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 12:40:14 crc kubenswrapper[4702]: I1203 12:40:14.436446 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-qnhdw" Dec 03 12:40:14 crc kubenswrapper[4702]: I1203 12:40:14.457293 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pscld" event={"ID":"042c7f5b-da64-4f42-a2b2-58d04b73c12a","Type":"ContainerStarted","Data":"1ec67eef6e0ef7e55b1043c7eb6663f82101584015dffe5312d7f0a3bc2c1de6"} Dec 03 12:40:14 crc kubenswrapper[4702]: I1203 12:40:14.479100 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-7954f5f757-hndf6_05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8/download-server/2.log" Dec 03 12:40:14 crc kubenswrapper[4702]: I1203 12:40:14.481469 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hndf6" event={"ID":"05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8","Type":"ContainerStarted","Data":"cab02510fd51a300c783f5cb23426b9fd4fc5e3eb3485c343821284d6b48d657"} Dec 03 12:40:14 crc kubenswrapper[4702]: I1203 12:40:14.481523 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-hndf6" Dec 03 12:40:14 crc kubenswrapper[4702]: I1203 12:40:14.481595 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 12:40:14 crc kubenswrapper[4702]: I1203 12:40:14.481625 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 12:40:14 crc kubenswrapper[4702]: I1203 12:40:14.567496 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"73a3c36b-a281-4c22-a827-8aa59d607739\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 12:40:14 crc kubenswrapper[4702]: I1203 12:40:14.567747 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hlzp\" (UniqueName: \"kubernetes.io/projected/73a3c36b-a281-4c22-a827-8aa59d607739-kube-api-access-6hlzp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"73a3c36b-a281-4c22-a827-8aa59d607739\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 12:40:14 crc kubenswrapper[4702]: E1203 12:40:14.625046 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-conmon-57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84fc908a_9418_4e6e_ac17_9e725524f9ce.slice/crio-d8c4e0cc738e522968136e6f8f489ad82fb59f039b20f2c64f7e9b0f049cd07c.scope\": RecentStats: unable to 
find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f228995_3447_4e6c_bff8_81d1a1a2f8d2.slice/crio-71f327af9b8095b6439e4115c079ac62dc55b95142d0e849d99bd43cacaf767b.scope\": RecentStats: unable to find data in memory cache]" Dec 03 12:40:14 crc kubenswrapper[4702]: I1203 12:40:14.675088 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"73a3c36b-a281-4c22-a827-8aa59d607739\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 12:40:14 crc kubenswrapper[4702]: I1203 12:40:14.675331 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hlzp\" (UniqueName: \"kubernetes.io/projected/73a3c36b-a281-4c22-a827-8aa59d607739-kube-api-access-6hlzp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"73a3c36b-a281-4c22-a827-8aa59d607739\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 12:40:14 crc kubenswrapper[4702]: I1203 12:40:14.675559 4702 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"73a3c36b-a281-4c22-a827-8aa59d607739\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 12:40:14 crc kubenswrapper[4702]: I1203 12:40:14.717840 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hlzp\" (UniqueName: \"kubernetes.io/projected/73a3c36b-a281-4c22-a827-8aa59d607739-kube-api-access-6hlzp\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"73a3c36b-a281-4c22-a827-8aa59d607739\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 12:40:14 crc kubenswrapper[4702]: I1203 12:40:14.830286 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"73a3c36b-a281-4c22-a827-8aa59d607739\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 12:40:15 crc kubenswrapper[4702]: I1203 12:40:15.070117 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 03 12:40:15 crc kubenswrapper[4702]: I1203 12:40:15.128050 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 12:40:15 crc kubenswrapper[4702]: I1203 12:40:15.128105 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 12:40:15 crc kubenswrapper[4702]: I1203 12:40:15.128050 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 12:40:15 crc kubenswrapper[4702]: I1203 12:40:15.128155 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 12:40:15 crc kubenswrapper[4702]: I1203 12:40:15.370466 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-m6cqt" podUID="0f2dd872-6ac4-4527-9a91-218b1de5ed5e" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:15 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:15 crc kubenswrapper[4702]: > Dec 03 12:40:15 crc kubenswrapper[4702]: I1203 12:40:15.494680 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 12:40:15 crc kubenswrapper[4702]: I1203 12:40:15.494830 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 12:40:15 crc kubenswrapper[4702]: I1203 12:40:15.903924 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" Dec 03 12:40:18 crc kubenswrapper[4702]: E1203 12:40:18.032554 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84fc908a_9418_4e6e_ac17_9e725524f9ce.slice/crio-d8c4e0cc738e522968136e6f8f489ad82fb59f039b20f2c64f7e9b0f049cd07c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-conmon-57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca.scope\": RecentStats: unable to find data in memory cache]" Dec 03 12:40:18 crc kubenswrapper[4702]: I1203 12:40:18.343845 4702 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.648058917s: [/var/lib/containers/storage/overlay/f50488fd8875cffed9558ed49aec1043dda121a623e4e25d3ecab4e1f3d0c526/diff /var/log/pods/openstack_neutron-5fdfb45b77-sfz9f_6889403a-b787-4401-a235-0f8297e5844f/neutron-httpd/0.log]; will not log again for this container unless duration exceeds 2s Dec 03 12:40:18 crc kubenswrapper[4702]: I1203 12:40:18.346421 4702 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.628435837s: [/var/lib/containers/storage/overlay/9e7aff362a60f9a31167395e82a80bec277390f3eaaa56f6c54ec7862a955edc/diff /var/log/pods/openstack_placement-57f75d96b4-7bvsq_4785be1d-0e87-49ae-b5de-56bbab3b5eff/placement-api/0.log]; will not log again for this container unless duration exceeds 2s Dec 03 12:40:20 crc kubenswrapper[4702]: I1203 12:40:20.615098 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-psnhp" podUID="b6faaca6-f017-42ac-95e4-d73ae3e8e519" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:40:21 crc kubenswrapper[4702]: I1203 12:40:21.129682 4702 generic.go:334] "Generic (PLEG): container finished" podID="042c7f5b-da64-4f42-a2b2-58d04b73c12a" containerID="6c914a6b412546c50067dc6351057f63f959bc01730befb56cae49a0c30ca747" exitCode=1 Dec 03 12:40:21 crc kubenswrapper[4702]: I1203 12:40:21.129816 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pscld" event={"ID":"042c7f5b-da64-4f42-a2b2-58d04b73c12a","Type":"ContainerDied","Data":"6c914a6b412546c50067dc6351057f63f959bc01730befb56cae49a0c30ca747"} Dec 03 12:40:21 crc kubenswrapper[4702]: I1203 12:40:21.131073 4702 scope.go:117] "RemoveContainer" containerID="6c914a6b412546c50067dc6351057f63f959bc01730befb56cae49a0c30ca747" Dec 03 12:40:21 crc kubenswrapper[4702]: I1203 12:40:21.138316 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8x6x" event={"ID":"9f228995-3447-4e6c-bff8-81d1a1a2f8d2","Type":"ContainerStarted","Data":"1e8cdf0a34ee23d0aaf008ae40fc9a06c5e273d0b5d0579fae7f8e09d29e1242"} Dec 03 12:40:21 crc kubenswrapper[4702]: I1203 12:40:21.624331 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-6sd9l" podUID="ea8c3262-d494-4427-8228-df9584c00ca1" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:21 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:21 crc kubenswrapper[4702]: > Dec 03 12:40:21 crc kubenswrapper[4702]: I1203 12:40:21.813462 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qhss9" podUID="480aa817-7d43-4ea8-9099-06bcb431e578" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:21 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:21 crc kubenswrapper[4702]: > Dec 03 12:40:23 crc kubenswrapper[4702]: I1203 12:40:22.598434 4702 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/certified-operators-sr26m" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:23 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:23 crc kubenswrapper[4702]: > Dec 03 12:40:23 crc kubenswrapper[4702]: I1203 12:40:23.131313 4702 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-q4rbl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:40:23 crc kubenswrapper[4702]: I1203 12:40:23.131708 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" podUID="44af00fd-b9f6-4e74-ad67-581e4ca7527c" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:40:23 crc kubenswrapper[4702]: I1203 12:40:23.131818 4702 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-q4rbl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:40:23 crc kubenswrapper[4702]: I1203 12:40:23.131892 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-q4rbl" podUID="44af00fd-b9f6-4e74-ad67-581e4ca7527c" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.69:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 12:40:23 crc kubenswrapper[4702]: I1203 12:40:23.714835 4702 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.027390392s: [/var/lib/containers/storage/overlay/b9e6723ada1e13e7b57ee639e2b0a650340e630911d34a988c681179c18b4fdf/diff /var/log/pods/openstack_barbican-keystone-listener-5b9bd8bd96-9mqbl_8600900e-a4f2-484b-8e66-be0b81303777/barbican-keystone-listener/0.log]; will not log again for this container unless duration exceeds 2s Dec 03 12:40:23 crc kubenswrapper[4702]: I1203 12:40:23.717010 4702 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.700925988s: [/var/lib/containers/storage/overlay/12351d4ecc2a516653a18651af9737b5c62309bf04710909b523cc61ba7e6093/diff /var/log/pods/openstack_barbican-api-76f68f8c78-w8n8j_36f29f89-b01b-4656-ba86-a2f731d0c1e0/barbican-api/0.log]; will not log again for this container unless duration exceeds 2s Dec 03 12:40:24 crc kubenswrapper[4702]: I1203 12:40:23.946263 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7ntqc" podUID="38c7c63b-db59-4055-aee0-99ea082bd8f7" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:24 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:24 crc kubenswrapper[4702]: > Dec 03 12:40:24 crc kubenswrapper[4702]: I1203 12:40:24.300049 4702 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-m6cqt" podUID="0f2dd872-6ac4-4527-9a91-218b1de5ed5e" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:24 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:24 crc kubenswrapper[4702]: > Dec 03 12:40:24 crc kubenswrapper[4702]: W1203 12:40:24.630422 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73a3c36b_a281_4c22_a827_8aa59d607739.slice/crio-a4f9c797ba2fcdc6bc22237f6aa13f4211c77820eeff601d6adc7636cb8480b5 WatchSource:0}: Error finding container a4f9c797ba2fcdc6bc22237f6aa13f4211c77820eeff601d6adc7636cb8480b5: Status 404 returned error can't find the container with id a4f9c797ba2fcdc6bc22237f6aa13f4211c77820eeff601d6adc7636cb8480b5 Dec 03 12:40:24 crc kubenswrapper[4702]: I1203 12:40:24.668084 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 03 12:40:25 crc kubenswrapper[4702]: I1203 12:40:25.001504 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-pscld" podUID="042c7f5b-da64-4f42-a2b2-58d04b73c12a" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:40:25 crc kubenswrapper[4702]: I1203 12:40:25.130972 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 12:40:25 crc kubenswrapper[4702]: I1203 12:40:25.131055 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 12:40:25 crc kubenswrapper[4702]: I1203 12:40:25.131778 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 12:40:25 crc kubenswrapper[4702]: I1203 12:40:25.131860 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 12:40:25 crc kubenswrapper[4702]: W1203 12:40:25.324727 4702 logging.go:55] [core] [Channel #2165 SubChannel #2166]grpc: addrConn.createTransport failed to connect to {Addr: "/var/lib/kubelet/plugins/csi-hostpath/csi.sock", ServerName: "localhost", }. 
Err: connection error: desc = "transport: Error while dialing: dial unix /var/lib/kubelet/plugins/csi-hostpath/csi.sock: connect: connection refused" Dec 03 12:40:25 crc kubenswrapper[4702]: I1203 12:40:25.369169 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"58148f49-2721-4a0a-a5e0-38a2aa23522b","Type":"ContainerStarted","Data":"c55a6b158bd0883cb0aea112a1af2f3c9bdc113cb975b0506f771446330e7e42"} Dec 03 12:40:25 crc kubenswrapper[4702]: I1203 12:40:25.406076 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"73a3c36b-a281-4c22-a827-8aa59d607739","Type":"ContainerStarted","Data":"a4f9c797ba2fcdc6bc22237f6aa13f4211c77820eeff601d6adc7636cb8480b5"} Dec 03 12:40:25 crc kubenswrapper[4702]: E1203 12:40:25.413503 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84fc908a_9418_4e6e_ac17_9e725524f9ce.slice/crio-d8c4e0cc738e522968136e6f8f489ad82fb59f039b20f2c64f7e9b0f049cd07c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-conmon-57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca.scope\": RecentStats: unable to find data in memory cache]" Dec 03 12:40:25 crc kubenswrapper[4702]: I1203 12:40:25.912044 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:40:26 crc kubenswrapper[4702]: I1203 12:40:25.912624 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:40:26 crc kubenswrapper[4702]: I1203 12:40:26.646409 4702 generic.go:334] "Generic (PLEG): container finished" podID="c55aad18-42f1-4b14-a5fb-686c7a669d40" containerID="e1c884863c69adffab91e04bd0b0bc6496ff63c33e8430a6407ecb1e716839e0" exitCode=0 Dec 03 12:40:26 crc kubenswrapper[4702]: I1203 12:40:26.646585 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjmph" event={"ID":"c55aad18-42f1-4b14-a5fb-686c7a669d40","Type":"ContainerDied","Data":"e1c884863c69adffab91e04bd0b0bc6496ff63c33e8430a6407ecb1e716839e0"} Dec 03 12:40:26 crc kubenswrapper[4702]: I1203 12:40:26.664228 4702 generic.go:334] "Generic (PLEG): container finished" podID="9f228995-3447-4e6c-bff8-81d1a1a2f8d2" containerID="1e8cdf0a34ee23d0aaf008ae40fc9a06c5e273d0b5d0579fae7f8e09d29e1242" exitCode=0 Dec 03 12:40:26 crc kubenswrapper[4702]: I1203 12:40:26.664276 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8x6x" 
event={"ID":"9f228995-3447-4e6c-bff8-81d1a1a2f8d2","Type":"ContainerDied","Data":"1e8cdf0a34ee23d0aaf008ae40fc9a06c5e273d0b5d0579fae7f8e09d29e1242"} Dec 03 12:40:26 crc kubenswrapper[4702]: I1203 12:40:26.996003 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-pscld" podUID="042c7f5b-da64-4f42-a2b2-58d04b73c12a" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:40:27 crc kubenswrapper[4702]: I1203 12:40:27.087190 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:40:27 crc kubenswrapper[4702]: I1203 12:40:27.092437 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-notification-agent" containerID="cri-o://4a4d1d81e28532047aaccc56a98fee9fd389b4a28bda4c3010652cb18c7f17ce" gracePeriod=30 Dec 03 12:40:27 crc kubenswrapper[4702]: I1203 12:40:27.093199 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-central-agent" containerID="cri-o://4dbbd4353a8fb940122df2154e6b5e8a889c0e3c37890c611cd2d9effb933a4e" gracePeriod=30 Dec 03 12:40:27 crc kubenswrapper[4702]: I1203 12:40:27.093270 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="proxy-httpd" containerID="cri-o://cd72b6653ef26d803cddcb816f7eaefb37cb764a19e5714195ad2c44eba5f98e" gracePeriod=30 Dec 03 12:40:27 crc kubenswrapper[4702]: I1203 12:40:27.093317 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="sg-core" containerID="cri-o://853324dd0987f96213f5bdb78bc213d546ea6d301d937cb9f87f51a91e5a1f2b" gracePeriod=30 Dec 03 12:40:27 crc kubenswrapper[4702]: I1203 12:40:27.725835 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 12:40:27 crc kubenswrapper[4702]: I1203 12:40:27.901607 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 12:40:28 crc kubenswrapper[4702]: I1203 12:40:28.800063 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-pscld" event={"ID":"042c7f5b-da64-4f42-a2b2-58d04b73c12a","Type":"ContainerStarted","Data":"c5b86df47736d1deb290ee41aaa1ebdb6e739d65c167e663309495798a72b254"} Dec 03 12:40:28 crc kubenswrapper[4702]: I1203 12:40:28.830315 4702 generic.go:334] "Generic (PLEG): container finished" podID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerID="853324dd0987f96213f5bdb78bc213d546ea6d301d937cb9f87f51a91e5a1f2b" exitCode=2 Dec 03 12:40:28 crc kubenswrapper[4702]: I1203 12:40:28.830621 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dd6e551-e7d9-4f55-a878-bd36db9707e8","Type":"ContainerDied","Data":"853324dd0987f96213f5bdb78bc213d546ea6d301d937cb9f87f51a91e5a1f2b"} Dec 03 12:40:28 crc kubenswrapper[4702]: I1203 12:40:28.995524 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Dec 03 12:40:29 crc kubenswrapper[4702]: I1203 12:40:29.381587 4702 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 03 12:40:29 crc kubenswrapper[4702]: I1203 12:40:29.591093 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" podUID="2cb93136-1d69-4bc8-9c42-aee1f6638aa6" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.93:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:40:29 crc kubenswrapper[4702]: I1203 12:40:29.591098 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-66c548d864-tr7qq" podUID="2cb93136-1d69-4bc8-9c42-aee1f6638aa6" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.93:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:40:29 crc kubenswrapper[4702]: I1203 12:40:29.646463 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="58148f49-2721-4a0a-a5e0-38a2aa23522b" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:40:29 crc kubenswrapper[4702]: I1203 12:40:29.846001 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjmph" event={"ID":"c55aad18-42f1-4b14-a5fb-686c7a669d40","Type":"ContainerStarted","Data":"e494f32aaca75b27c83a3b344fad16cfc216b50b669ca109c6ae72f4f8a919d2"} Dec 03 12:40:29 crc kubenswrapper[4702]: I1203 12:40:29.874337 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8x6x" event={"ID":"9f228995-3447-4e6c-bff8-81d1a1a2f8d2","Type":"ContainerStarted","Data":"9e43a4ba41285b551f5e9415f44424b8fdfd41673699c177671d63c7c903c2fe"} Dec 03 12:40:29 crc kubenswrapper[4702]: I1203 12:40:29.893305 4702 generic.go:334] "Generic (PLEG): container finished" podID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerID="cd72b6653ef26d803cddcb816f7eaefb37cb764a19e5714195ad2c44eba5f98e" exitCode=0 Dec 03 12:40:29 crc kubenswrapper[4702]: I1203 12:40:29.893378 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dd6e551-e7d9-4f55-a878-bd36db9707e8","Type":"ContainerDied","Data":"cd72b6653ef26d803cddcb816f7eaefb37cb764a19e5714195ad2c44eba5f98e"} Dec 03 12:40:29 crc kubenswrapper[4702]: I1203 12:40:29.900220 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"73a3c36b-a281-4c22-a827-8aa59d607739","Type":"ContainerStarted","Data":"29c6ffda31f3b0d192feb18519ae8b82a6be700ff88b54f0e8f274c3f0adcc57"} Dec 03 12:40:29 crc kubenswrapper[4702]: I1203 12:40:29.905093 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gjmph" podStartSLOduration=43.843812954 podStartE2EDuration="1m23.905056495s" podCreationTimestamp="2025-12-03 12:39:06 +0000 UTC" firstStartedPulling="2025-12-03 12:39:47.663936747 +0000 UTC m=+5771.499865211" lastFinishedPulling="2025-12-03 12:40:27.725180288 +0000 UTC m=+5811.561108752" observedRunningTime="2025-12-03 12:40:29.875172681 +0000 UTC m=+5813.711101145" watchObservedRunningTime="2025-12-03 12:40:29.905056495 +0000 UTC m=+5813.740984969" Dec 03 12:40:29 crc kubenswrapper[4702]: I1203 12:40:29.923516 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-v8x6x" podStartSLOduration=12.680396061 podStartE2EDuration="25.923495251s" podCreationTimestamp="2025-12-03 12:40:04 +0000 UTC" firstStartedPulling="2025-12-03 12:40:14.482094019 +0000 UTC m=+5798.318022483" lastFinishedPulling="2025-12-03 12:40:27.725193209 +0000 UTC m=+5811.561121673" observedRunningTime="2025-12-03 12:40:29.921889955 +0000 UTC m=+5813.757818429" watchObservedRunningTime="2025-12-03 12:40:29.923495251 +0000 UTC m=+5813.759423715" Dec 03 12:40:29 crc kubenswrapper[4702]: I1203 12:40:29.985198 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=12.966723307 podStartE2EDuration="16.985168833s" podCreationTimestamp="2025-12-03 12:40:13 +0000 UTC" firstStartedPulling="2025-12-03 12:40:24.632623666 +0000 UTC m=+5808.468552130" lastFinishedPulling="2025-12-03 12:40:28.651069192 +0000 UTC m=+5812.486997656" observedRunningTime="2025-12-03 12:40:29.955223518 +0000 UTC m=+5813.791151992" watchObservedRunningTime="2025-12-03 12:40:29.985168833 +0000 UTC m=+5813.821097297" Dec 03 12:40:30 crc kubenswrapper[4702]: I1203 12:40:30.914628 4702 generic.go:334] "Generic (PLEG): container finished" podID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerID="4a4d1d81e28532047aaccc56a98fee9fd389b4a28bda4c3010652cb18c7f17ce" exitCode=0 Dec 03 12:40:30 crc kubenswrapper[4702]: I1203 12:40:30.916427 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dd6e551-e7d9-4f55-a878-bd36db9707e8","Type":"ContainerDied","Data":"4a4d1d81e28532047aaccc56a98fee9fd389b4a28bda4c3010652cb18c7f17ce"} Dec 03 12:40:32 crc kubenswrapper[4702]: I1203 12:40:32.098206 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qhss9" podUID="480aa817-7d43-4ea8-9099-06bcb431e578" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:32 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:32 crc kubenswrapper[4702]: > Dec 03 12:40:32 crc kubenswrapper[4702]: I1203 12:40:32.115491 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-6sd9l" podUID="ea8c3262-d494-4427-8228-df9584c00ca1" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:32 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:32 crc kubenswrapper[4702]: > Dec 03 12:40:32 crc kubenswrapper[4702]: I1203 12:40:32.603272 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-sr26m" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:32 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:32 crc kubenswrapper[4702]: > Dec 03 12:40:32 crc kubenswrapper[4702]: I1203 12:40:32.603406 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sr26m" Dec 03 12:40:32 crc kubenswrapper[4702]: I1203 12:40:32.613235 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"c44f24cacdf478d774a767962cbaccdf9f7ecb6b089fa137a7c0fe927ed21fdc"} pod="openshift-marketplace/certified-operators-sr26m" containerMessage="Container registry-server failed startup probe, will be 
restarted" Dec 03 12:40:32 crc kubenswrapper[4702]: I1203 12:40:32.613319 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sr26m" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="registry-server" containerID="cri-o://c44f24cacdf478d774a767962cbaccdf9f7ecb6b089fa137a7c0fe927ed21fdc" gracePeriod=30 Dec 03 12:40:32 crc kubenswrapper[4702]: I1203 12:40:32.698986 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 03 12:40:33 crc kubenswrapper[4702]: E1203 12:40:33.233467 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-conmon-57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84fc908a_9418_4e6e_ac17_9e725524f9ce.slice/crio-d8c4e0cc738e522968136e6f8f489ad82fb59f039b20f2c64f7e9b0f049cd07c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca.scope\": RecentStats: unable to find data in memory cache]" Dec 03 12:40:33 crc kubenswrapper[4702]: I1203 12:40:33.680105 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 03 12:40:34 crc kubenswrapper[4702]: I1203 12:40:34.513227 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7ntqc" podUID="38c7c63b-db59-4055-aee0-99ea082bd8f7" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:34 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:34 crc kubenswrapper[4702]: > Dec 03 12:40:34 crc kubenswrapper[4702]: I1203 12:40:34.572597 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-m6cqt" podUID="0f2dd872-6ac4-4527-9a91-218b1de5ed5e" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:34 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:34 crc kubenswrapper[4702]: > Dec 03 12:40:34 crc kubenswrapper[4702]: I1203 12:40:34.921355 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="58148f49-2721-4a0a-a5e0-38a2aa23522b" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 12:40:35 crc kubenswrapper[4702]: I1203 12:40:35.088445 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v8x6x" Dec 03 12:40:35 crc kubenswrapper[4702]: I1203 12:40:35.089327 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v8x6x" Dec 03 12:40:35 crc kubenswrapper[4702]: I1203 12:40:35.127839 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 12:40:35 crc kubenswrapper[4702]: I1203 12:40:35.127899 4702 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 12:40:35 crc kubenswrapper[4702]: I1203 12:40:35.127901 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 12:40:35 crc kubenswrapper[4702]: I1203 12:40:35.127952 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 12:40:35 crc kubenswrapper[4702]: I1203 12:40:35.127978 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-hndf6" Dec 03 12:40:35 crc kubenswrapper[4702]: I1203 12:40:35.128732 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 12:40:35 crc kubenswrapper[4702]: I1203 12:40:35.128845 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 12:40:35 crc kubenswrapper[4702]: I1203 12:40:35.128966 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"cab02510fd51a300c783f5cb23426b9fd4fc5e3eb3485c343821284d6b48d657"} pod="openshift-console/downloads-7954f5f757-hndf6" containerMessage="Container download-server failed liveness probe, will be restarted" Dec 03 12:40:35 crc kubenswrapper[4702]: I1203 12:40:35.129012 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" containerID="cri-o://cab02510fd51a300c783f5cb23426b9fd4fc5e3eb3485c343821284d6b48d657" gracePeriod=2 Dec 03 12:40:36 crc kubenswrapper[4702]: I1203 12:40:36.046510 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-7954f5f757-hndf6_05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8/download-server/2.log" Dec 03 12:40:36 crc kubenswrapper[4702]: I1203 12:40:36.046814 4702 generic.go:334] "Generic (PLEG): container finished" podID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerID="cab02510fd51a300c783f5cb23426b9fd4fc5e3eb3485c343821284d6b48d657" exitCode=0 Dec 03 12:40:36 crc kubenswrapper[4702]: I1203 12:40:36.046847 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hndf6" event={"ID":"05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8","Type":"ContainerDied","Data":"cab02510fd51a300c783f5cb23426b9fd4fc5e3eb3485c343821284d6b48d657"} Dec 03 12:40:36 crc kubenswrapper[4702]: I1203 12:40:36.046887 4702 scope.go:117] 
"RemoveContainer" containerID="d037a18646648da24b75a71e6eb41a1377d7f3bd430210146438cc7b4e80fe31" Dec 03 12:40:36 crc kubenswrapper[4702]: E1203 12:40:36.064829 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84fc908a_9418_4e6e_ac17_9e725524f9ce.slice/crio-d8c4e0cc738e522968136e6f8f489ad82fb59f039b20f2c64f7e9b0f049cd07c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0726c3_58ef_4a22_8e00_bae32d7d66ca.slice/crio-conmon-57f69158d6a5a92126a1143c649b277d7f674e998114824ae8a3dceb6c7156ca.scope\": RecentStats: unable to find data in memory cache]" Dec 03 12:40:36 crc kubenswrapper[4702]: I1203 12:40:36.143019 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-v8x6x" podUID="9f228995-3447-4e6c-bff8-81d1a1a2f8d2" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:36 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:36 crc kubenswrapper[4702]: > Dec 03 12:40:36 crc kubenswrapper[4702]: I1203 12:40:36.821955 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.1.9:3000/\": dial tcp 10.217.1.9:3000: connect: connection refused" Dec 03 12:40:38 crc kubenswrapper[4702]: I1203 12:40:38.151285 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gjmph" Dec 03 12:40:38 crc kubenswrapper[4702]: I1203 12:40:38.151735 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gjmph" Dec 03 12:40:38 crc kubenswrapper[4702]: I1203 12:40:38.238237 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-hndf6" event={"ID":"05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8","Type":"ContainerStarted","Data":"77801a72c2dc04b7df634a673c1e6a43ad41dc05ae69a4acfe09bcd158d43231"} Dec 03 12:40:38 crc kubenswrapper[4702]: I1203 12:40:38.238540 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-hndf6" Dec 03 12:40:38 crc kubenswrapper[4702]: I1203 12:40:38.238823 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 12:40:38 crc kubenswrapper[4702]: I1203 12:40:38.238876 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 12:40:38 crc kubenswrapper[4702]: I1203 12:40:38.549036 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 03 12:40:39 crc kubenswrapper[4702]: I1203 
12:40:39.065330 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 03 12:40:39 crc kubenswrapper[4702]: I1203 12:40:39.217905 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gjmph" podUID="c55aad18-42f1-4b14-a5fb-686c7a669d40" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:39 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:39 crc kubenswrapper[4702]: > Dec 03 12:40:39 crc kubenswrapper[4702]: I1203 12:40:39.253688 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 12:40:39 crc kubenswrapper[4702]: I1203 12:40:39.254000 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 12:40:40 crc kubenswrapper[4702]: I1203 12:40:40.098593 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 12:40:41 crc kubenswrapper[4702]: I1203 12:40:41.615751 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-6sd9l" podUID="ea8c3262-d494-4427-8228-df9584c00ca1" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:41 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:41 crc kubenswrapper[4702]: > Dec 03 12:40:41 crc kubenswrapper[4702]: I1203 12:40:41.833942 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qhss9" podUID="480aa817-7d43-4ea8-9099-06bcb431e578" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:41 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:41 crc kubenswrapper[4702]: > Dec 03 12:40:42 crc kubenswrapper[4702]: I1203 12:40:42.298270 4702 generic.go:334] "Generic (PLEG): container finished" podID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerID="c44f24cacdf478d774a767962cbaccdf9f7ecb6b089fa137a7c0fe927ed21fdc" exitCode=0 Dec 03 12:40:42 crc kubenswrapper[4702]: I1203 12:40:42.298629 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sr26m" event={"ID":"d370ba60-2ec7-4904-8ada-85984ae3582b","Type":"ContainerDied","Data":"c44f24cacdf478d774a767962cbaccdf9f7ecb6b089fa137a7c0fe927ed21fdc"} Dec 03 12:40:43 crc kubenswrapper[4702]: I1203 12:40:43.974461 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7ntqc" podUID="38c7c63b-db59-4055-aee0-99ea082bd8f7" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:43 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:43 crc kubenswrapper[4702]: > Dec 03 12:40:44 crc kubenswrapper[4702]: I1203 12:40:44.270462 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-m6cqt" podUID="0f2dd872-6ac4-4527-9a91-218b1de5ed5e" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:44 crc 
kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:44 crc kubenswrapper[4702]: > Dec 03 12:40:45 crc kubenswrapper[4702]: I1203 12:40:45.128007 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 12:40:45 crc kubenswrapper[4702]: I1203 12:40:45.128348 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 12:40:45 crc kubenswrapper[4702]: I1203 12:40:45.128042 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 12:40:45 crc kubenswrapper[4702]: I1203 12:40:45.128486 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 12:40:46 crc kubenswrapper[4702]: I1203 12:40:46.142286 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-v8x6x" podUID="9f228995-3447-4e6c-bff8-81d1a1a2f8d2" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:46 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:46 crc kubenswrapper[4702]: > Dec 03 12:40:48 crc kubenswrapper[4702]: I1203 12:40:48.416808 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sr26m" event={"ID":"d370ba60-2ec7-4904-8ada-85984ae3582b","Type":"ContainerStarted","Data":"11532436354a2b9b6c867e574a221d30d4f20c4a3384db374a8e359680c91efe"} Dec 03 12:40:49 crc kubenswrapper[4702]: I1203 12:40:49.912119 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gjmph" podUID="c55aad18-42f1-4b14-a5fb-686c7a669d40" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:49 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:49 crc kubenswrapper[4702]: > Dec 03 12:40:51 crc kubenswrapper[4702]: I1203 12:40:51.415465 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6sd9l" Dec 03 12:40:51 crc kubenswrapper[4702]: I1203 12:40:51.519628 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6sd9l" Dec 03 12:40:51 crc kubenswrapper[4702]: I1203 12:40:51.542633 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sr26m" Dec 03 12:40:51 crc kubenswrapper[4702]: I1203 12:40:51.542697 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sr26m" Dec 03 12:40:52 crc kubenswrapper[4702]: I1203 12:40:52.400981 4702 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-qhss9" podUID="480aa817-7d43-4ea8-9099-06bcb431e578" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:52 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:52 crc kubenswrapper[4702]: > Dec 03 12:40:52 crc kubenswrapper[4702]: I1203 12:40:52.618115 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-sr26m" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:52 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:52 crc kubenswrapper[4702]: > Dec 03 12:40:54 crc kubenswrapper[4702]: I1203 12:40:54.008772 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7ntqc" podUID="38c7c63b-db59-4055-aee0-99ea082bd8f7" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:54 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:54 crc kubenswrapper[4702]: > Dec 03 12:40:54 crc kubenswrapper[4702]: I1203 12:40:54.275825 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-m6cqt" podUID="0f2dd872-6ac4-4527-9a91-218b1de5ed5e" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:54 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:54 crc kubenswrapper[4702]: > Dec 03 12:40:55 crc kubenswrapper[4702]: I1203 12:40:55.127854 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 12:40:55 crc kubenswrapper[4702]: I1203 12:40:55.127947 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 12:40:55 crc kubenswrapper[4702]: I1203 12:40:55.127859 4702 patch_prober.go:28] interesting pod/downloads-7954f5f757-hndf6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Dec 03 12:40:55 crc kubenswrapper[4702]: I1203 12:40:55.128063 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-hndf6" podUID="05159ff7-7caa-4f87-bb0e-8ac8b5ce29d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Dec 03 12:40:56 crc kubenswrapper[4702]: I1203 12:40:55.908912 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:40:56 crc kubenswrapper[4702]: I1203 12:40:55.909358 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:40:56 crc kubenswrapper[4702]: I1203 12:40:55.909468 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 12:40:56 crc kubenswrapper[4702]: I1203 12:40:55.911040 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"030d6a62e307050ecc85543445f98f0ffef93a52b32474be21649714e5532e7a"} pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:40:56 crc kubenswrapper[4702]: I1203 12:40:55.911616 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" containerID="cri-o://030d6a62e307050ecc85543445f98f0ffef93a52b32474be21649714e5532e7a" gracePeriod=600 Dec 03 12:40:56 crc kubenswrapper[4702]: I1203 12:40:56.267465 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-v8x6x" podUID="9f228995-3447-4e6c-bff8-81d1a1a2f8d2" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:56 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:56 crc kubenswrapper[4702]: > Dec 03 12:40:57 crc kubenswrapper[4702]: I1203 12:40:57.535871 4702 generic.go:334] "Generic (PLEG): container finished" podID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerID="4dbbd4353a8fb940122df2154e6b5e8a889c0e3c37890c611cd2d9effb933a4e" exitCode=137 Dec 03 12:40:57 crc kubenswrapper[4702]: I1203 12:40:57.536062 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dd6e551-e7d9-4f55-a878-bd36db9707e8","Type":"ContainerDied","Data":"4dbbd4353a8fb940122df2154e6b5e8a889c0e3c37890c611cd2d9effb933a4e"} Dec 03 12:40:57 crc kubenswrapper[4702]: I1203 12:40:57.536453 4702 scope.go:117] "RemoveContainer" containerID="a3ebf64628d67a49aab3d7c8c37af7c6fbdf0e61098265182dbf34743899b6e5" Dec 03 12:40:59 crc kubenswrapper[4702]: I1203 12:40:59.212070 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gjmph" podUID="c55aad18-42f1-4b14-a5fb-686c7a669d40" containerName="registry-server" probeResult="failure" output=< Dec 03 12:40:59 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:40:59 crc kubenswrapper[4702]: > Dec 03 12:40:59 crc kubenswrapper[4702]: I1203 12:40:59.405034 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="58148f49-2721-4a0a-a5e0-38a2aa23522b" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.213:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.110480 4702 generic.go:334] "Generic (PLEG): container finished" podID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerID="030d6a62e307050ecc85543445f98f0ffef93a52b32474be21649714e5532e7a" exitCode=0 Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.110569 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerDied","Data":"030d6a62e307050ecc85543445f98f0ffef93a52b32474be21649714e5532e7a"} Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.111057 4702 scope.go:117] "RemoveContainer" containerID="27576eb49ec5c0c4a9232ba71ea30f8f8e7e027b658b97ea56624b39008b8c42" Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.131947 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.256539 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-config-data\") pod \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.256638 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dd6e551-e7d9-4f55-a878-bd36db9707e8-run-httpd\") pod \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.256665 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-combined-ca-bundle\") pod \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.256703 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-sg-core-conf-yaml\") pod \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.256847 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dd6e551-e7d9-4f55-a878-bd36db9707e8-log-httpd\") pod \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.257019 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-ceilometer-tls-certs\") pod \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.257126 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl745\" (UniqueName: \"kubernetes.io/projected/4dd6e551-e7d9-4f55-a878-bd36db9707e8-kube-api-access-zl745\") pod \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.257215 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-scripts\") pod \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\" (UID: \"4dd6e551-e7d9-4f55-a878-bd36db9707e8\") " Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.258801 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/4dd6e551-e7d9-4f55-a878-bd36db9707e8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4dd6e551-e7d9-4f55-a878-bd36db9707e8" (UID: "4dd6e551-e7d9-4f55-a878-bd36db9707e8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.259198 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dd6e551-e7d9-4f55-a878-bd36db9707e8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4dd6e551-e7d9-4f55-a878-bd36db9707e8" (UID: "4dd6e551-e7d9-4f55-a878-bd36db9707e8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.259944 4702 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dd6e551-e7d9-4f55-a878-bd36db9707e8-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.259968 4702 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4dd6e551-e7d9-4f55-a878-bd36db9707e8-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.276814 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dd6e551-e7d9-4f55-a878-bd36db9707e8-kube-api-access-zl745" (OuterVolumeSpecName: "kube-api-access-zl745") pod "4dd6e551-e7d9-4f55-a878-bd36db9707e8" (UID: "4dd6e551-e7d9-4f55-a878-bd36db9707e8"). InnerVolumeSpecName "kube-api-access-zl745". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.282637 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-scripts" (OuterVolumeSpecName: "scripts") pod "4dd6e551-e7d9-4f55-a878-bd36db9707e8" (UID: "4dd6e551-e7d9-4f55-a878-bd36db9707e8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.319120 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4dd6e551-e7d9-4f55-a878-bd36db9707e8" (UID: "4dd6e551-e7d9-4f55-a878-bd36db9707e8"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.364388 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl745\" (UniqueName: \"kubernetes.io/projected/4dd6e551-e7d9-4f55-a878-bd36db9707e8-kube-api-access-zl745\") on node \"crc\" DevicePath \"\"" Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.364436 4702 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.364451 4702 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.381819 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "4dd6e551-e7d9-4f55-a878-bd36db9707e8" (UID: "4dd6e551-e7d9-4f55-a878-bd36db9707e8"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.406049 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4dd6e551-e7d9-4f55-a878-bd36db9707e8" (UID: "4dd6e551-e7d9-4f55-a878-bd36db9707e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.467245 4702 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.467283 4702 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.470448 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-config-data" (OuterVolumeSpecName: "config-data") pod "4dd6e551-e7d9-4f55-a878-bd36db9707e8" (UID: "4dd6e551-e7d9-4f55-a878-bd36db9707e8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.570050 4702 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4dd6e551-e7d9-4f55-a878-bd36db9707e8-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 12:41:01 crc kubenswrapper[4702]: I1203 12:41:01.826855 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qhss9" podUID="480aa817-7d43-4ea8-9099-06bcb431e578" containerName="registry-server" probeResult="failure" output=< Dec 03 12:41:01 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:41:01 crc kubenswrapper[4702]: > Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.130363 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4dd6e551-e7d9-4f55-a878-bd36db9707e8","Type":"ContainerDied","Data":"ca04d485f2e2186c6531a3d7e6ae9986c38aa52af476ee089a6013a9a803ddd3"} Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.130749 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.220424 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.516365 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.542939 4702 scope.go:117] "RemoveContainer" containerID="4dbbd4353a8fb940122df2154e6b5e8a889c0e3c37890c611cd2d9effb933a4e" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.592929 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:41:02 crc kubenswrapper[4702]: E1203 12:41:02.598918 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="sg-core" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.598968 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="sg-core" Dec 03 12:41:02 crc kubenswrapper[4702]: E1203 12:41:02.599033 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="proxy-httpd" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.599044 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="proxy-httpd" Dec 03 12:41:02 crc kubenswrapper[4702]: E1203 12:41:02.599069 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-central-agent" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.599078 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-central-agent" Dec 03 12:41:02 crc kubenswrapper[4702]: E1203 12:41:02.599110 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-notification-agent" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.599118 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-notification-agent" Dec 03 12:41:02 crc kubenswrapper[4702]: E1203 12:41:02.599157 4702 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-central-agent" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.599165 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-central-agent" Dec 03 12:41:02 crc kubenswrapper[4702]: E1203 12:41:02.599190 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-central-agent" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.599197 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-central-agent" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.599548 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-central-agent" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.599555 4702 scope.go:117] "RemoveContainer" containerID="cd72b6653ef26d803cddcb816f7eaefb37cb764a19e5714195ad2c44eba5f98e" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.599567 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="sg-core" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.599712 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-central-agent" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.599936 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="proxy-httpd" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.599981 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-notification-agent" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.600636 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" containerName="ceilometer-central-agent" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.606009 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.606344 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-sr26m" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="registry-server" probeResult="failure" output=< Dec 03 12:41:02 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:41:02 crc kubenswrapper[4702]: > Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.638626 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.640922 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.651138 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.655225 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.681526 4702 scope.go:117] "RemoveContainer" containerID="853324dd0987f96213f5bdb78bc213d546ea6d301d937cb9f87f51a91e5a1f2b" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.726280 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdmxk\" (UniqueName: \"kubernetes.io/projected/718cbf0b-d35e-4261-9cec-2ad3bc742998-kube-api-access-kdmxk\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.726336 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/718cbf0b-d35e-4261-9cec-2ad3bc742998-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.726372 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718cbf0b-d35e-4261-9cec-2ad3bc742998-log-httpd\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.726445 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718cbf0b-d35e-4261-9cec-2ad3bc742998-run-httpd\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.726508 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/718cbf0b-d35e-4261-9cec-2ad3bc742998-scripts\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.726553 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/718cbf0b-d35e-4261-9cec-2ad3bc742998-config-data\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc 
kubenswrapper[4702]: I1203 12:41:02.726586 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/718cbf0b-d35e-4261-9cec-2ad3bc742998-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.726655 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/718cbf0b-d35e-4261-9cec-2ad3bc742998-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.731260 4702 scope.go:117] "RemoveContainer" containerID="4a4d1d81e28532047aaccc56a98fee9fd389b4a28bda4c3010652cb18c7f17ce" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.828604 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/718cbf0b-d35e-4261-9cec-2ad3bc742998-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.828769 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdmxk\" (UniqueName: \"kubernetes.io/projected/718cbf0b-d35e-4261-9cec-2ad3bc742998-kube-api-access-kdmxk\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.828808 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/718cbf0b-d35e-4261-9cec-2ad3bc742998-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.828846 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718cbf0b-d35e-4261-9cec-2ad3bc742998-log-httpd\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.828920 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718cbf0b-d35e-4261-9cec-2ad3bc742998-run-httpd\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.828989 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/718cbf0b-d35e-4261-9cec-2ad3bc742998-scripts\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.829057 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/718cbf0b-d35e-4261-9cec-2ad3bc742998-config-data\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.829105 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/718cbf0b-d35e-4261-9cec-2ad3bc742998-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.834377 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718cbf0b-d35e-4261-9cec-2ad3bc742998-log-httpd\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.838719 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/718cbf0b-d35e-4261-9cec-2ad3bc742998-run-httpd\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.840416 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/718cbf0b-d35e-4261-9cec-2ad3bc742998-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.840642 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/718cbf0b-d35e-4261-9cec-2ad3bc742998-scripts\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.850592 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/718cbf0b-d35e-4261-9cec-2ad3bc742998-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.850747 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/718cbf0b-d35e-4261-9cec-2ad3bc742998-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.851551 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/718cbf0b-d35e-4261-9cec-2ad3bc742998-config-data\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.857822 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdmxk\" (UniqueName: \"kubernetes.io/projected/718cbf0b-d35e-4261-9cec-2ad3bc742998-kube-api-access-kdmxk\") pod \"ceilometer-0\" (UID: \"718cbf0b-d35e-4261-9cec-2ad3bc742998\") " pod="openstack/ceilometer-0" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.950094 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dd6e551-e7d9-4f55-a878-bd36db9707e8" path="/var/lib/kubelet/pods/4dd6e551-e7d9-4f55-a878-bd36db9707e8/volumes" Dec 03 12:41:02 crc kubenswrapper[4702]: I1203 12:41:02.984489 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 12:41:03 crc kubenswrapper[4702]: I1203 12:41:03.683747 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7ntqc" Dec 03 12:41:03 crc kubenswrapper[4702]: I1203 12:41:03.683891 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m6cqt" Dec 03 12:41:03 crc kubenswrapper[4702]: I1203 12:41:03.683916 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a"} Dec 03 12:41:03 crc kubenswrapper[4702]: I1203 12:41:03.815325 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m6cqt" Dec 03 12:41:03 crc kubenswrapper[4702]: I1203 12:41:03.823718 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7ntqc" Dec 03 12:41:05 crc kubenswrapper[4702]: I1203 12:41:05.163055 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-hndf6" Dec 03 12:41:05 crc kubenswrapper[4702]: I1203 12:41:05.182470 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 12:41:05 crc kubenswrapper[4702]: I1203 12:41:05.205322 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v8x6x" Dec 03 12:41:05 crc kubenswrapper[4702]: I1203 12:41:05.299853 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718cbf0b-d35e-4261-9cec-2ad3bc742998","Type":"ContainerStarted","Data":"8912d790ed9d0bf88bcc7d12c764630dd91062048d05585b71a564fee5b54be5"} Dec 03 12:41:05 crc kubenswrapper[4702]: I1203 12:41:05.300071 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v8x6x" Dec 03 12:41:06 crc kubenswrapper[4702]: I1203 12:41:06.336384 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718cbf0b-d35e-4261-9cec-2ad3bc742998","Type":"ContainerStarted","Data":"5a7aa63696214eb9b35b4c545ca1db2104678daf3ac38a3b47114dc1c4988730"} Dec 03 12:41:08 crc kubenswrapper[4702]: I1203 12:41:08.359174 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718cbf0b-d35e-4261-9cec-2ad3bc742998","Type":"ContainerStarted","Data":"ca123f2667d6fa2a9241bef73cf8f1a1048057deef5b953740d628e0e798bc37"} Dec 03 12:41:09 crc kubenswrapper[4702]: I1203 12:41:09.219291 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gjmph" podUID="c55aad18-42f1-4b14-a5fb-686c7a669d40" containerName="registry-server" probeResult="failure" output=< Dec 03 12:41:09 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:41:09 crc kubenswrapper[4702]: > Dec 03 12:41:09 crc kubenswrapper[4702]: I1203 12:41:09.373958 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718cbf0b-d35e-4261-9cec-2ad3bc742998","Type":"ContainerStarted","Data":"a48a0a8902e4eca019685cb2037e38cb49f570b4dcec9b2b4ccdf7d182f331ad"} Dec 03 12:41:11 crc kubenswrapper[4702]: I1203 12:41:11.419929 4702 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"718cbf0b-d35e-4261-9cec-2ad3bc742998","Type":"ContainerStarted","Data":"986b186ef04c019164f9810aa3bb2e3a5f2dcdf7210b12575af487f23246c0c9"} Dec 03 12:41:11 crc kubenswrapper[4702]: I1203 12:41:11.420579 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 12:41:11 crc kubenswrapper[4702]: I1203 12:41:11.442581 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.323727903 podStartE2EDuration="9.442555186s" podCreationTimestamp="2025-12-03 12:41:02 +0000 UTC" firstStartedPulling="2025-12-03 12:41:05.105438658 +0000 UTC m=+5848.941367112" lastFinishedPulling="2025-12-03 12:41:10.224265931 +0000 UTC m=+5854.060194395" observedRunningTime="2025-12-03 12:41:11.437863402 +0000 UTC m=+5855.273791886" watchObservedRunningTime="2025-12-03 12:41:11.442555186 +0000 UTC m=+5855.278483660" Dec 03 12:41:11 crc kubenswrapper[4702]: I1203 12:41:11.611526 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sr26m" Dec 03 12:41:11 crc kubenswrapper[4702]: I1203 12:41:11.666109 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sr26m" Dec 03 12:41:11 crc kubenswrapper[4702]: I1203 12:41:11.822438 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qhss9" podUID="480aa817-7d43-4ea8-9099-06bcb431e578" containerName="registry-server" probeResult="failure" output=< Dec 03 12:41:11 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:41:11 crc kubenswrapper[4702]: > Dec 03 12:41:12 crc kubenswrapper[4702]: I1203 12:41:12.677375 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v8x6x"] Dec 03 12:41:12 crc kubenswrapper[4702]: I1203 12:41:12.678939 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v8x6x" podUID="9f228995-3447-4e6c-bff8-81d1a1a2f8d2" containerName="registry-server" containerID="cri-o://9e43a4ba41285b551f5e9415f44424b8fdfd41673699c177671d63c7c903c2fe" gracePeriod=2 Dec 03 12:41:13 crc kubenswrapper[4702]: I1203 12:41:13.461512 4702 generic.go:334] "Generic (PLEG): container finished" podID="9f228995-3447-4e6c-bff8-81d1a1a2f8d2" containerID="9e43a4ba41285b551f5e9415f44424b8fdfd41673699c177671d63c7c903c2fe" exitCode=0 Dec 03 12:41:13 crc kubenswrapper[4702]: I1203 12:41:13.461615 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8x6x" event={"ID":"9f228995-3447-4e6c-bff8-81d1a1a2f8d2","Type":"ContainerDied","Data":"9e43a4ba41285b551f5e9415f44424b8fdfd41673699c177671d63c7c903c2fe"} Dec 03 12:41:13 crc kubenswrapper[4702]: I1203 12:41:13.525636 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pbxps/must-gather-64pl7"] Dec 03 12:41:13 crc kubenswrapper[4702]: I1203 12:41:13.527924 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pbxps/must-gather-64pl7" Dec 03 12:41:13 crc kubenswrapper[4702]: I1203 12:41:13.531187 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pbxps"/"kube-root-ca.crt" Dec 03 12:41:13 crc kubenswrapper[4702]: I1203 12:41:13.532085 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pbxps"/"openshift-service-ca.crt" Dec 03 12:41:13 crc kubenswrapper[4702]: I1203 12:41:13.536993 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-pbxps"/"default-dockercfg-h8wmx" Dec 03 12:41:13 crc kubenswrapper[4702]: I1203 12:41:13.556380 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pbxps/must-gather-64pl7"] Dec 03 12:41:13 crc kubenswrapper[4702]: I1203 12:41:13.655995 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2a8ef37d-7f53-449e-b954-d1624312e255-must-gather-output\") pod \"must-gather-64pl7\" (UID: \"2a8ef37d-7f53-449e-b954-d1624312e255\") " pod="openshift-must-gather-pbxps/must-gather-64pl7" Dec 03 12:41:13 crc kubenswrapper[4702]: I1203 12:41:13.656071 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlf79\" (UniqueName: \"kubernetes.io/projected/2a8ef37d-7f53-449e-b954-d1624312e255-kube-api-access-qlf79\") pod \"must-gather-64pl7\" (UID: \"2a8ef37d-7f53-449e-b954-d1624312e255\") " pod="openshift-must-gather-pbxps/must-gather-64pl7" Dec 03 12:41:13 crc kubenswrapper[4702]: I1203 12:41:13.760530 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2a8ef37d-7f53-449e-b954-d1624312e255-must-gather-output\") pod \"must-gather-64pl7\" (UID: \"2a8ef37d-7f53-449e-b954-d1624312e255\") " pod="openshift-must-gather-pbxps/must-gather-64pl7" Dec 03 12:41:13 crc kubenswrapper[4702]: I1203 12:41:13.760675 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlf79\" (UniqueName: \"kubernetes.io/projected/2a8ef37d-7f53-449e-b954-d1624312e255-kube-api-access-qlf79\") pod \"must-gather-64pl7\" (UID: \"2a8ef37d-7f53-449e-b954-d1624312e255\") " pod="openshift-must-gather-pbxps/must-gather-64pl7" Dec 03 12:41:13 crc kubenswrapper[4702]: I1203 12:41:13.761220 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2a8ef37d-7f53-449e-b954-d1624312e255-must-gather-output\") pod \"must-gather-64pl7\" (UID: \"2a8ef37d-7f53-449e-b954-d1624312e255\") " pod="openshift-must-gather-pbxps/must-gather-64pl7" Dec 03 12:41:13 crc kubenswrapper[4702]: I1203 12:41:13.816576 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlf79\" (UniqueName: \"kubernetes.io/projected/2a8ef37d-7f53-449e-b954-d1624312e255-kube-api-access-qlf79\") pod \"must-gather-64pl7\" (UID: \"2a8ef37d-7f53-449e-b954-d1624312e255\") " pod="openshift-must-gather-pbxps/must-gather-64pl7" Dec 03 12:41:13 crc kubenswrapper[4702]: I1203 12:41:13.856450 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pbxps/must-gather-64pl7" Dec 03 12:41:14 crc kubenswrapper[4702]: I1203 12:41:14.073782 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v8x6x" Dec 03 12:41:14 crc kubenswrapper[4702]: I1203 12:41:14.172107 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f228995-3447-4e6c-bff8-81d1a1a2f8d2-utilities\") pod \"9f228995-3447-4e6c-bff8-81d1a1a2f8d2\" (UID: \"9f228995-3447-4e6c-bff8-81d1a1a2f8d2\") " Dec 03 12:41:14 crc kubenswrapper[4702]: I1203 12:41:14.172238 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f228995-3447-4e6c-bff8-81d1a1a2f8d2-catalog-content\") pod \"9f228995-3447-4e6c-bff8-81d1a1a2f8d2\" (UID: \"9f228995-3447-4e6c-bff8-81d1a1a2f8d2\") " Dec 03 12:41:14 crc kubenswrapper[4702]: I1203 12:41:14.172380 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wxfm\" (UniqueName: \"kubernetes.io/projected/9f228995-3447-4e6c-bff8-81d1a1a2f8d2-kube-api-access-5wxfm\") pod \"9f228995-3447-4e6c-bff8-81d1a1a2f8d2\" (UID: \"9f228995-3447-4e6c-bff8-81d1a1a2f8d2\") " Dec 03 12:41:14 crc kubenswrapper[4702]: I1203 12:41:14.174438 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f228995-3447-4e6c-bff8-81d1a1a2f8d2-utilities" (OuterVolumeSpecName: "utilities") pod "9f228995-3447-4e6c-bff8-81d1a1a2f8d2" (UID: "9f228995-3447-4e6c-bff8-81d1a1a2f8d2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:41:14 crc kubenswrapper[4702]: I1203 12:41:14.186581 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f228995-3447-4e6c-bff8-81d1a1a2f8d2-kube-api-access-5wxfm" (OuterVolumeSpecName: "kube-api-access-5wxfm") pod "9f228995-3447-4e6c-bff8-81d1a1a2f8d2" (UID: "9f228995-3447-4e6c-bff8-81d1a1a2f8d2"). InnerVolumeSpecName "kube-api-access-5wxfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:41:14 crc kubenswrapper[4702]: I1203 12:41:14.196941 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f228995-3447-4e6c-bff8-81d1a1a2f8d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f228995-3447-4e6c-bff8-81d1a1a2f8d2" (UID: "9f228995-3447-4e6c-bff8-81d1a1a2f8d2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:41:14 crc kubenswrapper[4702]: I1203 12:41:14.274162 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f228995-3447-4e6c-bff8-81d1a1a2f8d2-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:41:14 crc kubenswrapper[4702]: I1203 12:41:14.274576 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wxfm\" (UniqueName: \"kubernetes.io/projected/9f228995-3447-4e6c-bff8-81d1a1a2f8d2-kube-api-access-5wxfm\") on node \"crc\" DevicePath \"\"" Dec 03 12:41:14 crc kubenswrapper[4702]: I1203 12:41:14.274588 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f228995-3447-4e6c-bff8-81d1a1a2f8d2-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:41:14 crc kubenswrapper[4702]: I1203 12:41:14.476574 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v8x6x" event={"ID":"9f228995-3447-4e6c-bff8-81d1a1a2f8d2","Type":"ContainerDied","Data":"0247eb22b45514a6f5d1c3d0f47954337e3d461f862240cb5a6ef38eecb3d08e"} Dec 03 12:41:14 crc kubenswrapper[4702]: I1203 12:41:14.476645 4702 scope.go:117] "RemoveContainer" containerID="9e43a4ba41285b551f5e9415f44424b8fdfd41673699c177671d63c7c903c2fe" Dec 03 12:41:14 crc kubenswrapper[4702]: I1203 12:41:14.476735 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v8x6x" Dec 03 12:41:14 crc kubenswrapper[4702]: I1203 12:41:14.506349 4702 scope.go:117] "RemoveContainer" containerID="1e8cdf0a34ee23d0aaf008ae40fc9a06c5e273d0b5d0579fae7f8e09d29e1242" Dec 03 12:41:14 crc kubenswrapper[4702]: I1203 12:41:14.536401 4702 scope.go:117] "RemoveContainer" containerID="71f327af9b8095b6439e4115c079ac62dc55b95142d0e849d99bd43cacaf767b" Dec 03 12:41:14 crc kubenswrapper[4702]: I1203 12:41:14.552917 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v8x6x"] Dec 03 12:41:14 crc kubenswrapper[4702]: I1203 12:41:14.574701 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v8x6x"] Dec 03 12:41:14 crc kubenswrapper[4702]: I1203 12:41:14.630187 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pbxps/must-gather-64pl7"] Dec 03 12:41:14 crc kubenswrapper[4702]: W1203 12:41:14.634586 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a8ef37d_7f53_449e_b954_d1624312e255.slice/crio-7704683d0d576795ff6952b7c1f35d587d01b49468eba253659824e2b998a0a2 WatchSource:0}: Error finding container 7704683d0d576795ff6952b7c1f35d587d01b49468eba253659824e2b998a0a2: Status 404 returned error can't find the container with id 7704683d0d576795ff6952b7c1f35d587d01b49468eba253659824e2b998a0a2 Dec 03 12:41:14 crc kubenswrapper[4702]: I1203 12:41:14.948354 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f228995-3447-4e6c-bff8-81d1a1a2f8d2" path="/var/lib/kubelet/pods/9f228995-3447-4e6c-bff8-81d1a1a2f8d2/volumes" Dec 03 12:41:15 crc kubenswrapper[4702]: I1203 12:41:15.508328 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pbxps/must-gather-64pl7" event={"ID":"2a8ef37d-7f53-449e-b954-d1624312e255","Type":"ContainerStarted","Data":"7704683d0d576795ff6952b7c1f35d587d01b49468eba253659824e2b998a0a2"} Dec 
03 12:41:17 crc kubenswrapper[4702]: I1203 12:41:17.476121 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sr26m"] Dec 03 12:41:17 crc kubenswrapper[4702]: I1203 12:41:17.476407 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sr26m" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="registry-server" containerID="cri-o://11532436354a2b9b6c867e574a221d30d4f20c4a3384db374a8e359680c91efe" gracePeriod=2 Dec 03 12:41:17 crc kubenswrapper[4702]: E1203 12:41:17.952665 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd370ba60_2ec7_4904_8ada_85984ae3582b.slice/crio-11532436354a2b9b6c867e574a221d30d4f20c4a3384db374a8e359680c91efe.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd370ba60_2ec7_4904_8ada_85984ae3582b.slice/crio-conmon-11532436354a2b9b6c867e574a221d30d4f20c4a3384db374a8e359680c91efe.scope\": RecentStats: unable to find data in memory cache]" Dec 03 12:41:18 crc kubenswrapper[4702]: I1203 12:41:18.236492 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sr26m" Dec 03 12:41:18 crc kubenswrapper[4702]: I1203 12:41:18.314171 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d370ba60-2ec7-4904-8ada-85984ae3582b-utilities\") pod \"d370ba60-2ec7-4904-8ada-85984ae3582b\" (UID: \"d370ba60-2ec7-4904-8ada-85984ae3582b\") " Dec 03 12:41:18 crc kubenswrapper[4702]: I1203 12:41:18.314239 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d370ba60-2ec7-4904-8ada-85984ae3582b-catalog-content\") pod \"d370ba60-2ec7-4904-8ada-85984ae3582b\" (UID: \"d370ba60-2ec7-4904-8ada-85984ae3582b\") " Dec 03 12:41:18 crc kubenswrapper[4702]: I1203 12:41:18.314417 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x647v\" (UniqueName: \"kubernetes.io/projected/d370ba60-2ec7-4904-8ada-85984ae3582b-kube-api-access-x647v\") pod \"d370ba60-2ec7-4904-8ada-85984ae3582b\" (UID: \"d370ba60-2ec7-4904-8ada-85984ae3582b\") " Dec 03 12:41:18 crc kubenswrapper[4702]: I1203 12:41:18.317804 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d370ba60-2ec7-4904-8ada-85984ae3582b-utilities" (OuterVolumeSpecName: "utilities") pod "d370ba60-2ec7-4904-8ada-85984ae3582b" (UID: "d370ba60-2ec7-4904-8ada-85984ae3582b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:41:18 crc kubenswrapper[4702]: I1203 12:41:18.375647 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d370ba60-2ec7-4904-8ada-85984ae3582b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d370ba60-2ec7-4904-8ada-85984ae3582b" (UID: "d370ba60-2ec7-4904-8ada-85984ae3582b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:41:18 crc kubenswrapper[4702]: I1203 12:41:18.422169 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d370ba60-2ec7-4904-8ada-85984ae3582b-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:41:18 crc kubenswrapper[4702]: I1203 12:41:18.422229 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d370ba60-2ec7-4904-8ada-85984ae3582b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:41:18 crc kubenswrapper[4702]: I1203 12:41:18.579156 4702 generic.go:334] "Generic (PLEG): container finished" podID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerID="11532436354a2b9b6c867e574a221d30d4f20c4a3384db374a8e359680c91efe" exitCode=0 Dec 03 12:41:18 crc kubenswrapper[4702]: I1203 12:41:18.579216 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sr26m" event={"ID":"d370ba60-2ec7-4904-8ada-85984ae3582b","Type":"ContainerDied","Data":"11532436354a2b9b6c867e574a221d30d4f20c4a3384db374a8e359680c91efe"} Dec 03 12:41:18 crc kubenswrapper[4702]: I1203 12:41:18.579254 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sr26m" event={"ID":"d370ba60-2ec7-4904-8ada-85984ae3582b","Type":"ContainerDied","Data":"9247a3e31e472287bbd102012315b1071a27157bee2e9e3e0467f29f345f6e0c"} Dec 03 12:41:18 crc kubenswrapper[4702]: I1203 12:41:18.579274 4702 scope.go:117] "RemoveContainer" containerID="11532436354a2b9b6c867e574a221d30d4f20c4a3384db374a8e359680c91efe" Dec 03 12:41:18 crc kubenswrapper[4702]: I1203 12:41:18.579429 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sr26m" Dec 03 12:41:18 crc kubenswrapper[4702]: I1203 12:41:18.638060 4702 scope.go:117] "RemoveContainer" containerID="c44f24cacdf478d774a767962cbaccdf9f7ecb6b089fa137a7c0fe927ed21fdc" Dec 03 12:41:18 crc kubenswrapper[4702]: I1203 12:41:18.669330 4702 scope.go:117] "RemoveContainer" containerID="4b177b1d9aaf1d336254d501c6bf1c5a5fd509009f468e5a20f3511138cf6bcc" Dec 03 12:41:19 crc kubenswrapper[4702]: I1203 12:41:19.032523 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d370ba60-2ec7-4904-8ada-85984ae3582b-kube-api-access-x647v" (OuterVolumeSpecName: "kube-api-access-x647v") pod "d370ba60-2ec7-4904-8ada-85984ae3582b" (UID: "d370ba60-2ec7-4904-8ada-85984ae3582b"). InnerVolumeSpecName "kube-api-access-x647v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:41:19 crc kubenswrapper[4702]: I1203 12:41:19.044457 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x647v\" (UniqueName: \"kubernetes.io/projected/d370ba60-2ec7-4904-8ada-85984ae3582b-kube-api-access-x647v\") on node \"crc\" DevicePath \"\"" Dec 03 12:41:19 crc kubenswrapper[4702]: I1203 12:41:19.062869 4702 scope.go:117] "RemoveContainer" containerID="450f674aa645e7c8bf68761b64dc940887a71bbde7f961b75ee11f18a0159eca" Dec 03 12:41:19 crc kubenswrapper[4702]: I1203 12:41:19.296690 4702 scope.go:117] "RemoveContainer" containerID="11532436354a2b9b6c867e574a221d30d4f20c4a3384db374a8e359680c91efe" Dec 03 12:41:19 crc kubenswrapper[4702]: E1203 12:41:19.298578 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11532436354a2b9b6c867e574a221d30d4f20c4a3384db374a8e359680c91efe\": container with ID starting with 11532436354a2b9b6c867e574a221d30d4f20c4a3384db374a8e359680c91efe not found: ID does not exist" containerID="11532436354a2b9b6c867e574a221d30d4f20c4a3384db374a8e359680c91efe" Dec 03 12:41:19 crc kubenswrapper[4702]: I1203 12:41:19.298768 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11532436354a2b9b6c867e574a221d30d4f20c4a3384db374a8e359680c91efe"} err="failed to get container status \"11532436354a2b9b6c867e574a221d30d4f20c4a3384db374a8e359680c91efe\": rpc error: code = NotFound desc = could not find container \"11532436354a2b9b6c867e574a221d30d4f20c4a3384db374a8e359680c91efe\": container with ID starting with 11532436354a2b9b6c867e574a221d30d4f20c4a3384db374a8e359680c91efe not found: ID does not exist" Dec 03 12:41:19 crc kubenswrapper[4702]: I1203 12:41:19.298832 4702 scope.go:117] "RemoveContainer" containerID="c44f24cacdf478d774a767962cbaccdf9f7ecb6b089fa137a7c0fe927ed21fdc" Dec 03 12:41:19 crc kubenswrapper[4702]: E1203 12:41:19.299397 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c44f24cacdf478d774a767962cbaccdf9f7ecb6b089fa137a7c0fe927ed21fdc\": container with ID starting with c44f24cacdf478d774a767962cbaccdf9f7ecb6b089fa137a7c0fe927ed21fdc not found: ID does not exist" containerID="c44f24cacdf478d774a767962cbaccdf9f7ecb6b089fa137a7c0fe927ed21fdc" Dec 03 12:41:19 crc kubenswrapper[4702]: I1203 12:41:19.299439 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c44f24cacdf478d774a767962cbaccdf9f7ecb6b089fa137a7c0fe927ed21fdc"} err="failed to get container status \"c44f24cacdf478d774a767962cbaccdf9f7ecb6b089fa137a7c0fe927ed21fdc\": rpc error: code = NotFound desc = could not find container \"c44f24cacdf478d774a767962cbaccdf9f7ecb6b089fa137a7c0fe927ed21fdc\": container with ID starting with c44f24cacdf478d774a767962cbaccdf9f7ecb6b089fa137a7c0fe927ed21fdc not found: ID does not exist" Dec 03 12:41:19 crc kubenswrapper[4702]: I1203 12:41:19.299471 4702 scope.go:117] "RemoveContainer" containerID="4b177b1d9aaf1d336254d501c6bf1c5a5fd509009f468e5a20f3511138cf6bcc" Dec 03 12:41:19 crc kubenswrapper[4702]: E1203 12:41:19.300150 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b177b1d9aaf1d336254d501c6bf1c5a5fd509009f468e5a20f3511138cf6bcc\": container with ID starting with 4b177b1d9aaf1d336254d501c6bf1c5a5fd509009f468e5a20f3511138cf6bcc not found: ID does not 
exist" containerID="4b177b1d9aaf1d336254d501c6bf1c5a5fd509009f468e5a20f3511138cf6bcc" Dec 03 12:41:19 crc kubenswrapper[4702]: I1203 12:41:19.300182 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b177b1d9aaf1d336254d501c6bf1c5a5fd509009f468e5a20f3511138cf6bcc"} err="failed to get container status \"4b177b1d9aaf1d336254d501c6bf1c5a5fd509009f468e5a20f3511138cf6bcc\": rpc error: code = NotFound desc = could not find container \"4b177b1d9aaf1d336254d501c6bf1c5a5fd509009f468e5a20f3511138cf6bcc\": container with ID starting with 4b177b1d9aaf1d336254d501c6bf1c5a5fd509009f468e5a20f3511138cf6bcc not found: ID does not exist" Dec 03 12:41:19 crc kubenswrapper[4702]: I1203 12:41:19.300206 4702 scope.go:117] "RemoveContainer" containerID="450f674aa645e7c8bf68761b64dc940887a71bbde7f961b75ee11f18a0159eca" Dec 03 12:41:19 crc kubenswrapper[4702]: E1203 12:41:19.300466 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"450f674aa645e7c8bf68761b64dc940887a71bbde7f961b75ee11f18a0159eca\": container with ID starting with 450f674aa645e7c8bf68761b64dc940887a71bbde7f961b75ee11f18a0159eca not found: ID does not exist" containerID="450f674aa645e7c8bf68761b64dc940887a71bbde7f961b75ee11f18a0159eca" Dec 03 12:41:19 crc kubenswrapper[4702]: I1203 12:41:19.300487 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"450f674aa645e7c8bf68761b64dc940887a71bbde7f961b75ee11f18a0159eca"} err="failed to get container status \"450f674aa645e7c8bf68761b64dc940887a71bbde7f961b75ee11f18a0159eca\": rpc error: code = NotFound desc = could not find container \"450f674aa645e7c8bf68761b64dc940887a71bbde7f961b75ee11f18a0159eca\": container with ID starting with 450f674aa645e7c8bf68761b64dc940887a71bbde7f961b75ee11f18a0159eca not found: ID does not exist" Dec 03 12:41:19 crc kubenswrapper[4702]: I1203 12:41:19.322054 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sr26m"] Dec 03 12:41:19 crc kubenswrapper[4702]: I1203 12:41:19.346299 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sr26m"] Dec 03 12:41:20 crc kubenswrapper[4702]: I1203 12:41:20.197261 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gjmph" podUID="c55aad18-42f1-4b14-a5fb-686c7a669d40" containerName="registry-server" probeResult="failure" output=< Dec 03 12:41:20 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:41:20 crc kubenswrapper[4702]: > Dec 03 12:41:20 crc kubenswrapper[4702]: I1203 12:41:20.944399 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" path="/var/lib/kubelet/pods/d370ba60-2ec7-4904-8ada-85984ae3582b/volumes" Dec 03 12:41:21 crc kubenswrapper[4702]: I1203 12:41:21.755005 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qhss9" Dec 03 12:41:21 crc kubenswrapper[4702]: I1203 12:41:21.831874 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qhss9" Dec 03 12:41:28 crc kubenswrapper[4702]: I1203 12:41:28.244108 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gjmph" Dec 03 12:41:28 crc kubenswrapper[4702]: I1203 12:41:28.321490 
4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gjmph" Dec 03 12:41:28 crc kubenswrapper[4702]: I1203 12:41:28.925236 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pbxps/must-gather-64pl7" event={"ID":"2a8ef37d-7f53-449e-b954-d1624312e255","Type":"ContainerStarted","Data":"6bf5de5bebca6c8c4d22dfdd5777a3f0bae0d5d30ccd36c25a8931b9aefb5fb4"} Dec 03 12:41:28 crc kubenswrapper[4702]: I1203 12:41:28.925608 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pbxps/must-gather-64pl7" event={"ID":"2a8ef37d-7f53-449e-b954-d1624312e255","Type":"ContainerStarted","Data":"0e300d9dcd1bbbf006690d16340a699de545d6ccc49da48aafa7aecce3daa88e"} Dec 03 12:41:28 crc kubenswrapper[4702]: I1203 12:41:28.959619 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pbxps/must-gather-64pl7" podStartSLOduration=2.626974483 podStartE2EDuration="15.959597879s" podCreationTimestamp="2025-12-03 12:41:13 +0000 UTC" firstStartedPulling="2025-12-03 12:41:14.637698359 +0000 UTC m=+5858.473626823" lastFinishedPulling="2025-12-03 12:41:27.970321765 +0000 UTC m=+5871.806250219" observedRunningTime="2025-12-03 12:41:28.953112614 +0000 UTC m=+5872.789041078" watchObservedRunningTime="2025-12-03 12:41:28.959597879 +0000 UTC m=+5872.795526343" Dec 03 12:41:29 crc kubenswrapper[4702]: I1203 12:41:29.678116 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gjmph"] Dec 03 12:41:29 crc kubenswrapper[4702]: I1203 12:41:29.935244 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gjmph" podUID="c55aad18-42f1-4b14-a5fb-686c7a669d40" containerName="registry-server" containerID="cri-o://e494f32aaca75b27c83a3b344fad16cfc216b50b669ca109c6ae72f4f8a919d2" gracePeriod=2 Dec 03 12:41:30 crc kubenswrapper[4702]: I1203 12:41:30.862976 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjmph" Dec 03 12:41:30 crc kubenswrapper[4702]: I1203 12:41:30.932410 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c55aad18-42f1-4b14-a5fb-686c7a669d40-catalog-content\") pod \"c55aad18-42f1-4b14-a5fb-686c7a669d40\" (UID: \"c55aad18-42f1-4b14-a5fb-686c7a669d40\") " Dec 03 12:41:30 crc kubenswrapper[4702]: I1203 12:41:30.932609 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c55aad18-42f1-4b14-a5fb-686c7a669d40-utilities\") pod \"c55aad18-42f1-4b14-a5fb-686c7a669d40\" (UID: \"c55aad18-42f1-4b14-a5fb-686c7a669d40\") " Dec 03 12:41:30 crc kubenswrapper[4702]: I1203 12:41:30.932713 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl8kp\" (UniqueName: \"kubernetes.io/projected/c55aad18-42f1-4b14-a5fb-686c7a669d40-kube-api-access-zl8kp\") pod \"c55aad18-42f1-4b14-a5fb-686c7a669d40\" (UID: \"c55aad18-42f1-4b14-a5fb-686c7a669d40\") " Dec 03 12:41:30 crc kubenswrapper[4702]: I1203 12:41:30.933363 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c55aad18-42f1-4b14-a5fb-686c7a669d40-utilities" (OuterVolumeSpecName: "utilities") pod "c55aad18-42f1-4b14-a5fb-686c7a669d40" (UID: "c55aad18-42f1-4b14-a5fb-686c7a669d40"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:41:30 crc kubenswrapper[4702]: I1203 12:41:30.942275 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c55aad18-42f1-4b14-a5fb-686c7a669d40-kube-api-access-zl8kp" (OuterVolumeSpecName: "kube-api-access-zl8kp") pod "c55aad18-42f1-4b14-a5fb-686c7a669d40" (UID: "c55aad18-42f1-4b14-a5fb-686c7a669d40"). InnerVolumeSpecName "kube-api-access-zl8kp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:41:30 crc kubenswrapper[4702]: I1203 12:41:30.963197 4702 generic.go:334] "Generic (PLEG): container finished" podID="c55aad18-42f1-4b14-a5fb-686c7a669d40" containerID="e494f32aaca75b27c83a3b344fad16cfc216b50b669ca109c6ae72f4f8a919d2" exitCode=0 Dec 03 12:41:30 crc kubenswrapper[4702]: I1203 12:41:30.963305 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjmph" Dec 03 12:41:30 crc kubenswrapper[4702]: I1203 12:41:30.999005 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjmph" event={"ID":"c55aad18-42f1-4b14-a5fb-686c7a669d40","Type":"ContainerDied","Data":"e494f32aaca75b27c83a3b344fad16cfc216b50b669ca109c6ae72f4f8a919d2"} Dec 03 12:41:30 crc kubenswrapper[4702]: I1203 12:41:30.999069 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjmph" event={"ID":"c55aad18-42f1-4b14-a5fb-686c7a669d40","Type":"ContainerDied","Data":"fc94de6e575e171fb0d79127f5fab5326fc0fd933cfd2e6ee12f3ea209fdcbb5"} Dec 03 12:41:30 crc kubenswrapper[4702]: I1203 12:41:30.999092 4702 scope.go:117] "RemoveContainer" containerID="e494f32aaca75b27c83a3b344fad16cfc216b50b669ca109c6ae72f4f8a919d2" Dec 03 12:41:31 crc kubenswrapper[4702]: I1203 12:41:31.037010 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c55aad18-42f1-4b14-a5fb-686c7a669d40-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:41:31 crc kubenswrapper[4702]: I1203 12:41:31.037175 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl8kp\" (UniqueName: \"kubernetes.io/projected/c55aad18-42f1-4b14-a5fb-686c7a669d40-kube-api-access-zl8kp\") on node \"crc\" DevicePath \"\"" Dec 03 12:41:31 crc kubenswrapper[4702]: I1203 12:41:31.042208 4702 scope.go:117] "RemoveContainer" containerID="e1c884863c69adffab91e04bd0b0bc6496ff63c33e8430a6407ecb1e716839e0" Dec 03 12:41:31 crc kubenswrapper[4702]: I1203 12:41:31.073434 4702 scope.go:117] "RemoveContainer" containerID="0ec081121d39447a5f7fd14b4f4cb1e337db237510efd4b77a50cd26be2bb998" Dec 03 12:41:31 crc kubenswrapper[4702]: I1203 12:41:31.083954 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c55aad18-42f1-4b14-a5fb-686c7a669d40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c55aad18-42f1-4b14-a5fb-686c7a669d40" (UID: "c55aad18-42f1-4b14-a5fb-686c7a669d40"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:41:31 crc kubenswrapper[4702]: I1203 12:41:31.133233 4702 scope.go:117] "RemoveContainer" containerID="e494f32aaca75b27c83a3b344fad16cfc216b50b669ca109c6ae72f4f8a919d2" Dec 03 12:41:31 crc kubenswrapper[4702]: E1203 12:41:31.136205 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e494f32aaca75b27c83a3b344fad16cfc216b50b669ca109c6ae72f4f8a919d2\": container with ID starting with e494f32aaca75b27c83a3b344fad16cfc216b50b669ca109c6ae72f4f8a919d2 not found: ID does not exist" containerID="e494f32aaca75b27c83a3b344fad16cfc216b50b669ca109c6ae72f4f8a919d2" Dec 03 12:41:31 crc kubenswrapper[4702]: I1203 12:41:31.136256 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e494f32aaca75b27c83a3b344fad16cfc216b50b669ca109c6ae72f4f8a919d2"} err="failed to get container status \"e494f32aaca75b27c83a3b344fad16cfc216b50b669ca109c6ae72f4f8a919d2\": rpc error: code = NotFound desc = could not find container \"e494f32aaca75b27c83a3b344fad16cfc216b50b669ca109c6ae72f4f8a919d2\": container with ID starting with e494f32aaca75b27c83a3b344fad16cfc216b50b669ca109c6ae72f4f8a919d2 not found: ID does not exist" Dec 03 12:41:31 crc kubenswrapper[4702]: I1203 12:41:31.136972 4702 scope.go:117] "RemoveContainer" containerID="e1c884863c69adffab91e04bd0b0bc6496ff63c33e8430a6407ecb1e716839e0" Dec 03 12:41:31 crc kubenswrapper[4702]: E1203 12:41:31.137422 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1c884863c69adffab91e04bd0b0bc6496ff63c33e8430a6407ecb1e716839e0\": container with ID starting with e1c884863c69adffab91e04bd0b0bc6496ff63c33e8430a6407ecb1e716839e0 not found: ID does not exist" containerID="e1c884863c69adffab91e04bd0b0bc6496ff63c33e8430a6407ecb1e716839e0" Dec 03 12:41:31 crc kubenswrapper[4702]: I1203 12:41:31.137454 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1c884863c69adffab91e04bd0b0bc6496ff63c33e8430a6407ecb1e716839e0"} err="failed to get container status \"e1c884863c69adffab91e04bd0b0bc6496ff63c33e8430a6407ecb1e716839e0\": rpc error: code = NotFound desc = could not find container \"e1c884863c69adffab91e04bd0b0bc6496ff63c33e8430a6407ecb1e716839e0\": container with ID starting with e1c884863c69adffab91e04bd0b0bc6496ff63c33e8430a6407ecb1e716839e0 not found: ID does not exist" Dec 03 12:41:31 crc kubenswrapper[4702]: I1203 12:41:31.137472 4702 scope.go:117] "RemoveContainer" containerID="0ec081121d39447a5f7fd14b4f4cb1e337db237510efd4b77a50cd26be2bb998" Dec 03 12:41:31 crc kubenswrapper[4702]: E1203 12:41:31.139560 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ec081121d39447a5f7fd14b4f4cb1e337db237510efd4b77a50cd26be2bb998\": container with ID starting with 0ec081121d39447a5f7fd14b4f4cb1e337db237510efd4b77a50cd26be2bb998 not found: ID does not exist" containerID="0ec081121d39447a5f7fd14b4f4cb1e337db237510efd4b77a50cd26be2bb998" Dec 03 12:41:31 crc kubenswrapper[4702]: I1203 12:41:31.139606 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ec081121d39447a5f7fd14b4f4cb1e337db237510efd4b77a50cd26be2bb998"} err="failed to get container status \"0ec081121d39447a5f7fd14b4f4cb1e337db237510efd4b77a50cd26be2bb998\": rpc error: code = NotFound desc = could not 
find container \"0ec081121d39447a5f7fd14b4f4cb1e337db237510efd4b77a50cd26be2bb998\": container with ID starting with 0ec081121d39447a5f7fd14b4f4cb1e337db237510efd4b77a50cd26be2bb998 not found: ID does not exist" Dec 03 12:41:31 crc kubenswrapper[4702]: I1203 12:41:31.140257 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c55aad18-42f1-4b14-a5fb-686c7a669d40-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:41:31 crc kubenswrapper[4702]: I1203 12:41:31.307484 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gjmph"] Dec 03 12:41:31 crc kubenswrapper[4702]: I1203 12:41:31.320337 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gjmph"] Dec 03 12:41:32 crc kubenswrapper[4702]: I1203 12:41:32.944491 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c55aad18-42f1-4b14-a5fb-686c7a669d40" path="/var/lib/kubelet/pods/c55aad18-42f1-4b14-a5fb-686c7a669d40/volumes" Dec 03 12:41:33 crc kubenswrapper[4702]: I1203 12:41:33.003096 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 12:41:40 crc kubenswrapper[4702]: I1203 12:41:40.685358 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pbxps/crc-debug-72llf"] Dec 03 12:41:40 crc kubenswrapper[4702]: E1203 12:41:40.686521 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55aad18-42f1-4b14-a5fb-686c7a669d40" containerName="registry-server" Dec 03 12:41:40 crc kubenswrapper[4702]: I1203 12:41:40.686540 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55aad18-42f1-4b14-a5fb-686c7a669d40" containerName="registry-server" Dec 03 12:41:40 crc kubenswrapper[4702]: E1203 12:41:40.686583 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f228995-3447-4e6c-bff8-81d1a1a2f8d2" containerName="registry-server" Dec 03 12:41:40 crc kubenswrapper[4702]: I1203 12:41:40.686591 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f228995-3447-4e6c-bff8-81d1a1a2f8d2" containerName="registry-server" Dec 03 12:41:40 crc kubenswrapper[4702]: E1203 12:41:40.686616 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="extract-utilities" Dec 03 12:41:40 crc kubenswrapper[4702]: I1203 12:41:40.686647 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="extract-utilities" Dec 03 12:41:40 crc kubenswrapper[4702]: E1203 12:41:40.686661 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="extract-content" Dec 03 12:41:40 crc kubenswrapper[4702]: I1203 12:41:40.686668 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="extract-content" Dec 03 12:41:40 crc kubenswrapper[4702]: E1203 12:41:40.686680 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55aad18-42f1-4b14-a5fb-686c7a669d40" containerName="extract-content" Dec 03 12:41:40 crc kubenswrapper[4702]: I1203 12:41:40.686687 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55aad18-42f1-4b14-a5fb-686c7a669d40" containerName="extract-content" Dec 03 12:41:40 crc kubenswrapper[4702]: E1203 12:41:40.686728 4702 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="registry-server" Dec 03 12:41:40 crc kubenswrapper[4702]: I1203 12:41:40.686736 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="registry-server" Dec 03 12:41:40 crc kubenswrapper[4702]: E1203 12:41:40.686750 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f228995-3447-4e6c-bff8-81d1a1a2f8d2" containerName="extract-utilities" Dec 03 12:41:40 crc kubenswrapper[4702]: I1203 12:41:40.686772 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f228995-3447-4e6c-bff8-81d1a1a2f8d2" containerName="extract-utilities" Dec 03 12:41:40 crc kubenswrapper[4702]: E1203 12:41:40.686787 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f228995-3447-4e6c-bff8-81d1a1a2f8d2" containerName="extract-content" Dec 03 12:41:40 crc kubenswrapper[4702]: I1203 12:41:40.686796 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f228995-3447-4e6c-bff8-81d1a1a2f8d2" containerName="extract-content" Dec 03 12:41:40 crc kubenswrapper[4702]: E1203 12:41:40.686819 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="registry-server" Dec 03 12:41:40 crc kubenswrapper[4702]: I1203 12:41:40.686826 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="registry-server" Dec 03 12:41:40 crc kubenswrapper[4702]: E1203 12:41:40.686848 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c55aad18-42f1-4b14-a5fb-686c7a669d40" containerName="extract-utilities" Dec 03 12:41:40 crc kubenswrapper[4702]: I1203 12:41:40.686857 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="c55aad18-42f1-4b14-a5fb-686c7a669d40" containerName="extract-utilities" Dec 03 12:41:40 crc kubenswrapper[4702]: I1203 12:41:40.687174 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f228995-3447-4e6c-bff8-81d1a1a2f8d2" containerName="registry-server" Dec 03 12:41:40 crc kubenswrapper[4702]: I1203 12:41:40.687220 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="registry-server" Dec 03 12:41:40 crc kubenswrapper[4702]: I1203 12:41:40.687237 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="d370ba60-2ec7-4904-8ada-85984ae3582b" containerName="registry-server" Dec 03 12:41:40 crc kubenswrapper[4702]: I1203 12:41:40.687248 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="c55aad18-42f1-4b14-a5fb-686c7a669d40" containerName="registry-server" Dec 03 12:41:40 crc kubenswrapper[4702]: I1203 12:41:40.689086 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pbxps/crc-debug-72llf" Dec 03 12:41:40 crc kubenswrapper[4702]: I1203 12:41:40.789354 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z76vp\" (UniqueName: \"kubernetes.io/projected/2f1c19b8-cf48-44ba-ade8-152590a563a9-kube-api-access-z76vp\") pod \"crc-debug-72llf\" (UID: \"2f1c19b8-cf48-44ba-ade8-152590a563a9\") " pod="openshift-must-gather-pbxps/crc-debug-72llf" Dec 03 12:41:40 crc kubenswrapper[4702]: I1203 12:41:40.791166 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f1c19b8-cf48-44ba-ade8-152590a563a9-host\") pod \"crc-debug-72llf\" (UID: \"2f1c19b8-cf48-44ba-ade8-152590a563a9\") " pod="openshift-must-gather-pbxps/crc-debug-72llf" Dec 03 12:41:40 crc kubenswrapper[4702]: I1203 12:41:40.894184 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z76vp\" (UniqueName: \"kubernetes.io/projected/2f1c19b8-cf48-44ba-ade8-152590a563a9-kube-api-access-z76vp\") pod \"crc-debug-72llf\" (UID: \"2f1c19b8-cf48-44ba-ade8-152590a563a9\") " pod="openshift-must-gather-pbxps/crc-debug-72llf" Dec 03 12:41:40 crc kubenswrapper[4702]: I1203 12:41:40.894528 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f1c19b8-cf48-44ba-ade8-152590a563a9-host\") pod \"crc-debug-72llf\" (UID: \"2f1c19b8-cf48-44ba-ade8-152590a563a9\") " pod="openshift-must-gather-pbxps/crc-debug-72llf" Dec 03 12:41:40 crc kubenswrapper[4702]: I1203 12:41:40.894909 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f1c19b8-cf48-44ba-ade8-152590a563a9-host\") pod \"crc-debug-72llf\" (UID: \"2f1c19b8-cf48-44ba-ade8-152590a563a9\") " pod="openshift-must-gather-pbxps/crc-debug-72llf" Dec 03 12:41:40 crc kubenswrapper[4702]: I1203 12:41:40.924048 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z76vp\" (UniqueName: \"kubernetes.io/projected/2f1c19b8-cf48-44ba-ade8-152590a563a9-kube-api-access-z76vp\") pod \"crc-debug-72llf\" (UID: \"2f1c19b8-cf48-44ba-ade8-152590a563a9\") " pod="openshift-must-gather-pbxps/crc-debug-72llf" Dec 03 12:41:41 crc kubenswrapper[4702]: I1203 12:41:41.013847 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pbxps/crc-debug-72llf" Dec 03 12:41:41 crc kubenswrapper[4702]: I1203 12:41:41.105325 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pbxps/crc-debug-72llf" event={"ID":"2f1c19b8-cf48-44ba-ade8-152590a563a9","Type":"ContainerStarted","Data":"1dcf08e2239d94b9198fcc9d95e5d0a57d32446e53739e2a7fbd43eef4b2e4f0"} Dec 03 12:41:48 crc kubenswrapper[4702]: I1203 12:41:48.991163 4702 trace.go:236] Trace[1196304243]: "Calculate volume metrics of cni-binary-copy for pod openshift-multus/multus-additional-cni-plugins-z8lld" (03-Dec-2025 12:41:47.311) (total time: 1678ms): Dec 03 12:41:48 crc kubenswrapper[4702]: Trace[1196304243]: [1.678881478s] [1.678881478s] END Dec 03 12:41:49 crc kubenswrapper[4702]: I1203 12:41:49.011474 4702 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.206390215s: [/var/lib/containers/storage/overlay/3a689f8a8a21ae57f6733ad5053dce13598951878f3d8827879f839f2c1e101c/diff /var/log/pods/openstack_nova-cell0-conductor-0_9f7dd418-9620-49e3-8eaf-6aa1d4a1434b/nova-cell0-conductor-conductor/0.log]; will not log again for this container unless duration exceeds 2s Dec 03 12:41:50 crc kubenswrapper[4702]: I1203 12:41:50.004695 4702 trace.go:236] Trace[182758514]: "Calculate volume metrics of storage for pod minio-dev/minio" (03-Dec-2025 12:41:47.839) (total time: 2165ms): Dec 03 12:41:50 crc kubenswrapper[4702]: Trace[182758514]: [2.165127236s] [2.165127236s] END Dec 03 12:41:56 crc kubenswrapper[4702]: I1203 12:41:56.628024 4702 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-vchd7 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:41:56 crc kubenswrapper[4702]: I1203 12:41:56.628692 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" podUID="a49f8d97-9fa5-44b6-bd39-e35d4d70b33c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:41:56 crc kubenswrapper[4702]: I1203 12:41:56.670017 4702 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-vchd7 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 12:41:56 crc kubenswrapper[4702]: I1203 12:41:56.670108 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vchd7" podUID="a49f8d97-9fa5-44b6-bd39-e35d4d70b33c" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 12:41:56 crc kubenswrapper[4702]: I1203 12:41:56.905266 4702 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-xxj8s container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.43:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" 
start-of-body=
Dec 03 12:41:56 crc kubenswrapper[4702]: I1203 12:41:56.905339 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" podUID="0276c6fb-ba7a-459f-9610-34a03593669b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.43:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:41:56 crc kubenswrapper[4702]: I1203 12:41:56.905534 4702 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-xxj8s container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.43:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:41:56 crc kubenswrapper[4702]: I1203 12:41:56.905627 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xxj8s" podUID="0276c6fb-ba7a-459f-9610-34a03593669b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.43:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:42:06 crc kubenswrapper[4702]: I1203 12:42:06.565427 4702 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6kdsp container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 12:42:06 crc kubenswrapper[4702]: I1203 12:42:06.566024 4702 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6kdsp" podUID="3de04148-0009-427b-8055-a1c5dadb8274" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 12:42:08 crc kubenswrapper[4702]: I1203 12:42:08.716453 4702 generic.go:334] "Generic (PLEG): container finished" podID="56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff" containerID="e36df22670a6d359cb2ddc47ec0abc22c2b0001877dbfb6610c77f76e758779e" exitCode=0
Dec 03 12:42:08 crc kubenswrapper[4702]: I1203 12:42:08.716548 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" event={"ID":"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff","Type":"ContainerDied","Data":"e36df22670a6d359cb2ddc47ec0abc22c2b0001877dbfb6610c77f76e758779e"}
Dec 03 12:42:09 crc kubenswrapper[4702]: E1203 12:42:09.597298 4702 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296"
Dec 03 12:42:09 crc kubenswrapper[4702]: E1203 12:42:09.598822 4702 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z76vp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-72llf_openshift-must-gather-pbxps(2f1c19b8-cf48-44ba-ade8-152590a563a9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Dec 03 12:42:09 crc kubenswrapper[4702]: E1203 12:42:09.599943 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-pbxps/crc-debug-72llf" podUID="2f1c19b8-cf48-44ba-ade8-152590a563a9"
Dec 03 12:42:09 crc kubenswrapper[4702]: E1203 12:42:09.736037 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-pbxps/crc-debug-72llf" podUID="2f1c19b8-cf48-44ba-ade8-152590a563a9"
Dec 03 12:42:10 crc kubenswrapper[4702]: I1203 12:42:10.745981 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2" event={"ID":"56b7fbc7-795b-4a66-b427-6d4fd0cdf0ff","Type":"ContainerStarted","Data":"7c61fc6168cde6cd34c3d306c329d1db5b10bd9891403d2833211d5e9c0e262d"}
Dec 03 12:42:24 crc kubenswrapper[4702]: I1203 12:42:24.915354 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pbxps/crc-debug-72llf" event={"ID":"2f1c19b8-cf48-44ba-ade8-152590a563a9","Type":"ContainerStarted","Data":"2aac4042f74b4e23cbbbcb44d2635462929d9ab4242859650d201adf39ad0771"}
Dec 03 12:42:24 crc kubenswrapper[4702]: I1203 12:42:24.943576 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pbxps/crc-debug-72llf" podStartSLOduration=2.546101067 podStartE2EDuration="44.943552199s" podCreationTimestamp="2025-12-03 12:41:40 +0000 UTC" firstStartedPulling="2025-12-03 12:41:41.076858565 +0000 UTC m=+5884.912787029" lastFinishedPulling="2025-12-03 12:42:23.474309697 +0000 UTC m=+5927.310238161" observedRunningTime="2025-12-03 12:42:24.93309017 +0000 UTC m=+5928.769018634" watchObservedRunningTime="2025-12-03 12:42:24.943552199 +0000 UTC m=+5928.779480673"
Dec 03 12:42:25 crc kubenswrapper[4702]: I1203 12:42:25.865869 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2"
Dec 03 12:42:25 crc kubenswrapper[4702]: I1203 12:42:25.866222 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2"
Dec 03 12:42:45 crc kubenswrapper[4702]: I1203 12:42:45.872789 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2"
Dec 03 12:42:45 crc kubenswrapper[4702]: I1203 12:42:45.879413 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6975dd785d-5bvc2"
Dec 03 12:43:11 crc kubenswrapper[4702]: I1203 12:43:11.558401 4702 generic.go:334] "Generic (PLEG): container finished" podID="2f1c19b8-cf48-44ba-ade8-152590a563a9" containerID="2aac4042f74b4e23cbbbcb44d2635462929d9ab4242859650d201adf39ad0771" exitCode=0
Dec 03 12:43:11 crc kubenswrapper[4702]: I1203 12:43:11.558482 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pbxps/crc-debug-72llf" event={"ID":"2f1c19b8-cf48-44ba-ade8-152590a563a9","Type":"ContainerDied","Data":"2aac4042f74b4e23cbbbcb44d2635462929d9ab4242859650d201adf39ad0771"}
Dec 03 12:43:12 crc kubenswrapper[4702]: I1203 12:43:12.706511 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pbxps/crc-debug-72llf"
Dec 03 12:43:12 crc kubenswrapper[4702]: I1203 12:43:12.725563 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z76vp\" (UniqueName: \"kubernetes.io/projected/2f1c19b8-cf48-44ba-ade8-152590a563a9-kube-api-access-z76vp\") pod \"2f1c19b8-cf48-44ba-ade8-152590a563a9\" (UID: \"2f1c19b8-cf48-44ba-ade8-152590a563a9\") "
Dec 03 12:43:12 crc kubenswrapper[4702]: I1203 12:43:12.725662 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f1c19b8-cf48-44ba-ade8-152590a563a9-host\") pod \"2f1c19b8-cf48-44ba-ade8-152590a563a9\" (UID: \"2f1c19b8-cf48-44ba-ade8-152590a563a9\") "
Dec 03 12:43:12 crc kubenswrapper[4702]: I1203 12:43:12.725893 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f1c19b8-cf48-44ba-ade8-152590a563a9-host" (OuterVolumeSpecName: "host") pod "2f1c19b8-cf48-44ba-ade8-152590a563a9" (UID: "2f1c19b8-cf48-44ba-ade8-152590a563a9"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:43:12 crc kubenswrapper[4702]: I1203 12:43:12.726831 4702 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2f1c19b8-cf48-44ba-ade8-152590a563a9-host\") on node \"crc\" DevicePath \"\""
Dec 03 12:43:12 crc kubenswrapper[4702]: I1203 12:43:12.735772 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f1c19b8-cf48-44ba-ade8-152590a563a9-kube-api-access-z76vp" (OuterVolumeSpecName: "kube-api-access-z76vp") pod "2f1c19b8-cf48-44ba-ade8-152590a563a9" (UID: "2f1c19b8-cf48-44ba-ade8-152590a563a9"). InnerVolumeSpecName "kube-api-access-z76vp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:43:12 crc kubenswrapper[4702]: I1203 12:43:12.756374 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pbxps/crc-debug-72llf"]
Dec 03 12:43:12 crc kubenswrapper[4702]: I1203 12:43:12.769540 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pbxps/crc-debug-72llf"]
Dec 03 12:43:12 crc kubenswrapper[4702]: I1203 12:43:12.829051 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z76vp\" (UniqueName: \"kubernetes.io/projected/2f1c19b8-cf48-44ba-ade8-152590a563a9-kube-api-access-z76vp\") on node \"crc\" DevicePath \"\""
Dec 03 12:43:12 crc kubenswrapper[4702]: I1203 12:43:12.945795 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f1c19b8-cf48-44ba-ade8-152590a563a9" path="/var/lib/kubelet/pods/2f1c19b8-cf48-44ba-ade8-152590a563a9/volumes"
Dec 03 12:43:13 crc kubenswrapper[4702]: I1203 12:43:13.601875 4702 scope.go:117] "RemoveContainer" containerID="2aac4042f74b4e23cbbbcb44d2635462929d9ab4242859650d201adf39ad0771"
Dec 03 12:43:13 crc kubenswrapper[4702]: I1203 12:43:13.602190 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pbxps/crc-debug-72llf"
Dec 03 12:43:13 crc kubenswrapper[4702]: I1203 12:43:13.960072 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pbxps/crc-debug-7d4gw"]
Dec 03 12:43:13 crc kubenswrapper[4702]: E1203 12:43:13.960897 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1c19b8-cf48-44ba-ade8-152590a563a9" containerName="container-00"
Dec 03 12:43:13 crc kubenswrapper[4702]: I1203 12:43:13.960912 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1c19b8-cf48-44ba-ade8-152590a563a9" containerName="container-00"
Dec 03 12:43:13 crc kubenswrapper[4702]: I1203 12:43:13.961184 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1c19b8-cf48-44ba-ade8-152590a563a9" containerName="container-00"
Dec 03 12:43:13 crc kubenswrapper[4702]: I1203 12:43:13.962144 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pbxps/crc-debug-7d4gw"
Dec 03 12:43:14 crc kubenswrapper[4702]: I1203 12:43:14.066376 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25f4137d-5741-4cd9-9f65-7b3d37d5b638-host\") pod \"crc-debug-7d4gw\" (UID: \"25f4137d-5741-4cd9-9f65-7b3d37d5b638\") " pod="openshift-must-gather-pbxps/crc-debug-7d4gw"
Dec 03 12:43:14 crc kubenswrapper[4702]: I1203 12:43:14.067034 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw26j\" (UniqueName: \"kubernetes.io/projected/25f4137d-5741-4cd9-9f65-7b3d37d5b638-kube-api-access-hw26j\") pod \"crc-debug-7d4gw\" (UID: \"25f4137d-5741-4cd9-9f65-7b3d37d5b638\") " pod="openshift-must-gather-pbxps/crc-debug-7d4gw"
Dec 03 12:43:14 crc kubenswrapper[4702]: I1203 12:43:14.168867 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw26j\" (UniqueName: \"kubernetes.io/projected/25f4137d-5741-4cd9-9f65-7b3d37d5b638-kube-api-access-hw26j\") pod \"crc-debug-7d4gw\" (UID: \"25f4137d-5741-4cd9-9f65-7b3d37d5b638\") " pod="openshift-must-gather-pbxps/crc-debug-7d4gw"
Dec 03 12:43:14 crc kubenswrapper[4702]: I1203 12:43:14.169039 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25f4137d-5741-4cd9-9f65-7b3d37d5b638-host\") pod \"crc-debug-7d4gw\" (UID: \"25f4137d-5741-4cd9-9f65-7b3d37d5b638\") " pod="openshift-must-gather-pbxps/crc-debug-7d4gw"
Dec 03 12:43:14 crc kubenswrapper[4702]: I1203 12:43:14.169167 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25f4137d-5741-4cd9-9f65-7b3d37d5b638-host\") pod \"crc-debug-7d4gw\" (UID: \"25f4137d-5741-4cd9-9f65-7b3d37d5b638\") " pod="openshift-must-gather-pbxps/crc-debug-7d4gw"
Dec 03 12:43:14 crc kubenswrapper[4702]: I1203 12:43:14.187886 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw26j\" (UniqueName: \"kubernetes.io/projected/25f4137d-5741-4cd9-9f65-7b3d37d5b638-kube-api-access-hw26j\") pod \"crc-debug-7d4gw\" (UID: \"25f4137d-5741-4cd9-9f65-7b3d37d5b638\") " pod="openshift-must-gather-pbxps/crc-debug-7d4gw"
Dec 03 12:43:14 crc kubenswrapper[4702]: I1203 12:43:14.286367 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pbxps/crc-debug-7d4gw"
Dec 03 12:43:14 crc kubenswrapper[4702]: I1203 12:43:14.619010 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pbxps/crc-debug-7d4gw" event={"ID":"25f4137d-5741-4cd9-9f65-7b3d37d5b638","Type":"ContainerStarted","Data":"b46a07216cc53ccfa2467280367a690f75fbe4b73726f793cb1b8f6639170210"}
Dec 03 12:43:15 crc kubenswrapper[4702]: I1203 12:43:15.630941 4702 generic.go:334] "Generic (PLEG): container finished" podID="25f4137d-5741-4cd9-9f65-7b3d37d5b638" containerID="6b31c519b95002c132edd89b92d0ecc3aebe03cfdede235379f983d90dbf2208" exitCode=0
Dec 03 12:43:15 crc kubenswrapper[4702]: I1203 12:43:15.631001 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pbxps/crc-debug-7d4gw" event={"ID":"25f4137d-5741-4cd9-9f65-7b3d37d5b638","Type":"ContainerDied","Data":"6b31c519b95002c132edd89b92d0ecc3aebe03cfdede235379f983d90dbf2208"}
Dec 03 12:43:16 crc kubenswrapper[4702]: I1203 12:43:16.822524 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pbxps/crc-debug-7d4gw"]
Dec 03 12:43:16 crc kubenswrapper[4702]: I1203 12:43:16.833316 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pbxps/crc-debug-7d4gw"]
Dec 03 12:43:17 crc kubenswrapper[4702]: I1203 12:43:17.466616 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pbxps/crc-debug-7d4gw"
Dec 03 12:43:17 crc kubenswrapper[4702]: I1203 12:43:17.556279 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25f4137d-5741-4cd9-9f65-7b3d37d5b638-host\") pod \"25f4137d-5741-4cd9-9f65-7b3d37d5b638\" (UID: \"25f4137d-5741-4cd9-9f65-7b3d37d5b638\") "
Dec 03 12:43:17 crc kubenswrapper[4702]: I1203 12:43:17.556436 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25f4137d-5741-4cd9-9f65-7b3d37d5b638-host" (OuterVolumeSpecName: "host") pod "25f4137d-5741-4cd9-9f65-7b3d37d5b638" (UID: "25f4137d-5741-4cd9-9f65-7b3d37d5b638"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:43:17 crc kubenswrapper[4702]: I1203 12:43:17.556676 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw26j\" (UniqueName: \"kubernetes.io/projected/25f4137d-5741-4cd9-9f65-7b3d37d5b638-kube-api-access-hw26j\") pod \"25f4137d-5741-4cd9-9f65-7b3d37d5b638\" (UID: \"25f4137d-5741-4cd9-9f65-7b3d37d5b638\") "
Dec 03 12:43:17 crc kubenswrapper[4702]: I1203 12:43:17.557800 4702 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/25f4137d-5741-4cd9-9f65-7b3d37d5b638-host\") on node \"crc\" DevicePath \"\""
Dec 03 12:43:17 crc kubenswrapper[4702]: I1203 12:43:17.624984 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25f4137d-5741-4cd9-9f65-7b3d37d5b638-kube-api-access-hw26j" (OuterVolumeSpecName: "kube-api-access-hw26j") pod "25f4137d-5741-4cd9-9f65-7b3d37d5b638" (UID: "25f4137d-5741-4cd9-9f65-7b3d37d5b638"). InnerVolumeSpecName "kube-api-access-hw26j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:43:17 crc kubenswrapper[4702]: I1203 12:43:17.661949 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw26j\" (UniqueName: \"kubernetes.io/projected/25f4137d-5741-4cd9-9f65-7b3d37d5b638-kube-api-access-hw26j\") on node \"crc\" DevicePath \"\""
Dec 03 12:43:17 crc kubenswrapper[4702]: I1203 12:43:17.667597 4702 scope.go:117] "RemoveContainer" containerID="6b31c519b95002c132edd89b92d0ecc3aebe03cfdede235379f983d90dbf2208"
Dec 03 12:43:17 crc kubenswrapper[4702]: I1203 12:43:17.667669 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pbxps/crc-debug-7d4gw"
Dec 03 12:43:18 crc kubenswrapper[4702]: I1203 12:43:18.547027 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pbxps/crc-debug-rqm4j"]
Dec 03 12:43:18 crc kubenswrapper[4702]: E1203 12:43:18.548200 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25f4137d-5741-4cd9-9f65-7b3d37d5b638" containerName="container-00"
Dec 03 12:43:18 crc kubenswrapper[4702]: I1203 12:43:18.548222 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f4137d-5741-4cd9-9f65-7b3d37d5b638" containerName="container-00"
Dec 03 12:43:18 crc kubenswrapper[4702]: I1203 12:43:18.548607 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="25f4137d-5741-4cd9-9f65-7b3d37d5b638" containerName="container-00"
Dec 03 12:43:18 crc kubenswrapper[4702]: I1203 12:43:18.549824 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pbxps/crc-debug-rqm4j"
Dec 03 12:43:18 crc kubenswrapper[4702]: I1203 12:43:18.691312 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0451f8b5-c0e1-443e-83be-1cae97c6592b-host\") pod \"crc-debug-rqm4j\" (UID: \"0451f8b5-c0e1-443e-83be-1cae97c6592b\") " pod="openshift-must-gather-pbxps/crc-debug-rqm4j"
Dec 03 12:43:18 crc kubenswrapper[4702]: I1203 12:43:18.691852 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlgfq\" (UniqueName: \"kubernetes.io/projected/0451f8b5-c0e1-443e-83be-1cae97c6592b-kube-api-access-tlgfq\") pod \"crc-debug-rqm4j\" (UID: \"0451f8b5-c0e1-443e-83be-1cae97c6592b\") " pod="openshift-must-gather-pbxps/crc-debug-rqm4j"
Dec 03 12:43:18 crc kubenswrapper[4702]: I1203 12:43:18.795675 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0451f8b5-c0e1-443e-83be-1cae97c6592b-host\") pod \"crc-debug-rqm4j\" (UID: \"0451f8b5-c0e1-443e-83be-1cae97c6592b\") " pod="openshift-must-gather-pbxps/crc-debug-rqm4j"
Dec 03 12:43:18 crc kubenswrapper[4702]: I1203 12:43:18.795811 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0451f8b5-c0e1-443e-83be-1cae97c6592b-host\") pod \"crc-debug-rqm4j\" (UID: \"0451f8b5-c0e1-443e-83be-1cae97c6592b\") " pod="openshift-must-gather-pbxps/crc-debug-rqm4j"
Dec 03 12:43:18 crc kubenswrapper[4702]: I1203 12:43:18.795889 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlgfq\" (UniqueName: \"kubernetes.io/projected/0451f8b5-c0e1-443e-83be-1cae97c6592b-kube-api-access-tlgfq\") pod \"crc-debug-rqm4j\" (UID: \"0451f8b5-c0e1-443e-83be-1cae97c6592b\") " pod="openshift-must-gather-pbxps/crc-debug-rqm4j"
Dec 03 12:43:18 crc kubenswrapper[4702]: I1203 12:43:18.820985 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlgfq\" (UniqueName: \"kubernetes.io/projected/0451f8b5-c0e1-443e-83be-1cae97c6592b-kube-api-access-tlgfq\") pod \"crc-debug-rqm4j\" (UID: \"0451f8b5-c0e1-443e-83be-1cae97c6592b\") " pod="openshift-must-gather-pbxps/crc-debug-rqm4j"
Dec 03 12:43:18 crc kubenswrapper[4702]: I1203 12:43:18.870678 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pbxps/crc-debug-rqm4j"
Dec 03 12:43:18 crc kubenswrapper[4702]: W1203 12:43:18.914070 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0451f8b5_c0e1_443e_83be_1cae97c6592b.slice/crio-3e0705a9989926d3bb8b4ffb70e8db647c0c3f56e4d64c31271ea66078aff6aa WatchSource:0}: Error finding container 3e0705a9989926d3bb8b4ffb70e8db647c0c3f56e4d64c31271ea66078aff6aa: Status 404 returned error can't find the container with id 3e0705a9989926d3bb8b4ffb70e8db647c0c3f56e4d64c31271ea66078aff6aa
Dec 03 12:43:18 crc kubenswrapper[4702]: I1203 12:43:18.950469 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25f4137d-5741-4cd9-9f65-7b3d37d5b638" path="/var/lib/kubelet/pods/25f4137d-5741-4cd9-9f65-7b3d37d5b638/volumes"
Dec 03 12:43:19 crc kubenswrapper[4702]: I1203 12:43:19.696183 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pbxps/crc-debug-rqm4j" event={"ID":"0451f8b5-c0e1-443e-83be-1cae97c6592b","Type":"ContainerStarted","Data":"3e0705a9989926d3bb8b4ffb70e8db647c0c3f56e4d64c31271ea66078aff6aa"}
Dec 03 12:43:20 crc kubenswrapper[4702]: I1203 12:43:20.709678 4702 generic.go:334] "Generic (PLEG): container finished" podID="0451f8b5-c0e1-443e-83be-1cae97c6592b" containerID="7875d019b005520ed89a511fae2dda78f52ce372c0c041f6d574eb041c308b89" exitCode=0
Dec 03 12:43:20 crc kubenswrapper[4702]: I1203 12:43:20.710003 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pbxps/crc-debug-rqm4j" event={"ID":"0451f8b5-c0e1-443e-83be-1cae97c6592b","Type":"ContainerDied","Data":"7875d019b005520ed89a511fae2dda78f52ce372c0c041f6d574eb041c308b89"}
Dec 03 12:43:20 crc kubenswrapper[4702]: I1203 12:43:20.759851 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pbxps/crc-debug-rqm4j"]
Dec 03 12:43:20 crc kubenswrapper[4702]: I1203 12:43:20.774541 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pbxps/crc-debug-rqm4j"]
Dec 03 12:43:21 crc kubenswrapper[4702]: I1203 12:43:21.879538 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pbxps/crc-debug-rqm4j"
Dec 03 12:43:22 crc kubenswrapper[4702]: I1203 12:43:22.029980 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlgfq\" (UniqueName: \"kubernetes.io/projected/0451f8b5-c0e1-443e-83be-1cae97c6592b-kube-api-access-tlgfq\") pod \"0451f8b5-c0e1-443e-83be-1cae97c6592b\" (UID: \"0451f8b5-c0e1-443e-83be-1cae97c6592b\") "
Dec 03 12:43:22 crc kubenswrapper[4702]: I1203 12:43:22.030075 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0451f8b5-c0e1-443e-83be-1cae97c6592b-host\") pod \"0451f8b5-c0e1-443e-83be-1cae97c6592b\" (UID: \"0451f8b5-c0e1-443e-83be-1cae97c6592b\") "
Dec 03 12:43:22 crc kubenswrapper[4702]: I1203 12:43:22.030162 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0451f8b5-c0e1-443e-83be-1cae97c6592b-host" (OuterVolumeSpecName: "host") pod "0451f8b5-c0e1-443e-83be-1cae97c6592b" (UID: "0451f8b5-c0e1-443e-83be-1cae97c6592b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 12:43:22 crc kubenswrapper[4702]: I1203 12:43:22.031539 4702 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0451f8b5-c0e1-443e-83be-1cae97c6592b-host\") on node \"crc\" DevicePath \"\""
Dec 03 12:43:22 crc kubenswrapper[4702]: I1203 12:43:22.041192 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0451f8b5-c0e1-443e-83be-1cae97c6592b-kube-api-access-tlgfq" (OuterVolumeSpecName: "kube-api-access-tlgfq") pod "0451f8b5-c0e1-443e-83be-1cae97c6592b" (UID: "0451f8b5-c0e1-443e-83be-1cae97c6592b"). InnerVolumeSpecName "kube-api-access-tlgfq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:43:22 crc kubenswrapper[4702]: I1203 12:43:22.134353 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlgfq\" (UniqueName: \"kubernetes.io/projected/0451f8b5-c0e1-443e-83be-1cae97c6592b-kube-api-access-tlgfq\") on node \"crc\" DevicePath \"\""
Dec 03 12:43:22 crc kubenswrapper[4702]: I1203 12:43:22.753082 4702 scope.go:117] "RemoveContainer" containerID="7875d019b005520ed89a511fae2dda78f52ce372c0c041f6d574eb041c308b89"
Dec 03 12:43:22 crc kubenswrapper[4702]: I1203 12:43:22.753541 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pbxps/crc-debug-rqm4j"
Dec 03 12:43:22 crc kubenswrapper[4702]: I1203 12:43:22.956012 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0451f8b5-c0e1-443e-83be-1cae97c6592b" path="/var/lib/kubelet/pods/0451f8b5-c0e1-443e-83be-1cae97c6592b/volumes"
Dec 03 12:43:25 crc kubenswrapper[4702]: I1203 12:43:25.907950 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 12:43:25 crc kubenswrapper[4702]: I1203 12:43:25.908416 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 12:43:50 crc kubenswrapper[4702]: I1203 12:43:50.504436 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_da05b0ac-b62d-4496-bbf4-0aa969a4def4/aodh-api/0.log"
Dec 03 12:43:50 crc kubenswrapper[4702]: I1203 12:43:50.561490 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_da05b0ac-b62d-4496-bbf4-0aa969a4def4/aodh-evaluator/0.log"
Dec 03 12:43:50 crc kubenswrapper[4702]: I1203 12:43:50.736992 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_da05b0ac-b62d-4496-bbf4-0aa969a4def4/aodh-listener/0.log"
Dec 03 12:43:50 crc kubenswrapper[4702]: I1203 12:43:50.772667 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_da05b0ac-b62d-4496-bbf4-0aa969a4def4/aodh-notifier/0.log"
Dec 03 12:43:50 crc kubenswrapper[4702]: I1203 12:43:50.877843 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-76f68f8c78-w8n8j_36f29f89-b01b-4656-ba86-a2f731d0c1e0/barbican-api/0.log"
Dec 03 12:43:50 crc kubenswrapper[4702]: I1203 12:43:50.990943 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-76f68f8c78-w8n8j_36f29f89-b01b-4656-ba86-a2f731d0c1e0/barbican-api-log/0.log"
Dec 03 12:43:51 crc kubenswrapper[4702]: I1203 12:43:51.084071 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b9bd8bd96-9mqbl_8600900e-a4f2-484b-8e66-be0b81303777/barbican-keystone-listener/0.log"
Dec 03 12:43:51 crc kubenswrapper[4702]: I1203 12:43:51.168114 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5b9bd8bd96-9mqbl_8600900e-a4f2-484b-8e66-be0b81303777/barbican-keystone-listener-log/0.log"
Dec 03 12:43:51 crc kubenswrapper[4702]: I1203 12:43:51.259729 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-554bc66f45-ddpl4_53d4b133-1553-4265-9529-f7237cbe87e6/barbican-worker/0.log"
Dec 03 12:43:51 crc kubenswrapper[4702]: I1203 12:43:51.304041 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-554bc66f45-ddpl4_53d4b133-1553-4265-9529-f7237cbe87e6/barbican-worker-log/0.log"
Dec 03 12:43:51 crc kubenswrapper[4702]: I1203 12:43:51.509138 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-6584l_ab0384ac-759e-45a9-99c3-39206af6a0b8/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 12:43:51 crc kubenswrapper[4702]: I1203 12:43:51.558817 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_718cbf0b-d35e-4261-9cec-2ad3bc742998/ceilometer-central-agent/0.log"
Dec 03 12:43:51 crc kubenswrapper[4702]: I1203 12:43:51.702264 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_718cbf0b-d35e-4261-9cec-2ad3bc742998/ceilometer-notification-agent/0.log"
Dec 03 12:43:51 crc kubenswrapper[4702]: I1203 12:43:51.760749 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_718cbf0b-d35e-4261-9cec-2ad3bc742998/proxy-httpd/0.log"
Dec 03 12:43:51 crc kubenswrapper[4702]: I1203 12:43:51.762132 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_718cbf0b-d35e-4261-9cec-2ad3bc742998/sg-core/0.log"
Dec 03 12:43:52 crc kubenswrapper[4702]: I1203 12:43:52.064691 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_8e07dc54-499c-470d-9e1b-4775b3ec0ba6/cinder-api-log/0.log"
Dec 03 12:43:52 crc kubenswrapper[4702]: I1203 12:43:52.118741 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_8e07dc54-499c-470d-9e1b-4775b3ec0ba6/cinder-api/0.log"
Dec 03 12:43:52 crc kubenswrapper[4702]: I1203 12:43:52.140947 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_58148f49-2721-4a0a-a5e0-38a2aa23522b/cinder-scheduler/1.log"
Dec 03 12:43:53 crc kubenswrapper[4702]: I1203 12:43:53.133045 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_58148f49-2721-4a0a-a5e0-38a2aa23522b/cinder-scheduler/0.log"
Dec 03 12:43:53 crc kubenswrapper[4702]: I1203 12:43:53.155225 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_58148f49-2721-4a0a-a5e0-38a2aa23522b/probe/0.log"
Dec 03 12:43:53 crc kubenswrapper[4702]: I1203 12:43:53.731319 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-jxwlk_bae49358-6209-44ef-b28e-bd0b48fc617a/init/0.log"
Dec 03 12:43:53 crc kubenswrapper[4702]: I1203 12:43:53.737560 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-smqtz_961e241a-3dbc-4c96-afcd-6b768d0322db/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 12:43:53 crc kubenswrapper[4702]: I1203 12:43:53.743414 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-8m2dl_b236cac1-567b-4e23-9823-861d30c1793d/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 12:43:53 crc kubenswrapper[4702]: I1203 12:43:53.913286 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-jxwlk_bae49358-6209-44ef-b28e-bd0b48fc617a/init/0.log"
Dec 03 12:43:53 crc kubenswrapper[4702]: I1203 12:43:53.988245 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-jxwlk_bae49358-6209-44ef-b28e-bd0b48fc617a/dnsmasq-dns/0.log"
Dec 03 12:43:54 crc kubenswrapper[4702]: I1203 12:43:54.095876 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-9fsdr_e6a36b67-a7da-4684-8b3a-57735f2e4c8d/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 12:43:54 crc kubenswrapper[4702]: I1203 12:43:54.182256 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_eafb11cf-a4c9-4744-822d-6ccefe624f89/glance-httpd/0.log"
Dec 03 12:43:54 crc kubenswrapper[4702]: I1203 12:43:54.208053 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_eafb11cf-a4c9-4744-822d-6ccefe624f89/glance-log/0.log"
Dec 03 12:43:54 crc kubenswrapper[4702]: I1203 12:43:54.352635 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_381fd572-826d-4e69-ad36-f90b539f21ab/glance-log/0.log"
Dec 03 12:43:54 crc kubenswrapper[4702]: I1203 12:43:54.388008 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_381fd572-826d-4e69-ad36-f90b539f21ab/glance-httpd/0.log"
Dec 03 12:43:54 crc kubenswrapper[4702]: I1203 12:43:54.765441 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-79b68c69ff-kvztw_a30c0f33-d7bc-456c-be27-26e860ca8f28/heat-engine/0.log"
Dec 03 12:43:55 crc kubenswrapper[4702]: I1203 12:43:55.129625 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-6b7dd55484-jsksg_31eea135-16c2-46e7-860b-418c61ef127e/heat-api/0.log"
Dec 03 12:43:55 crc kubenswrapper[4702]: I1203 12:43:55.246511 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-56cbbf589-6mqq4_92e38875-6121-4c92-b21b-62a280aa8948/heat-cfnapi/0.log"
Dec 03 12:43:55 crc kubenswrapper[4702]: I1203 12:43:55.425682 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-njkvt_4064b583-7ee6-4ca3-9720-77129e43d3b9/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 12:43:55 crc kubenswrapper[4702]: I1203 12:43:55.476908 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-fchg5_6c2b0167-387a-48e4-9931-1869990ede5e/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 12:43:55 crc kubenswrapper[4702]: I1203 12:43:55.674698 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29412721-m9zdz_f925247e-4f37-4a2d-9873-0b68308d6e3c/keystone-cron/0.log"
Dec 03 12:43:55 crc kubenswrapper[4702]: I1203 12:43:55.787362 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_30059ea4-152f-420c-b8cc-234ebab96b47/kube-state-metrics/0.log"
Dec 03 12:43:55 crc kubenswrapper[4702]: I1203 12:43:55.907783 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 12:43:55 crc kubenswrapper[4702]: I1203 12:43:55.907854 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 12:43:55 crc kubenswrapper[4702]: I1203 12:43:55.992207 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-llqpf_240da10f-8cde-4000-a815-93bdeeb2af78/logging-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 12:43:56 crc kubenswrapper[4702]: I1203 12:43:56.042386 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-m6fk4_145c8ea5-5aaf-4017-8416-e346f9c95523/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 12:43:56 crc kubenswrapper[4702]: I1203 12:43:56.344869 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_941c2f85-794d-4361-942f-1d264fb98b7d/mysqld-exporter/0.log"
Dec 03 12:43:56 crc kubenswrapper[4702]: I1203 12:43:56.692070 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5fdfb45b77-sfz9f_6889403a-b787-4401-a235-0f8297e5844f/neutron-httpd/0.log"
Dec 03 12:43:56 crc kubenswrapper[4702]: I1203 12:43:56.840044 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5fdfb45b77-sfz9f_6889403a-b787-4401-a235-0f8297e5844f/neutron-api/0.log"
Dec 03 12:43:56 crc kubenswrapper[4702]: I1203 12:43:56.965712 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-29gxg_b7bb0467-80d8-4cb2-a515-94eba3b4acdc/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 12:43:58 crc kubenswrapper[4702]: I1203 12:43:58.213138 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_7455976c-e312-4b2a-963f-6e75d428c41c/nova-api-log/0.log"
Dec 03 12:43:58 crc kubenswrapper[4702]: I1203 12:43:58.699384 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_9f7dd418-9620-49e3-8eaf-6aa1d4a1434b/nova-cell0-conductor-conductor/0.log"
Dec 03 12:43:58 crc kubenswrapper[4702]: I1203 12:43:58.975403 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_7455976c-e312-4b2a-963f-6e75d428c41c/nova-api-api/0.log"
Dec 03 12:43:59 crc kubenswrapper[4702]: I1203 12:43:59.157083 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_b9d234e0-f641-485a-90aa-8440c5d00296/nova-cell1-conductor-conductor/0.log"
Dec 03 12:43:59 crc kubenswrapper[4702]: I1203 12:43:59.418107 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_ee5b7802-3239-45f9-9b0b-99348615d8bd/nova-cell1-novncproxy-novncproxy/0.log"
Dec 03 12:43:59 crc kubenswrapper[4702]: I1203 12:43:59.540348 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-lftxr_4332bde3-4d31-498c-8fd3-d1bc3d9e3794/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 12:43:59 crc kubenswrapper[4702]: I1203 12:43:59.778557 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_eef79cba-d523-4825-9524-26f53553b618/nova-metadata-log/0.log"
Dec 03 12:44:00 crc kubenswrapper[4702]: I1203 12:44:00.283217 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_810e51fd-c7d4-4d56-9baf-a1abcad8b348/nova-scheduler-scheduler/0.log"
Dec 03 12:44:00 crc kubenswrapper[4702]: I1203 12:44:00.414801 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c91e1dc8-ef80-407f-ac34-4c9ab29026f7/mysql-bootstrap/0.log"
Dec 03 12:44:00 crc kubenswrapper[4702]: I1203 12:44:00.632385 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c91e1dc8-ef80-407f-ac34-4c9ab29026f7/mysql-bootstrap/0.log"
Dec 03 12:44:00 crc kubenswrapper[4702]: I1203 12:44:00.691046 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c91e1dc8-ef80-407f-ac34-4c9ab29026f7/galera/1.log"
Dec 03 12:44:00 crc kubenswrapper[4702]: I1203 12:44:00.789210 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-694848776d-v64gd_b7119b1c-f3d7-44d8-9c5a-c8e8c4cfdfcd/keystone-api/0.log"
Dec 03 12:44:00 crc kubenswrapper[4702]: I1203 12:44:00.991110 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_c91e1dc8-ef80-407f-ac34-4c9ab29026f7/galera/0.log"
Dec 03 12:44:01 crc kubenswrapper[4702]: I1203 12:44:01.343855 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_92266ac3-f0a6-4e68-9e88-9aa2900e1fe3/mysql-bootstrap/0.log"
Dec 03 12:44:01 crc kubenswrapper[4702]: I1203 12:44:01.469457 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_92266ac3-f0a6-4e68-9e88-9aa2900e1fe3/mysql-bootstrap/0.log"
Dec 03 12:44:01 crc kubenswrapper[4702]: I1203 12:44:01.595044 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_92266ac3-f0a6-4e68-9e88-9aa2900e1fe3/galera/0.log"
Dec 03 12:44:01 crc kubenswrapper[4702]: I1203 12:44:01.603098 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_92266ac3-f0a6-4e68-9e88-9aa2900e1fe3/galera/1.log"
Dec 03 12:44:01 crc kubenswrapper[4702]: I1203 12:44:01.808501 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_1b30e6f6-6da6-48ea-8e02-873d566d7719/openstackclient/0.log"
Dec 03 12:44:02 crc kubenswrapper[4702]: I1203 12:44:02.030866 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-5v7lw_e77b1727-1835-42aa-a4f6-d902ff001d20/ovn-controller/0.log"
Dec 03 12:44:02 crc kubenswrapper[4702]: I1203 12:44:02.130492 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-rwfzx_59e4fe73-960d-4021-bbda-ce3ba11e72be/openstack-network-exporter/0.log"
Dec 03 12:44:02 crc kubenswrapper[4702]: I1203 12:44:02.357486 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_eef79cba-d523-4825-9524-26f53553b618/nova-metadata-metadata/0.log"
Dec 03 12:44:02 crc kubenswrapper[4702]: I1203 12:44:02.368182 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zqdp5_163cc47f-d241-4b3e-bf62-07f49047de5d/ovsdb-server-init/0.log"
Dec 03 12:44:02 crc kubenswrapper[4702]: I1203 12:44:02.541936 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zqdp5_163cc47f-d241-4b3e-bf62-07f49047de5d/ovsdb-server-init/0.log"
Dec 03 12:44:02 crc kubenswrapper[4702]: I1203 12:44:02.591720 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zqdp5_163cc47f-d241-4b3e-bf62-07f49047de5d/ovsdb-server/0.log"
Dec 03 12:44:02 crc kubenswrapper[4702]: I1203 12:44:02.624537 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-zqdp5_163cc47f-d241-4b3e-bf62-07f49047de5d/ovs-vswitchd/0.log"
Dec 03 12:44:02 crc kubenswrapper[4702]: I1203 12:44:02.847227 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-jwdsh_81b2b3f5-ec77-41d7-83e4-6f5a3fc25fc4/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 12:44:02 crc kubenswrapper[4702]: I1203 12:44:02.906484 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9/openstack-network-exporter/0.log"
Dec 03 12:44:02 crc kubenswrapper[4702]: I1203 12:44:02.985660 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9/ovn-northd/1.log"
Dec 03 12:44:03 crc kubenswrapper[4702]: I1203 12:44:03.164319 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6c6cf3a6-52d0-4e0f-96c3-ae7b3a47efd9/ovn-northd/0.log"
Dec 03 12:44:03 crc kubenswrapper[4702]: I1203 12:44:03.257258 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_406550ad-e61e-4af5-a42e-4e1437958f90/openstack-network-exporter/0.log"
Dec 03 12:44:03 crc kubenswrapper[4702]: I1203 12:44:03.266060 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_406550ad-e61e-4af5-a42e-4e1437958f90/ovsdbserver-nb/0.log"
Dec 03 12:44:03 crc kubenswrapper[4702]: I1203 12:44:03.526121 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bd806fd6-9deb-4a6d-8e73-e486e1b2cba7/openstack-network-exporter/0.log"
Dec 03 12:44:03 crc kubenswrapper[4702]: I1203 12:44:03.545424 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_bd806fd6-9deb-4a6d-8e73-e486e1b2cba7/ovsdbserver-sb/0.log"
Dec 03 12:44:03 crc kubenswrapper[4702]: I1203 12:44:03.830149 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-57f75d96b4-7bvsq_4785be1d-0e87-49ae-b5de-56bbab3b5eff/placement-api/0.log"
Dec 03 12:44:03 crc kubenswrapper[4702]: I1203 12:44:03.921090 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_90e9786f-3e0d-4a23-b624-b49a3d386784/init-config-reloader/0.log"
Dec 03 12:44:03 crc kubenswrapper[4702]: I1203 12:44:03.940700 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-57f75d96b4-7bvsq_4785be1d-0e87-49ae-b5de-56bbab3b5eff/placement-log/0.log"
Dec 03 12:44:04 crc kubenswrapper[4702]: I1203 12:44:04.110922 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_90e9786f-3e0d-4a23-b624-b49a3d386784/init-config-reloader/0.log"
Dec 03 12:44:04 crc kubenswrapper[4702]: I1203 12:44:04.169984 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_90e9786f-3e0d-4a23-b624-b49a3d386784/config-reloader/0.log"
Dec 03 12:44:04 crc kubenswrapper[4702]: I1203 12:44:04.191895 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_90e9786f-3e0d-4a23-b624-b49a3d386784/prometheus/1.log"
Dec 03 12:44:04 crc kubenswrapper[4702]: I1203 12:44:04.213223 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_90e9786f-3e0d-4a23-b624-b49a3d386784/prometheus/0.log"
Dec 03 12:44:04 crc kubenswrapper[4702]: I1203 12:44:04.358808 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_90e9786f-3e0d-4a23-b624-b49a3d386784/thanos-sidecar/0.log"
Dec 03 12:44:04 crc kubenswrapper[4702]: I1203 12:44:04.493507 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bda57dc3-3be8-4feb-a987-62c1412de0ad/setup-container/0.log"
Dec 03 12:44:04 crc kubenswrapper[4702]: I1203 12:44:04.773519 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bda57dc3-3be8-4feb-a987-62c1412de0ad/setup-container/0.log"
Dec 03 12:44:04 crc kubenswrapper[4702]: I1203 12:44:04.873095 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_33f03183-33e1-4aa1-8a4c-11f8b75297cd/setup-container/0.log"
Dec 03 12:44:04 crc kubenswrapper[4702]: I1203 12:44:04.890851 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bda57dc3-3be8-4feb-a987-62c1412de0ad/rabbitmq/0.log"
Dec 03 12:44:05 crc kubenswrapper[4702]: I1203 12:44:05.063127 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_33f03183-33e1-4aa1-8a4c-11f8b75297cd/setup-container/0.log"
Dec 03 12:44:05 crc kubenswrapper[4702]: I1203 12:44:05.089883 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-b4w25_62a60a41-35bc-45db-91c0-feb1ec993942/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 12:44:05 crc kubenswrapper[4702]: I1203 12:44:05.109744 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_33f03183-33e1-4aa1-8a4c-11f8b75297cd/rabbitmq/0.log"
Dec 03 12:44:05 crc kubenswrapper[4702]: I1203 12:44:05.448624 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-wjztq_90df2c2b-def7-4eda-896e-a551bfecb98c/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 12:44:05 crc kubenswrapper[4702]: I1203 12:44:05.510602 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-zvf7j_f52f9dbe-66ac-479c-b673-7fa2fbaccf71/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 12:44:05 crc kubenswrapper[4702]: I1203 12:44:05.674603 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-k9m29_a34690c3-fe9d-4dac-841c-07298c80c0e8/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 12:44:05 crc kubenswrapper[4702]: I1203 12:44:05.859017 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-l4zf4_4b86b10a-a947-4b55-b723-5542cd398eaf/ssh-known-hosts-edpm-deployment/0.log"
Dec 03 12:44:06 crc kubenswrapper[4702]: I1203 12:44:06.076288 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-868d57787f-bntsv_57179545-ef9c-460f-9ef6-219c895dc9fa/proxy-server/0.log"
Dec 03 12:44:06 crc kubenswrapper[4702]: I1203 12:44:06.178215 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-868d57787f-bntsv_57179545-ef9c-460f-9ef6-219c895dc9fa/proxy-httpd/0.log"
Dec 03 12:44:06 crc kubenswrapper[4702]: I1203 12:44:06.183190 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-whljp_dcc463a3-c5e5-443e-98d1-306cc779e62e/swift-ring-rebalance/0.log"
Dec 03 12:44:06 crc kubenswrapper[4702]: I1203 12:44:06.405974 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3892571c-86ff-4259-beaa-6033dcfda204/account-auditor/0.log"
Dec 03 12:44:06 crc kubenswrapper[4702]: I1203 12:44:06.438717 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3892571c-86ff-4259-beaa-6033dcfda204/account-reaper/0.log"
Dec 03 12:44:06 crc kubenswrapper[4702]: I1203 12:44:06.502786 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3892571c-86ff-4259-beaa-6033dcfda204/account-replicator/0.log"
Dec 03 12:44:06 crc kubenswrapper[4702]: I1203 12:44:06.655449 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3892571c-86ff-4259-beaa-6033dcfda204/account-server/0.log"
Dec 03 12:44:06 crc kubenswrapper[4702]: I1203 12:44:06.722714 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3892571c-86ff-4259-beaa-6033dcfda204/container-auditor/0.log"
Dec 03 12:44:06 crc kubenswrapper[4702]: I1203 12:44:06.726754 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3892571c-86ff-4259-beaa-6033dcfda204/container-server/0.log"
Dec 03 12:44:06 crc kubenswrapper[4702]: I1203 12:44:06.743468 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3892571c-86ff-4259-beaa-6033dcfda204/container-replicator/0.log"
Dec 03 12:44:06 crc kubenswrapper[4702]: I1203 12:44:06.854429 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3892571c-86ff-4259-beaa-6033dcfda204/container-updater/0.log"
Dec 03 12:44:07 crc kubenswrapper[4702]: I1203 12:44:07.853669 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3892571c-86ff-4259-beaa-6033dcfda204/object-expirer/0.log"
Dec 03 12:44:07 crc kubenswrapper[4702]: I1203 12:44:07.885673 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3892571c-86ff-4259-beaa-6033dcfda204/object-replicator/0.log"
Dec 03 12:44:07 crc kubenswrapper[4702]: I1203 12:44:07.925015 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3892571c-86ff-4259-beaa-6033dcfda204/object-auditor/0.log"
Dec 03 12:44:07 crc kubenswrapper[4702]: I1203 12:44:07.956579 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3892571c-86ff-4259-beaa-6033dcfda204/object-server/0.log"
Dec 03 12:44:08 crc kubenswrapper[4702]: I1203 12:44:08.116113 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3892571c-86ff-4259-beaa-6033dcfda204/object-updater/0.log"
Dec 03 12:44:08 crc kubenswrapper[4702]: I1203 12:44:08.126446 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3892571c-86ff-4259-beaa-6033dcfda204/rsync/0.log"
Dec 03 12:44:08 crc kubenswrapper[4702]: I1203 12:44:08.199665 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_3892571c-86ff-4259-beaa-6033dcfda204/swift-recon-cron/0.log"
Dec 03 12:44:08 crc kubenswrapper[4702]: I1203 12:44:08.422417 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-pwfbj_660e4478-3f02-41ec-9e7d-4cd6067ec6cf/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 12:44:08 crc kubenswrapper[4702]: I1203 12:44:08.513419 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-b68zm_87bf58e6-ad4c-4d7a-94e1-0c1e2715a386/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 12:44:08 crc kubenswrapper[4702]: I1203 12:44:08.830322 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_73a3c36b-a281-4c22-a827-8aa59d607739/test-operator-logs-container/0.log"
Dec 03 12:44:09 crc kubenswrapper[4702]: I1203 12:44:09.046170 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-xhnxs_1cb3e758-d316-42dd-97a7-c2fe38a57158/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 03 12:44:09 crc kubenswrapper[4702]: I1203 12:44:09.274555 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_a35fd719-e341-49b9-b12f-f39f2402868b/tempest-tests-tempest-tests-runner/0.log"
Dec 03 12:44:23 crc kubenswrapper[4702]: I1203 12:44:23.067135 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_8ea851b4-124d-4472-9fd0-7b584da44ecc/memcached/0.log"
Dec 03 12:44:25 crc kubenswrapper[4702]: I1203 12:44:25.908250 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 12:44:25 crc kubenswrapper[4702]: I1203 12:44:25.909604 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 12:44:25 crc kubenswrapper[4702]: I1203 12:44:25.909951 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd"
Dec 03 12:44:25 crc kubenswrapper[4702]: I1203 12:44:25.911062 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a"} pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 03 12:44:25 crc kubenswrapper[4702]: I1203 12:44:25.911211 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" containerID="cri-o://25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" gracePeriod=600
Dec 03 12:44:26 crc kubenswrapper[4702]: E1203 12:44:26.062570 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
Dec 03 12:44:26 crc kubenswrapper[4702]: I1203 12:44:26.669844 4702 generic.go:334] "Generic (PLEG): container finished" podID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" exitCode=0
Dec 03 12:44:26 crc kubenswrapper[4702]: I1203 12:44:26.669895 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerDied","Data":"25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a"}
Dec 03 12:44:26 crc kubenswrapper[4702]: I1203 12:44:26.669939 4702 scope.go:117] "RemoveContainer" containerID="030d6a62e307050ecc85543445f98f0ffef93a52b32474be21649714e5532e7a"
Dec 03 12:44:26 crc kubenswrapper[4702]: I1203 12:44:26.670936 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a"
Dec 03 12:44:26 crc kubenswrapper[4702]: E1203 12:44:26.671506 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
Dec 03 12:44:40 crc kubenswrapper[4702]: I1203 12:44:40.928924 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a"
Dec 03 12:44:40 crc kubenswrapper[4702]: E1203 12:44:40.929731 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
Dec 03 12:44:44 crc kubenswrapper[4702]: I1203 12:44:44.646499 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9_b452d48a-0cf2-4958-8c23-32ed4d808c7b/util/0.log"
Dec 03 12:44:44 crc kubenswrapper[4702]: I1203 12:44:44.850309 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9_b452d48a-0cf2-4958-8c23-32ed4d808c7b/util/0.log"
Dec 03 12:44:44 crc kubenswrapper[4702]: I1203 12:44:44.902336 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9_b452d48a-0cf2-4958-8c23-32ed4d808c7b/pull/0.log"
Dec 03 12:44:44 crc kubenswrapper[4702]: I1203 12:44:44.910374 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9_b452d48a-0cf2-4958-8c23-32ed4d808c7b/pull/0.log"
Dec 03 12:44:45 crc kubenswrapper[4702]: I1203 12:44:45.184227 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9_b452d48a-0cf2-4958-8c23-32ed4d808c7b/util/0.log"
Dec 03 12:44:45 crc kubenswrapper[4702]: I1203 12:44:45.236392 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9_b452d48a-0cf2-4958-8c23-32ed4d808c7b/extract/0.log"
Dec 03 12:44:45 crc kubenswrapper[4702]: I1203 12:44:45.248897 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_389cdc0f67cc74332d7e56ab6f2a768988fde617365ff87a48ceeb17d7pqhp9_b452d48a-0cf2-4958-8c23-32ed4d808c7b/pull/0.log"
Dec 03 12:44:45 crc kubenswrapper[4702]: I1203 12:44:45.398303 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-m5trg_8f6320ff-4661-46be-80e1-8d97f09fe789/kube-rbac-proxy/0.log"
Dec 03 12:44:45 crc kubenswrapper[4702]: I1203 12:44:45.571812 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-m5trg_8f6320ff-4661-46be-80e1-8d97f09fe789/manager/1.log"
Dec 03 12:44:45 crc kubenswrapper[4702]: I1203 12:44:45.596080 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-m5trg_8f6320ff-4661-46be-80e1-8d97f09fe789/manager/0.log"
Dec 03 12:44:45 crc kubenswrapper[4702]: I1203 12:44:45.667740 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-w2vmt_182ca1cb-9499-4cf7-aeae-c35c7038814c/kube-rbac-proxy/0.log"
Dec 03 12:44:45 crc kubenswrapper[4702]: I1203 12:44:45.847039 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-w2vmt_182ca1cb-9499-4cf7-aeae-c35c7038814c/manager/1.log"
Dec 03 12:44:45 crc kubenswrapper[4702]: I1203 12:44:45.941232 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-w2vmt_182ca1cb-9499-4cf7-aeae-c35c7038814c/manager/0.log"
Dec 03 12:44:45 crc kubenswrapper[4702]: I1203 12:44:45.960458 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-htxmz_530ef793-9485-4c45-86ba-531906f2085a/kube-rbac-proxy/0.log"
Dec 03 12:44:46 crc kubenswrapper[4702]: I1203 12:44:46.148862 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-htxmz_530ef793-9485-4c45-86ba-531906f2085a/manager/1.log"
Dec 03 12:44:46 crc kubenswrapper[4702]: I1203 12:44:46.154740 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-htxmz_530ef793-9485-4c45-86ba-531906f2085a/manager/0.log"
Dec 03 12:44:46 crc kubenswrapper[4702]: I1203 12:44:46.307277 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-lp88c_62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d/kube-rbac-proxy/0.log"
Dec 03 12:44:46 crc kubenswrapper[4702]: I1203 12:44:46.524306 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-lp88c_62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d/manager/0.log"
Dec 03 12:44:46 crc kubenswrapper[4702]: I1203 12:44:46.527058 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-lp88c_62e110bb-ccd1-4cc8-8255-af2e6f0b3a6d/manager/1.log"
Dec 03 12:44:46 crc kubenswrapper[4702]: I1203 12:44:46.607772 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-4pkkr_224e5de0-3f58-4243-80e5-212cf016ea46/kube-rbac-proxy/0.log"
Dec 03 12:44:46 crc kubenswrapper[4702]: I1203 12:44:46.874277 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-4pkkr_224e5de0-3f58-4243-80e5-212cf016ea46/manager/1.log"
Dec 03 12:44:46 crc kubenswrapper[4702]: I1203 12:44:46.907806 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-kg6p7_4b90477f-d1b5-4f03-ab08-2476d44a9cff/kube-rbac-proxy/0.log"
Dec 03 12:44:46 crc kubenswrapper[4702]: I1203 12:44:46.945421 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-4pkkr_224e5de0-3f58-4243-80e5-212cf016ea46/manager/0.log"
Dec 03 12:44:47 crc kubenswrapper[4702]: I1203 12:44:47.134253 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-kg6p7_4b90477f-d1b5-4f03-ab08-2476d44a9cff/manager/0.log"
Dec 03 12:44:47 crc kubenswrapper[4702]: I1203 12:44:47.163671 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-kg6p7_4b90477f-d1b5-4f03-ab08-2476d44a9cff/manager/1.log"
Dec 03 12:44:47 crc kubenswrapper[4702]: I1203 12:44:47.236987 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-98mxd_a7faac4b-b558-4106-af27-4daf6a1db1af/kube-rbac-proxy/0.log"
Dec 03 12:44:47 crc kubenswrapper[4702]: I1203 12:44:47.445265 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-98mxd_a7faac4b-b558-4106-af27-4daf6a1db1af/manager/1.log"
Dec 03 12:44:47 crc kubenswrapper[4702]: I1203 12:44:47.742967 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-gqqgw_1a7e4f08-8a48-44d5-944b-4eaf9d9518b5/manager/1.log"
Dec 03 12:44:47 crc kubenswrapper[4702]: I1203 12:44:47.770867 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-gqqgw_1a7e4f08-8a48-44d5-944b-4eaf9d9518b5/kube-rbac-proxy/0.log"
Dec 03 12:44:47 crc kubenswrapper[4702]: I1203 12:44:47.799787 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-gqqgw_1a7e4f08-8a48-44d5-944b-4eaf9d9518b5/manager/0.log"
Dec 03 12:44:48 crc kubenswrapper[4702]: I1203 12:44:48.995108 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-hpf6t_9b295e92-630f-4544-b741-50ece5e79f4c/kube-rbac-proxy/0.log"
Dec 03 12:44:49 crc kubenswrapper[4702]: I1203 12:44:49.187566 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-hpf6t_9b295e92-630f-4544-b741-50ece5e79f4c/manager/1.log"
Dec 03 12:44:49 crc kubenswrapper[4702]: I1203 12:44:49.242731 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-98mxd_a7faac4b-b558-4106-af27-4daf6a1db1af/manager/0.log"
Dec 03 12:44:49 crc kubenswrapper[4702]: I1203 12:44:49.285528 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-hpf6t_9b295e92-630f-4544-b741-50ece5e79f4c/manager/0.log"
Dec 03 12:44:49 crc kubenswrapper[4702]: I1203 12:44:49.633851 4702 log.go:25] "Finished
parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-hxvr6_e3c1b694-60b8-4b5d-b8d5-40418e60aa4b/kube-rbac-proxy/0.log" Dec 03 12:44:49 crc kubenswrapper[4702]: I1203 12:44:49.804024 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-2pcqv_5cecb29f-7ef9-4177-8e01-a776b70bbb03/kube-rbac-proxy/0.log" Dec 03 12:44:49 crc kubenswrapper[4702]: I1203 12:44:49.805079 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-hxvr6_e3c1b694-60b8-4b5d-b8d5-40418e60aa4b/manager/0.log" Dec 03 12:44:49 crc kubenswrapper[4702]: I1203 12:44:49.960713 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-2pcqv_5cecb29f-7ef9-4177-8e01-a776b70bbb03/manager/1.log" Dec 03 12:44:50 crc kubenswrapper[4702]: I1203 12:44:50.058718 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-2pcqv_5cecb29f-7ef9-4177-8e01-a776b70bbb03/manager/0.log" Dec 03 12:44:50 crc kubenswrapper[4702]: I1203 12:44:50.140520 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-vz7gf_84fc908a-9418-4e6e-ac17-9e725524f9ce/kube-rbac-proxy/0.log" Dec 03 12:44:50 crc kubenswrapper[4702]: I1203 12:44:50.193293 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-vz7gf_84fc908a-9418-4e6e-ac17-9e725524f9ce/manager/1.log" Dec 03 12:44:50 crc kubenswrapper[4702]: I1203 12:44:50.276276 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-vz7gf_84fc908a-9418-4e6e-ac17-9e725524f9ce/manager/0.log" Dec 03 12:44:50 crc kubenswrapper[4702]: I1203 12:44:50.405960 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-m2bfb_5e7b4134-2b34-4b36-ad61-8e681df197df/kube-rbac-proxy/0.log" Dec 03 12:44:50 crc kubenswrapper[4702]: I1203 12:44:50.441013 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-m2bfb_5e7b4134-2b34-4b36-ad61-8e681df197df/manager/1.log" Dec 03 12:44:50 crc kubenswrapper[4702]: I1203 12:44:50.527616 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-m2bfb_5e7b4134-2b34-4b36-ad61-8e681df197df/manager/0.log" Dec 03 12:44:50 crc kubenswrapper[4702]: I1203 12:44:50.623312 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-7xg4t_523c06cc-9816-4252-ac00-dc7928dae009/kube-rbac-proxy/0.log" Dec 03 12:44:50 crc kubenswrapper[4702]: I1203 12:44:50.682584 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-7xg4t_523c06cc-9816-4252-ac00-dc7928dae009/manager/1.log" Dec 03 12:44:50 crc kubenswrapper[4702]: I1203 12:44:50.735616 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-7xg4t_523c06cc-9816-4252-ac00-dc7928dae009/manager/0.log" Dec 03 12:44:50 crc kubenswrapper[4702]: I1203 12:44:50.779973 4702 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9_1d86df9d-86a7-4980-abd0-488d98f6b2fb/kube-rbac-proxy/0.log" Dec 03 12:44:50 crc kubenswrapper[4702]: I1203 12:44:50.918578 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9_1d86df9d-86a7-4980-abd0-488d98f6b2fb/manager/1.log" Dec 03 12:44:50 crc kubenswrapper[4702]: I1203 12:44:50.964227 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4jtcw9_1d86df9d-86a7-4980-abd0-488d98f6b2fb/manager/0.log" Dec 03 12:44:51 crc kubenswrapper[4702]: I1203 12:44:51.221693 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-c98f8bd-8mv9c_b877c7a7-0b88-4238-8a21-314ef1525996/manager/1.log" Dec 03 12:44:51 crc kubenswrapper[4702]: I1203 12:44:51.336649 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-75b4565ff4-4pl92_7d79c3e6-cab8-4e77-84da-85ae2a1eb3b3/operator/1.log" Dec 03 12:44:51 crc kubenswrapper[4702]: I1203 12:44:51.589148 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-75b4565ff4-4pl92_7d79c3e6-cab8-4e77-84da-85ae2a1eb3b3/operator/0.log" Dec 03 12:44:51 crc kubenswrapper[4702]: I1203 12:44:51.641860 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-ds4ss_9432a2a8-8932-4734-a69d-8976764f1dab/registry-server/1.log" Dec 03 12:44:51 crc kubenswrapper[4702]: I1203 12:44:51.663647 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-ds4ss_9432a2a8-8932-4734-a69d-8976764f1dab/registry-server/0.log" Dec 03 12:44:51 crc kubenswrapper[4702]: I1203 12:44:51.925444 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-t27c4_1d60d4ab-7bac-4fd1-9aad-c07ba1513d41/kube-rbac-proxy/0.log" Dec 03 12:44:51 crc kubenswrapper[4702]: I1203 12:44:51.938039 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-t27c4_1d60d4ab-7bac-4fd1-9aad-c07ba1513d41/manager/1.log" Dec 03 12:44:52 crc kubenswrapper[4702]: I1203 12:44:52.033719 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-t27c4_1d60d4ab-7bac-4fd1-9aad-c07ba1513d41/manager/0.log" Dec 03 12:44:52 crc kubenswrapper[4702]: I1203 12:44:52.202227 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-ntzds_8de75640-5551-4d04-830d-64f0fbb7847a/kube-rbac-proxy/0.log" Dec 03 12:44:52 crc kubenswrapper[4702]: I1203 12:44:52.220291 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-ntzds_8de75640-5551-4d04-830d-64f0fbb7847a/manager/1.log" Dec 03 12:44:52 crc kubenswrapper[4702]: I1203 12:44:52.358659 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-ntzds_8de75640-5551-4d04-830d-64f0fbb7847a/manager/0.log" Dec 03 12:44:52 crc kubenswrapper[4702]: I1203 12:44:52.533807 4702 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-ckjgv_c43c86a0-692f-406f-871a-24a14f24ed77/operator/1.log" Dec 03 12:44:52 crc kubenswrapper[4702]: I1203 12:44:52.591973 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-ckjgv_c43c86a0-692f-406f-871a-24a14f24ed77/operator/0.log" Dec 03 12:44:52 crc kubenswrapper[4702]: I1203 12:44:52.714472 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-c98f8bd-8mv9c_b877c7a7-0b88-4238-8a21-314ef1525996/manager/0.log" Dec 03 12:44:52 crc kubenswrapper[4702]: I1203 12:44:52.736869 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-psnhp_b6faaca6-f017-42ac-95e4-d73ae3e8e519/kube-rbac-proxy/0.log" Dec 03 12:44:52 crc kubenswrapper[4702]: I1203 12:44:52.766168 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-psnhp_b6faaca6-f017-42ac-95e4-d73ae3e8e519/manager/1.log" Dec 03 12:44:52 crc kubenswrapper[4702]: I1203 12:44:52.912832 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-psnhp_b6faaca6-f017-42ac-95e4-d73ae3e8e519/manager/0.log" Dec 03 12:44:52 crc kubenswrapper[4702]: I1203 12:44:52.928269 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" Dec 03 12:44:52 crc kubenswrapper[4702]: E1203 12:44:52.928618 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:44:53 crc kubenswrapper[4702]: I1203 12:44:53.016502 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-f8bdcbf7f-4tp6n_5edf270b-74cb-42d2-82dc-7953f243c6dc/kube-rbac-proxy/0.log" Dec 03 12:44:53 crc kubenswrapper[4702]: I1203 12:44:53.044548 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-f8bdcbf7f-4tp6n_5edf270b-74cb-42d2-82dc-7953f243c6dc/manager/1.log" Dec 03 12:44:53 crc kubenswrapper[4702]: I1203 12:44:53.242280 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-nj4tn_afc37ae6-c944-4cb1-81b6-c810ea1c3b31/manager/0.log" Dec 03 12:44:53 crc kubenswrapper[4702]: I1203 12:44:53.265506 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-nj4tn_afc37ae6-c944-4cb1-81b6-c810ea1c3b31/kube-rbac-proxy/0.log" Dec 03 12:44:53 crc kubenswrapper[4702]: I1203 12:44:53.277158 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-nj4tn_afc37ae6-c944-4cb1-81b6-c810ea1c3b31/manager/1.log" Dec 03 12:44:53 crc kubenswrapper[4702]: I1203 12:44:53.377821 4702 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-f8bdcbf7f-4tp6n_5edf270b-74cb-42d2-82dc-7953f243c6dc/manager/0.log" Dec 03 12:44:54 crc kubenswrapper[4702]: I1203 12:44:54.447438 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-xlpkq_ae6dac10-29ba-4bb8-8a0c-68a2bad519af/kube-rbac-proxy/0.log" Dec 03 12:44:54 crc kubenswrapper[4702]: I1203 12:44:54.498329 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-xlpkq_ae6dac10-29ba-4bb8-8a0c-68a2bad519af/manager/1.log" Dec 03 12:44:54 crc kubenswrapper[4702]: I1203 12:44:54.566676 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-xlpkq_ae6dac10-29ba-4bb8-8a0c-68a2bad519af/manager/0.log" Dec 03 12:45:00 crc kubenswrapper[4702]: I1203 12:45:00.227595 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412765-nwl8g"] Dec 03 12:45:00 crc kubenswrapper[4702]: E1203 12:45:00.228736 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0451f8b5-c0e1-443e-83be-1cae97c6592b" containerName="container-00" Dec 03 12:45:00 crc kubenswrapper[4702]: I1203 12:45:00.228776 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="0451f8b5-c0e1-443e-83be-1cae97c6592b" containerName="container-00" Dec 03 12:45:00 crc kubenswrapper[4702]: I1203 12:45:00.229070 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="0451f8b5-c0e1-443e-83be-1cae97c6592b" containerName="container-00" Dec 03 12:45:00 crc kubenswrapper[4702]: I1203 12:45:00.229965 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-nwl8g" Dec 03 12:45:00 crc kubenswrapper[4702]: I1203 12:45:00.236810 4702 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 12:45:00 crc kubenswrapper[4702]: I1203 12:45:00.236817 4702 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 12:45:00 crc kubenswrapper[4702]: I1203 12:45:00.243632 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412765-nwl8g"] Dec 03 12:45:00 crc kubenswrapper[4702]: I1203 12:45:00.369992 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7b8a6a1-a660-4c51-9a57-80a11da3a98e-secret-volume\") pod \"collect-profiles-29412765-nwl8g\" (UID: \"e7b8a6a1-a660-4c51-9a57-80a11da3a98e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-nwl8g" Dec 03 12:45:00 crc kubenswrapper[4702]: I1203 12:45:00.370064 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn9dw\" (UniqueName: \"kubernetes.io/projected/e7b8a6a1-a660-4c51-9a57-80a11da3a98e-kube-api-access-wn9dw\") pod \"collect-profiles-29412765-nwl8g\" (UID: \"e7b8a6a1-a660-4c51-9a57-80a11da3a98e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-nwl8g" Dec 03 12:45:00 crc kubenswrapper[4702]: I1203 12:45:00.370291 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7b8a6a1-a660-4c51-9a57-80a11da3a98e-config-volume\") pod \"collect-profiles-29412765-nwl8g\" (UID: \"e7b8a6a1-a660-4c51-9a57-80a11da3a98e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-nwl8g" Dec 03 12:45:00 crc kubenswrapper[4702]: I1203 12:45:00.473552 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7b8a6a1-a660-4c51-9a57-80a11da3a98e-secret-volume\") pod \"collect-profiles-29412765-nwl8g\" (UID: \"e7b8a6a1-a660-4c51-9a57-80a11da3a98e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-nwl8g" Dec 03 12:45:00 crc kubenswrapper[4702]: I1203 12:45:00.473620 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn9dw\" (UniqueName: \"kubernetes.io/projected/e7b8a6a1-a660-4c51-9a57-80a11da3a98e-kube-api-access-wn9dw\") pod \"collect-profiles-29412765-nwl8g\" (UID: \"e7b8a6a1-a660-4c51-9a57-80a11da3a98e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-nwl8g" Dec 03 12:45:00 crc kubenswrapper[4702]: I1203 12:45:00.473657 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7b8a6a1-a660-4c51-9a57-80a11da3a98e-config-volume\") pod \"collect-profiles-29412765-nwl8g\" (UID: \"e7b8a6a1-a660-4c51-9a57-80a11da3a98e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-nwl8g" Dec 03 12:45:00 crc kubenswrapper[4702]: I1203 12:45:00.474505 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7b8a6a1-a660-4c51-9a57-80a11da3a98e-config-volume\") pod 
\"collect-profiles-29412765-nwl8g\" (UID: \"e7b8a6a1-a660-4c51-9a57-80a11da3a98e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-nwl8g" Dec 03 12:45:00 crc kubenswrapper[4702]: I1203 12:45:00.482151 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7b8a6a1-a660-4c51-9a57-80a11da3a98e-secret-volume\") pod \"collect-profiles-29412765-nwl8g\" (UID: \"e7b8a6a1-a660-4c51-9a57-80a11da3a98e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-nwl8g" Dec 03 12:45:00 crc kubenswrapper[4702]: I1203 12:45:00.497205 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn9dw\" (UniqueName: \"kubernetes.io/projected/e7b8a6a1-a660-4c51-9a57-80a11da3a98e-kube-api-access-wn9dw\") pod \"collect-profiles-29412765-nwl8g\" (UID: \"e7b8a6a1-a660-4c51-9a57-80a11da3a98e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-nwl8g" Dec 03 12:45:00 crc kubenswrapper[4702]: I1203 12:45:00.556191 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-nwl8g" Dec 03 12:45:01 crc kubenswrapper[4702]: I1203 12:45:01.313126 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412765-nwl8g"] Dec 03 12:45:02 crc kubenswrapper[4702]: I1203 12:45:02.158654 4702 generic.go:334] "Generic (PLEG): container finished" podID="e7b8a6a1-a660-4c51-9a57-80a11da3a98e" containerID="b53b0086cf3971ef2417c7d141eb8ca0e5f6ec24cd80109ff5d341c1994698e6" exitCode=0 Dec 03 12:45:02 crc kubenswrapper[4702]: I1203 12:45:02.159333 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-nwl8g" event={"ID":"e7b8a6a1-a660-4c51-9a57-80a11da3a98e","Type":"ContainerDied","Data":"b53b0086cf3971ef2417c7d141eb8ca0e5f6ec24cd80109ff5d341c1994698e6"} Dec 03 12:45:02 crc kubenswrapper[4702]: I1203 12:45:02.159375 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-nwl8g" event={"ID":"e7b8a6a1-a660-4c51-9a57-80a11da3a98e","Type":"ContainerStarted","Data":"885256dea2918201f5e3b5527e612792109b647051350038642452e0a279ea82"} Dec 03 12:45:03 crc kubenswrapper[4702]: I1203 12:45:03.645960 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-nwl8g" Dec 03 12:45:03 crc kubenswrapper[4702]: I1203 12:45:03.779887 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7b8a6a1-a660-4c51-9a57-80a11da3a98e-secret-volume\") pod \"e7b8a6a1-a660-4c51-9a57-80a11da3a98e\" (UID: \"e7b8a6a1-a660-4c51-9a57-80a11da3a98e\") " Dec 03 12:45:03 crc kubenswrapper[4702]: I1203 12:45:03.780209 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7b8a6a1-a660-4c51-9a57-80a11da3a98e-config-volume\") pod \"e7b8a6a1-a660-4c51-9a57-80a11da3a98e\" (UID: \"e7b8a6a1-a660-4c51-9a57-80a11da3a98e\") " Dec 03 12:45:03 crc kubenswrapper[4702]: I1203 12:45:03.780276 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn9dw\" (UniqueName: \"kubernetes.io/projected/e7b8a6a1-a660-4c51-9a57-80a11da3a98e-kube-api-access-wn9dw\") pod \"e7b8a6a1-a660-4c51-9a57-80a11da3a98e\" (UID: \"e7b8a6a1-a660-4c51-9a57-80a11da3a98e\") " Dec 03 12:45:03 crc kubenswrapper[4702]: I1203 12:45:03.780669 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7b8a6a1-a660-4c51-9a57-80a11da3a98e-config-volume" (OuterVolumeSpecName: "config-volume") pod "e7b8a6a1-a660-4c51-9a57-80a11da3a98e" (UID: "e7b8a6a1-a660-4c51-9a57-80a11da3a98e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 12:45:03 crc kubenswrapper[4702]: I1203 12:45:03.781644 4702 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7b8a6a1-a660-4c51-9a57-80a11da3a98e-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:03 crc kubenswrapper[4702]: I1203 12:45:03.788924 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7b8a6a1-a660-4c51-9a57-80a11da3a98e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e7b8a6a1-a660-4c51-9a57-80a11da3a98e" (UID: "e7b8a6a1-a660-4c51-9a57-80a11da3a98e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 12:45:03 crc kubenswrapper[4702]: I1203 12:45:03.789014 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7b8a6a1-a660-4c51-9a57-80a11da3a98e-kube-api-access-wn9dw" (OuterVolumeSpecName: "kube-api-access-wn9dw") pod "e7b8a6a1-a660-4c51-9a57-80a11da3a98e" (UID: "e7b8a6a1-a660-4c51-9a57-80a11da3a98e"). InnerVolumeSpecName "kube-api-access-wn9dw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:45:03 crc kubenswrapper[4702]: I1203 12:45:03.883813 4702 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7b8a6a1-a660-4c51-9a57-80a11da3a98e-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:03 crc kubenswrapper[4702]: I1203 12:45:03.883856 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn9dw\" (UniqueName: \"kubernetes.io/projected/e7b8a6a1-a660-4c51-9a57-80a11da3a98e-kube-api-access-wn9dw\") on node \"crc\" DevicePath \"\"" Dec 03 12:45:04 crc kubenswrapper[4702]: I1203 12:45:04.188635 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-nwl8g" event={"ID":"e7b8a6a1-a660-4c51-9a57-80a11da3a98e","Type":"ContainerDied","Data":"885256dea2918201f5e3b5527e612792109b647051350038642452e0a279ea82"} Dec 03 12:45:04 crc kubenswrapper[4702]: I1203 12:45:04.189065 4702 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="885256dea2918201f5e3b5527e612792109b647051350038642452e0a279ea82" Dec 03 12:45:04 crc kubenswrapper[4702]: I1203 12:45:04.188743 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412765-nwl8g" Dec 03 12:45:04 crc kubenswrapper[4702]: I1203 12:45:04.973912 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412720-sbcgx"] Dec 03 12:45:04 crc kubenswrapper[4702]: I1203 12:45:04.988511 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412720-sbcgx"] Dec 03 12:45:07 crc kubenswrapper[4702]: I1203 12:45:07.281437 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df96ec35-4a73-475a-b3b6-7e08dbfddc4d" path="/var/lib/kubelet/pods/df96ec35-4a73-475a-b3b6-7e08dbfddc4d/volumes" Dec 03 12:45:07 crc kubenswrapper[4702]: I1203 12:45:07.928732 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" Dec 03 12:45:07 crc kubenswrapper[4702]: E1203 12:45:07.929377 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:45:19 crc kubenswrapper[4702]: I1203 12:45:19.543179 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-zthg4_0bce3a1f-cd2e-41b9-b768-322f3ce72ed9/control-plane-machine-set-operator/0.log" Dec 03 12:45:19 crc kubenswrapper[4702]: I1203 12:45:19.714396 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-gzc7n_b8e73047-6376-4bd9-8ec8-5966f8786e5d/kube-rbac-proxy/0.log" Dec 03 12:45:19 crc kubenswrapper[4702]: I1203 12:45:19.782781 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-gzc7n_b8e73047-6376-4bd9-8ec8-5966f8786e5d/machine-api-operator/0.log" Dec 03 12:45:22 crc kubenswrapper[4702]: I1203 12:45:22.929830 4702 scope.go:117] 
"RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" Dec 03 12:45:22 crc kubenswrapper[4702]: E1203 12:45:22.932194 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:45:32 crc kubenswrapper[4702]: I1203 12:45:32.735366 4702 scope.go:117] "RemoveContainer" containerID="7f02b036b8ef2abf40c6c4b52e1f52a4696891c937740d00ca0f2e23f7cfbb8a" Dec 03 12:45:35 crc kubenswrapper[4702]: I1203 12:45:35.671537 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-kt9ll_c3476e51-0ccc-41ee-8d43-4bf8b59a6bbe/cert-manager-controller/0.log" Dec 03 12:45:35 crc kubenswrapper[4702]: I1203 12:45:35.814721 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-f9zmn_d966c899-cb05-41ea-b10b-820da56925f6/cert-manager-cainjector/0.log" Dec 03 12:45:35 crc kubenswrapper[4702]: I1203 12:45:35.873986 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-l8p84_40ccf765-6eb2-49e3-8f2c-635b1981639e/cert-manager-webhook/0.log" Dec 03 12:45:37 crc kubenswrapper[4702]: I1203 12:45:37.928390 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" Dec 03 12:45:37 crc kubenswrapper[4702]: E1203 12:45:37.929090 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:45:48 crc kubenswrapper[4702]: I1203 12:45:48.929100 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" Dec 03 12:45:48 crc kubenswrapper[4702]: E1203 12:45:48.930085 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:45:53 crc kubenswrapper[4702]: I1203 12:45:53.084036 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-dmrtz_908a9238-0a36-40c1-a7c0-c0c0789f29ae/nmstate-console-plugin/0.log" Dec 03 12:45:53 crc kubenswrapper[4702]: I1203 12:45:53.278199 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-zxwqw_9ec2138c-eb31-401f-b62d-d2823fe0523f/nmstate-handler/0.log" Dec 03 12:45:53 crc kubenswrapper[4702]: I1203 12:45:53.323700 4702 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-6jgh5_cf40bd24-301e-4eb1-bbdb-84a55cd53cc9/kube-rbac-proxy/0.log" Dec 03 12:45:53 crc kubenswrapper[4702]: I1203 12:45:53.395334 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-6jgh5_cf40bd24-301e-4eb1-bbdb-84a55cd53cc9/nmstate-metrics/0.log" Dec 03 12:45:53 crc kubenswrapper[4702]: I1203 12:45:53.524992 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-pc794_d1a5b826-9b8e-4400-8e6e-06824af9bd4c/nmstate-operator/0.log" Dec 03 12:45:53 crc kubenswrapper[4702]: I1203 12:45:53.695060 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-r6jd6_9370f81f-7868-4a16-9cec-7786257cdcbd/nmstate-webhook/0.log" Dec 03 12:46:02 crc kubenswrapper[4702]: I1203 12:46:02.928117 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" Dec 03 12:46:02 crc kubenswrapper[4702]: E1203 12:46:02.928915 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:46:10 crc kubenswrapper[4702]: I1203 12:46:10.045032 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5688675f7c-q6w79_672e4a37-26c7-4378-a524-57fba88aec53/kube-rbac-proxy/0.log" Dec 03 12:46:10 crc kubenswrapper[4702]: I1203 12:46:10.053648 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5688675f7c-q6w79_672e4a37-26c7-4378-a524-57fba88aec53/manager/1.log" Dec 03 12:46:10 crc kubenswrapper[4702]: I1203 12:46:10.326672 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5688675f7c-q6w79_672e4a37-26c7-4378-a524-57fba88aec53/manager/0.log" Dec 03 12:46:16 crc kubenswrapper[4702]: I1203 12:46:16.937377 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" Dec 03 12:46:16 crc kubenswrapper[4702]: E1203 12:46:16.938368 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:46:26 crc kubenswrapper[4702]: I1203 12:46:26.193915 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-ff9846bd-g4d4w_7f8d827c-68d7-4bd7-9934-6cdcd1b7059e/cluster-logging-operator/0.log" Dec 03 12:46:26 crc kubenswrapper[4702]: I1203 12:46:26.445127 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-nlphm_cc184786-2931-4cb8-a185-b4f1fb2bcb40/collector/0.log" Dec 03 12:46:26 crc kubenswrapper[4702]: I1203 12:46:26.492715 4702 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-compactor-0_358cd791-ccf7-4655-b446-b800598e773c/loki-compactor/0.log" Dec 03 12:46:26 crc kubenswrapper[4702]: I1203 12:46:26.698711 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-76dff8487c-68ktn_6cbbba51-9166-42cb-917c-7c634351e5c9/gateway/0.log" Dec 03 12:46:26 crc kubenswrapper[4702]: I1203 12:46:26.708349 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-76cc67bf56-xrnp2_508c1eef-dbbc-4c32-8d2e-dbb797c72461/loki-distributor/0.log" Dec 03 12:46:26 crc kubenswrapper[4702]: I1203 12:46:26.782009 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-76dff8487c-68ktn_6cbbba51-9166-42cb-917c-7c634351e5c9/opa/0.log" Dec 03 12:46:26 crc kubenswrapper[4702]: I1203 12:46:26.916040 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-76dff8487c-mdlcz_72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653/gateway/0.log" Dec 03 12:46:26 crc kubenswrapper[4702]: I1203 12:46:26.942049 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-76dff8487c-mdlcz_72b0cdbc-a8b4-4ab5-bbb6-0828c9afc653/opa/0.log" Dec 03 12:46:27 crc kubenswrapper[4702]: I1203 12:46:27.144004 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_07e2709a-6aac-4b21-8fc8-bfc21992aae3/loki-index-gateway/0.log" Dec 03 12:46:27 crc kubenswrapper[4702]: I1203 12:46:27.253262 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_6ce47478-68cc-46a9-99c3-cb20947e63c5/loki-ingester/0.log" Dec 03 12:46:27 crc kubenswrapper[4702]: I1203 12:46:27.404680 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-5895d59bb8-bhqrp_042cc406-7960-493a-a19a-cb5590f8ff1f/loki-querier/0.log" Dec 03 12:46:27 crc kubenswrapper[4702]: I1203 12:46:27.474381 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-84558f7c9f-dflgw_dbea18cb-2e45-4d86-bc00-17a82f0a78ff/loki-query-frontend/0.log" Dec 03 12:46:31 crc kubenswrapper[4702]: I1203 12:46:31.928811 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" Dec 03 12:46:31 crc kubenswrapper[4702]: E1203 12:46:31.929695 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:46:43 crc kubenswrapper[4702]: I1203 12:46:43.222742 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-2npsf_d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c/controller/1.log" Dec 03 12:46:43 crc kubenswrapper[4702]: I1203 12:46:43.401226 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-2npsf_d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c/controller/0.log" Dec 03 12:46:43 crc kubenswrapper[4702]: I1203 12:46:43.500647 4702 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-f8648f98b-2npsf_d6c1b66f-fa02-4889-a7fe-f7fe0d467c7c/kube-rbac-proxy/0.log" Dec 03 12:46:43 crc kubenswrapper[4702]: I1203 12:46:43.580442 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-slqp5_b3a5cd30-f098-4e9c-bbb0-f45305893017/frr-k8s-webhook-server/0.log" Dec 03 12:46:43 crc kubenswrapper[4702]: I1203 12:46:43.739024 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wh75l_7643d370-6497-4a94-b0e7-2db66b56b687/cp-frr-files/0.log" Dec 03 12:46:43 crc kubenswrapper[4702]: I1203 12:46:43.930900 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wh75l_7643d370-6497-4a94-b0e7-2db66b56b687/cp-frr-files/0.log" Dec 03 12:46:43 crc kubenswrapper[4702]: I1203 12:46:43.933236 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wh75l_7643d370-6497-4a94-b0e7-2db66b56b687/cp-reloader/0.log" Dec 03 12:46:43 crc kubenswrapper[4702]: I1203 12:46:43.998748 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wh75l_7643d370-6497-4a94-b0e7-2db66b56b687/cp-metrics/0.log" Dec 03 12:46:44 crc kubenswrapper[4702]: I1203 12:46:44.037451 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wh75l_7643d370-6497-4a94-b0e7-2db66b56b687/cp-reloader/0.log" Dec 03 12:46:44 crc kubenswrapper[4702]: I1203 12:46:44.235418 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wh75l_7643d370-6497-4a94-b0e7-2db66b56b687/cp-frr-files/0.log" Dec 03 12:46:44 crc kubenswrapper[4702]: I1203 12:46:44.265533 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wh75l_7643d370-6497-4a94-b0e7-2db66b56b687/cp-reloader/0.log" Dec 03 12:46:44 crc kubenswrapper[4702]: I1203 12:46:44.323966 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wh75l_7643d370-6497-4a94-b0e7-2db66b56b687/cp-metrics/0.log" Dec 03 12:46:44 crc kubenswrapper[4702]: I1203 12:46:44.324183 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wh75l_7643d370-6497-4a94-b0e7-2db66b56b687/cp-metrics/0.log" Dec 03 12:46:44 crc kubenswrapper[4702]: I1203 12:46:44.535020 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wh75l_7643d370-6497-4a94-b0e7-2db66b56b687/cp-reloader/0.log" Dec 03 12:46:44 crc kubenswrapper[4702]: I1203 12:46:44.552723 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wh75l_7643d370-6497-4a94-b0e7-2db66b56b687/cp-frr-files/0.log" Dec 03 12:46:44 crc kubenswrapper[4702]: I1203 12:46:44.559103 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wh75l_7643d370-6497-4a94-b0e7-2db66b56b687/controller/0.log" Dec 03 12:46:44 crc kubenswrapper[4702]: I1203 12:46:44.580213 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wh75l_7643d370-6497-4a94-b0e7-2db66b56b687/cp-metrics/0.log" Dec 03 12:46:44 crc kubenswrapper[4702]: I1203 12:46:44.778744 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wh75l_7643d370-6497-4a94-b0e7-2db66b56b687/frr/1.log" Dec 03 12:46:44 crc kubenswrapper[4702]: I1203 12:46:44.812201 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wh75l_7643d370-6497-4a94-b0e7-2db66b56b687/kube-rbac-proxy/0.log" Dec 03 
12:46:44 crc kubenswrapper[4702]: I1203 12:46:44.815464 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wh75l_7643d370-6497-4a94-b0e7-2db66b56b687/frr-metrics/0.log" Dec 03 12:46:45 crc kubenswrapper[4702]: I1203 12:46:45.108261 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wh75l_7643d370-6497-4a94-b0e7-2db66b56b687/kube-rbac-proxy-frr/0.log" Dec 03 12:46:45 crc kubenswrapper[4702]: I1203 12:46:45.129071 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wh75l_7643d370-6497-4a94-b0e7-2db66b56b687/reloader/0.log" Dec 03 12:46:45 crc kubenswrapper[4702]: I1203 12:46:45.403535 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-85d7874b49-jvs5t_6e99cffd-b82e-46c9-8cbd-fe8c24507385/manager/0.log" Dec 03 12:46:45 crc kubenswrapper[4702]: I1203 12:46:45.440401 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-85d7874b49-jvs5t_6e99cffd-b82e-46c9-8cbd-fe8c24507385/manager/1.log" Dec 03 12:46:45 crc kubenswrapper[4702]: I1203 12:46:45.723303 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-66c548d864-tr7qq_2cb93136-1d69-4bc8-9c42-aee1f6638aa6/webhook-server/1.log" Dec 03 12:46:45 crc kubenswrapper[4702]: I1203 12:46:45.728911 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-66c548d864-tr7qq_2cb93136-1d69-4bc8-9c42-aee1f6638aa6/webhook-server/0.log" Dec 03 12:46:45 crc kubenswrapper[4702]: I1203 12:46:45.931688 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" Dec 03 12:46:45 crc kubenswrapper[4702]: E1203 12:46:45.933859 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:46:45 crc kubenswrapper[4702]: I1203 12:46:45.984164 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bv8pf_4be204bf-b480-4d77-9ced-34c6668afa14/kube-rbac-proxy/0.log" Dec 03 12:46:46 crc kubenswrapper[4702]: I1203 12:46:46.310006 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bv8pf_4be204bf-b480-4d77-9ced-34c6668afa14/speaker/1.log" Dec 03 12:46:46 crc kubenswrapper[4702]: I1203 12:46:46.857975 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bv8pf_4be204bf-b480-4d77-9ced-34c6668afa14/speaker/0.log" Dec 03 12:46:47 crc kubenswrapper[4702]: I1203 12:46:47.093684 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-wh75l_7643d370-6497-4a94-b0e7-2db66b56b687/frr/0.log" Dec 03 12:46:56 crc kubenswrapper[4702]: I1203 12:46:56.939713 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" Dec 03 12:46:56 crc kubenswrapper[4702]: E1203 12:46:56.940593 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:47:03 crc kubenswrapper[4702]: I1203 12:47:03.301349 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk_aabceec8-9509-4d66-af3e-1b9d9a270b38/util/0.log" Dec 03 12:47:03 crc kubenswrapper[4702]: I1203 12:47:03.503646 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk_aabceec8-9509-4d66-af3e-1b9d9a270b38/pull/0.log" Dec 03 12:47:03 crc kubenswrapper[4702]: I1203 12:47:03.507488 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk_aabceec8-9509-4d66-af3e-1b9d9a270b38/util/0.log" Dec 03 12:47:03 crc kubenswrapper[4702]: I1203 12:47:03.614785 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk_aabceec8-9509-4d66-af3e-1b9d9a270b38/pull/0.log" Dec 03 12:47:04 crc kubenswrapper[4702]: I1203 12:47:04.457694 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk_aabceec8-9509-4d66-af3e-1b9d9a270b38/util/0.log" Dec 03 12:47:04 crc kubenswrapper[4702]: I1203 12:47:04.477011 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk_aabceec8-9509-4d66-af3e-1b9d9a270b38/extract/0.log" Dec 03 12:47:04 crc kubenswrapper[4702]: I1203 12:47:04.488051 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4529ed37fc81381df2b45ea09e6f1b4af8d1558d603912431befd8aeb8zd9wk_aabceec8-9509-4d66-af3e-1b9d9a270b38/pull/0.log" Dec 03 12:47:04 crc kubenswrapper[4702]: I1203 12:47:04.663914 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d_2a038246-a8c1-4a5d-8ef4-250eaf126ace/util/0.log" Dec 03 12:47:04 crc kubenswrapper[4702]: I1203 12:47:04.873497 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d_2a038246-a8c1-4a5d-8ef4-250eaf126ace/util/0.log" Dec 03 12:47:04 crc kubenswrapper[4702]: I1203 12:47:04.874008 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d_2a038246-a8c1-4a5d-8ef4-250eaf126ace/pull/0.log" Dec 03 12:47:04 crc kubenswrapper[4702]: I1203 12:47:04.900929 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d_2a038246-a8c1-4a5d-8ef4-250eaf126ace/pull/0.log" Dec 03 12:47:05 crc kubenswrapper[4702]: I1203 12:47:05.075261 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d_2a038246-a8c1-4a5d-8ef4-250eaf126ace/extract/0.log" Dec 03 12:47:05 crc kubenswrapper[4702]: I1203 12:47:05.079802 4702 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d_2a038246-a8c1-4a5d-8ef4-250eaf126ace/util/0.log" Dec 03 12:47:05 crc kubenswrapper[4702]: I1203 12:47:05.109919 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f8tg2d_2a038246-a8c1-4a5d-8ef4-250eaf126ace/pull/0.log" Dec 03 12:47:05 crc kubenswrapper[4702]: I1203 12:47:05.305302 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9_d3aa75a4-714d-43ca-9a0f-82bd64ae31cd/util/0.log" Dec 03 12:47:05 crc kubenswrapper[4702]: I1203 12:47:05.473285 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9_d3aa75a4-714d-43ca-9a0f-82bd64ae31cd/util/0.log" Dec 03 12:47:05 crc kubenswrapper[4702]: I1203 12:47:05.501329 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9_d3aa75a4-714d-43ca-9a0f-82bd64ae31cd/pull/0.log" Dec 03 12:47:05 crc kubenswrapper[4702]: I1203 12:47:05.531655 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9_d3aa75a4-714d-43ca-9a0f-82bd64ae31cd/pull/0.log" Dec 03 12:47:05 crc kubenswrapper[4702]: I1203 12:47:05.681184 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9_d3aa75a4-714d-43ca-9a0f-82bd64ae31cd/util/0.log" Dec 03 12:47:05 crc kubenswrapper[4702]: I1203 12:47:05.701914 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9_d3aa75a4-714d-43ca-9a0f-82bd64ae31cd/pull/0.log" Dec 03 12:47:05 crc kubenswrapper[4702]: I1203 12:47:05.714576 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kqzx9_d3aa75a4-714d-43ca-9a0f-82bd64ae31cd/extract/0.log" Dec 03 12:47:05 crc kubenswrapper[4702]: I1203 12:47:05.855780 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx_be1348c4-10f9-4f68-9ade-51ff820cd05a/util/0.log" Dec 03 12:47:06 crc kubenswrapper[4702]: I1203 12:47:06.057419 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx_be1348c4-10f9-4f68-9ade-51ff820cd05a/util/0.log" Dec 03 12:47:06 crc kubenswrapper[4702]: I1203 12:47:06.524646 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx_be1348c4-10f9-4f68-9ade-51ff820cd05a/extract/0.log" Dec 03 12:47:06 crc kubenswrapper[4702]: I1203 12:47:06.526194 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx_be1348c4-10f9-4f68-9ade-51ff820cd05a/pull/0.log" Dec 03 12:47:06 crc kubenswrapper[4702]: I1203 12:47:06.526246 4702 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx_be1348c4-10f9-4f68-9ade-51ff820cd05a/pull/0.log" Dec 03 12:47:06 crc kubenswrapper[4702]: I1203 12:47:06.526607 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx_be1348c4-10f9-4f68-9ade-51ff820cd05a/pull/0.log" Dec 03 12:47:06 crc kubenswrapper[4702]: I1203 12:47:06.775906 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a8a03f72555e3294619fd3c0a789fa82d1f6921a8cf9935ed9b211463fsbdxx_be1348c4-10f9-4f68-9ade-51ff820cd05a/util/0.log" Dec 03 12:47:06 crc kubenswrapper[4702]: I1203 12:47:06.785958 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj_3638051c-7f0b-4e32-81c4-e6e327fc5a8b/util/0.log" Dec 03 12:47:06 crc kubenswrapper[4702]: I1203 12:47:06.914471 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj_3638051c-7f0b-4e32-81c4-e6e327fc5a8b/util/0.log" Dec 03 12:47:06 crc kubenswrapper[4702]: I1203 12:47:06.943585 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj_3638051c-7f0b-4e32-81c4-e6e327fc5a8b/pull/0.log" Dec 03 12:47:06 crc kubenswrapper[4702]: I1203 12:47:06.953931 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj_3638051c-7f0b-4e32-81c4-e6e327fc5a8b/pull/0.log" Dec 03 12:47:07 crc kubenswrapper[4702]: I1203 12:47:07.117577 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj_3638051c-7f0b-4e32-81c4-e6e327fc5a8b/util/0.log" Dec 03 12:47:07 crc kubenswrapper[4702]: I1203 12:47:07.162078 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj_3638051c-7f0b-4e32-81c4-e6e327fc5a8b/extract/0.log" Dec 03 12:47:07 crc kubenswrapper[4702]: I1203 12:47:07.163259 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83nlczj_3638051c-7f0b-4e32-81c4-e6e327fc5a8b/pull/0.log" Dec 03 12:47:07 crc kubenswrapper[4702]: I1203 12:47:07.231208 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7ntqc_38c7c63b-db59-4055-aee0-99ea082bd8f7/extract-utilities/0.log" Dec 03 12:47:07 crc kubenswrapper[4702]: I1203 12:47:07.388459 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7ntqc_38c7c63b-db59-4055-aee0-99ea082bd8f7/extract-utilities/0.log" Dec 03 12:47:07 crc kubenswrapper[4702]: I1203 12:47:07.396704 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7ntqc_38c7c63b-db59-4055-aee0-99ea082bd8f7/extract-content/0.log" Dec 03 12:47:07 crc kubenswrapper[4702]: I1203 12:47:07.415988 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7ntqc_38c7c63b-db59-4055-aee0-99ea082bd8f7/extract-content/0.log" Dec 03 12:47:07 crc kubenswrapper[4702]: I1203 12:47:07.627052 4702 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7ntqc_38c7c63b-db59-4055-aee0-99ea082bd8f7/extract-content/0.log" Dec 03 12:47:07 crc kubenswrapper[4702]: I1203 12:47:07.635645 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7ntqc_38c7c63b-db59-4055-aee0-99ea082bd8f7/extract-utilities/0.log" Dec 03 12:47:07 crc kubenswrapper[4702]: I1203 12:47:07.709953 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7ntqc_38c7c63b-db59-4055-aee0-99ea082bd8f7/registry-server/1.log" Dec 03 12:47:07 crc kubenswrapper[4702]: I1203 12:47:07.852406 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m6cqt_0f2dd872-6ac4-4527-9a91-218b1de5ed5e/extract-utilities/0.log" Dec 03 12:47:08 crc kubenswrapper[4702]: I1203 12:47:08.069362 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m6cqt_0f2dd872-6ac4-4527-9a91-218b1de5ed5e/extract-utilities/0.log" Dec 03 12:47:08 crc kubenswrapper[4702]: I1203 12:47:08.087239 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m6cqt_0f2dd872-6ac4-4527-9a91-218b1de5ed5e/extract-content/0.log" Dec 03 12:47:08 crc kubenswrapper[4702]: I1203 12:47:08.143870 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m6cqt_0f2dd872-6ac4-4527-9a91-218b1de5ed5e/extract-content/0.log" Dec 03 12:47:08 crc kubenswrapper[4702]: I1203 12:47:08.271659 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-7ntqc_38c7c63b-db59-4055-aee0-99ea082bd8f7/registry-server/0.log" Dec 03 12:47:08 crc kubenswrapper[4702]: I1203 12:47:08.308894 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m6cqt_0f2dd872-6ac4-4527-9a91-218b1de5ed5e/extract-content/0.log" Dec 03 12:47:08 crc kubenswrapper[4702]: I1203 12:47:08.343259 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m6cqt_0f2dd872-6ac4-4527-9a91-218b1de5ed5e/extract-utilities/0.log" Dec 03 12:47:08 crc kubenswrapper[4702]: I1203 12:47:08.519023 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-79n82_f2c1609d-33a3-444f-9370-24495b15b3e0/marketplace-operator/0.log" Dec 03 12:47:08 crc kubenswrapper[4702]: I1203 12:47:08.751270 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6sd9l_ea8c3262-d494-4427-8228-df9584c00ca1/extract-utilities/0.log" Dec 03 12:47:08 crc kubenswrapper[4702]: I1203 12:47:08.818989 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m6cqt_0f2dd872-6ac4-4527-9a91-218b1de5ed5e/registry-server/1.log" Dec 03 12:47:08 crc kubenswrapper[4702]: I1203 12:47:08.979611 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6sd9l_ea8c3262-d494-4427-8228-df9584c00ca1/extract-utilities/0.log" Dec 03 12:47:09 crc kubenswrapper[4702]: I1203 12:47:09.032552 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6sd9l_ea8c3262-d494-4427-8228-df9584c00ca1/extract-content/0.log" Dec 03 12:47:09 crc kubenswrapper[4702]: I1203 12:47:09.049623 4702 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6sd9l_ea8c3262-d494-4427-8228-df9584c00ca1/extract-content/0.log" Dec 03 12:47:09 crc kubenswrapper[4702]: I1203 12:47:09.218897 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6sd9l_ea8c3262-d494-4427-8228-df9584c00ca1/extract-content/0.log" Dec 03 12:47:09 crc kubenswrapper[4702]: I1203 12:47:09.227702 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6sd9l_ea8c3262-d494-4427-8228-df9584c00ca1/extract-utilities/0.log" Dec 03 12:47:09 crc kubenswrapper[4702]: I1203 12:47:09.401441 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qhss9_480aa817-7d43-4ea8-9099-06bcb431e578/extract-utilities/0.log" Dec 03 12:47:09 crc kubenswrapper[4702]: I1203 12:47:09.586999 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qhss9_480aa817-7d43-4ea8-9099-06bcb431e578/extract-utilities/0.log" Dec 03 12:47:09 crc kubenswrapper[4702]: I1203 12:47:09.738694 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qhss9_480aa817-7d43-4ea8-9099-06bcb431e578/extract-content/0.log" Dec 03 12:47:09 crc kubenswrapper[4702]: I1203 12:47:09.883234 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qhss9_480aa817-7d43-4ea8-9099-06bcb431e578/extract-utilities/0.log" Dec 03 12:47:09 crc kubenswrapper[4702]: I1203 12:47:09.895119 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qhss9_480aa817-7d43-4ea8-9099-06bcb431e578/extract-content/0.log" Dec 03 12:47:09 crc kubenswrapper[4702]: I1203 12:47:09.926446 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6sd9l_ea8c3262-d494-4427-8228-df9584c00ca1/registry-server/1.log" Dec 03 12:47:10 crc kubenswrapper[4702]: I1203 12:47:10.071090 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qhss9_480aa817-7d43-4ea8-9099-06bcb431e578/extract-content/0.log" Dec 03 12:47:10 crc kubenswrapper[4702]: I1203 12:47:10.346091 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qhss9_480aa817-7d43-4ea8-9099-06bcb431e578/registry-server/1.log" Dec 03 12:47:10 crc kubenswrapper[4702]: I1203 12:47:10.355228 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6sd9l_ea8c3262-d494-4427-8228-df9584c00ca1/registry-server/0.log" Dec 03 12:47:11 crc kubenswrapper[4702]: I1203 12:47:11.514484 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-m6cqt_0f2dd872-6ac4-4527-9a91-218b1de5ed5e/registry-server/0.log" Dec 03 12:47:11 crc kubenswrapper[4702]: I1203 12:47:11.596613 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qhss9_480aa817-7d43-4ea8-9099-06bcb431e578/registry-server/0.log" Dec 03 12:47:11 crc kubenswrapper[4702]: I1203 12:47:11.929156 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" Dec 03 12:47:11 crc kubenswrapper[4702]: E1203 12:47:11.929537 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:47:25 crc kubenswrapper[4702]: I1203 12:47:25.864139 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-h9rvk_db361c90-107c-4510-9683-659b755ebc42/prometheus-operator/0.log" Dec 03 12:47:25 crc kubenswrapper[4702]: I1203 12:47:25.928889 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" Dec 03 12:47:25 crc kubenswrapper[4702]: E1203 12:47:25.929234 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:47:26 crc kubenswrapper[4702]: I1203 12:47:26.111320 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8464f97cb5-ccgbd_a45f4fed-ae1f-4d04-8394-8208bbe31b44/prometheus-operator-admission-webhook/0.log" Dec 03 12:47:26 crc kubenswrapper[4702]: I1203 12:47:26.174306 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-8464f97cb5-njc5k_5e5ed31e-a6b7-494f-ae2e-8c824e717092/prometheus-operator-admission-webhook/0.log" Dec 03 12:47:26 crc kubenswrapper[4702]: I1203 12:47:26.359078 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-55ql9_89d80ae9-23a4-4c91-a04d-7343d8a4df05/operator/0.log" Dec 03 12:47:26 crc kubenswrapper[4702]: I1203 12:47:26.443319 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7d5fb4cbfb-7x6j5_4201f396-8ff3-4b7b-82d2-f26cc129b3f9/observability-ui-dashboards/0.log" Dec 03 12:47:26 crc kubenswrapper[4702]: I1203 12:47:26.599296 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-vpr8z_ec0726c3-58ef-4a22-8e00-bae32d7d66ca/perses-operator/1.log" Dec 03 12:47:26 crc kubenswrapper[4702]: I1203 12:47:26.652494 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-vpr8z_ec0726c3-58ef-4a22-8e00-bae32d7d66ca/perses-operator/0.log" Dec 03 12:47:39 crc kubenswrapper[4702]: I1203 12:47:39.929121 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" Dec 03 12:47:39 crc kubenswrapper[4702]: E1203 12:47:39.931437 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:47:40 crc kubenswrapper[4702]: I1203 12:47:40.418695 4702 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5688675f7c-q6w79_672e4a37-26c7-4378-a524-57fba88aec53/manager/0.log" Dec 03 12:47:40 crc kubenswrapper[4702]: I1203 12:47:40.426563 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5688675f7c-q6w79_672e4a37-26c7-4378-a524-57fba88aec53/kube-rbac-proxy/0.log" Dec 03 12:47:40 crc kubenswrapper[4702]: I1203 12:47:40.439591 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-5688675f7c-q6w79_672e4a37-26c7-4378-a524-57fba88aec53/manager/1.log" Dec 03 12:47:54 crc kubenswrapper[4702]: I1203 12:47:54.929658 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" Dec 03 12:47:54 crc kubenswrapper[4702]: E1203 12:47:54.930378 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:48:06 crc kubenswrapper[4702]: I1203 12:48:06.937741 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" Dec 03 12:48:06 crc kubenswrapper[4702]: E1203 12:48:06.938667 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:48:18 crc kubenswrapper[4702]: I1203 12:48:18.928860 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" Dec 03 12:48:18 crc kubenswrapper[4702]: E1203 12:48:18.929807 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:48:30 crc kubenswrapper[4702]: I1203 12:48:30.927952 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" Dec 03 12:48:30 crc kubenswrapper[4702]: E1203 12:48:30.930441 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:48:45 crc kubenswrapper[4702]: I1203 12:48:45.929038 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" 
Dec 03 12:48:45 crc kubenswrapper[4702]: E1203 12:48:45.929944 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:48:56 crc kubenswrapper[4702]: I1203 12:48:56.940815 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" Dec 03 12:48:56 crc kubenswrapper[4702]: E1203 12:48:56.941627 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:49:07 crc kubenswrapper[4702]: I1203 12:49:07.042583 4702 trace.go:236] Trace[650140396]: "Calculate volume metrics of storage for pod openshift-logging/logging-loki-index-gateway-0" (03-Dec-2025 12:49:05.889) (total time: 1152ms): Dec 03 12:49:07 crc kubenswrapper[4702]: Trace[650140396]: [1.152882206s] [1.152882206s] END Dec 03 12:49:11 crc kubenswrapper[4702]: I1203 12:49:11.928797 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" Dec 03 12:49:11 crc kubenswrapper[4702]: E1203 12:49:11.929619 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:49:22 crc kubenswrapper[4702]: I1203 12:49:22.932106 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" Dec 03 12:49:22 crc kubenswrapper[4702]: E1203 12:49:22.933089 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" Dec 03 12:49:33 crc kubenswrapper[4702]: I1203 12:49:33.928336 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" Dec 03 12:49:35 crc kubenswrapper[4702]: I1203 12:49:35.216702 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"d55ed227ca79a527f938cb9f1414810c0b9a98eb3aa70ee1eb863c1b8617d357"} Dec 03 12:49:45 crc kubenswrapper[4702]: I1203 12:49:45.712733 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-klrtq"] Dec 03 12:49:45 crc 
kubenswrapper[4702]: E1203 12:49:45.714112 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7b8a6a1-a660-4c51-9a57-80a11da3a98e" containerName="collect-profiles" Dec 03 12:49:45 crc kubenswrapper[4702]: I1203 12:49:45.714128 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7b8a6a1-a660-4c51-9a57-80a11da3a98e" containerName="collect-profiles" Dec 03 12:49:45 crc kubenswrapper[4702]: I1203 12:49:45.714373 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7b8a6a1-a660-4c51-9a57-80a11da3a98e" containerName="collect-profiles" Dec 03 12:49:45 crc kubenswrapper[4702]: I1203 12:49:45.717258 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-klrtq" Dec 03 12:49:45 crc kubenswrapper[4702]: I1203 12:49:45.733406 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-klrtq"] Dec 03 12:49:45 crc kubenswrapper[4702]: I1203 12:49:45.863951 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qcll\" (UniqueName: \"kubernetes.io/projected/55eea2d6-2cbd-41bf-9041-18b99c88a795-kube-api-access-6qcll\") pod \"community-operators-klrtq\" (UID: \"55eea2d6-2cbd-41bf-9041-18b99c88a795\") " pod="openshift-marketplace/community-operators-klrtq" Dec 03 12:49:45 crc kubenswrapper[4702]: I1203 12:49:45.864031 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55eea2d6-2cbd-41bf-9041-18b99c88a795-catalog-content\") pod \"community-operators-klrtq\" (UID: \"55eea2d6-2cbd-41bf-9041-18b99c88a795\") " pod="openshift-marketplace/community-operators-klrtq" Dec 03 12:49:45 crc kubenswrapper[4702]: I1203 12:49:45.864180 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55eea2d6-2cbd-41bf-9041-18b99c88a795-utilities\") pod \"community-operators-klrtq\" (UID: \"55eea2d6-2cbd-41bf-9041-18b99c88a795\") " pod="openshift-marketplace/community-operators-klrtq" Dec 03 12:49:45 crc kubenswrapper[4702]: I1203 12:49:45.966840 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55eea2d6-2cbd-41bf-9041-18b99c88a795-utilities\") pod \"community-operators-klrtq\" (UID: \"55eea2d6-2cbd-41bf-9041-18b99c88a795\") " pod="openshift-marketplace/community-operators-klrtq" Dec 03 12:49:45 crc kubenswrapper[4702]: I1203 12:49:45.966980 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qcll\" (UniqueName: \"kubernetes.io/projected/55eea2d6-2cbd-41bf-9041-18b99c88a795-kube-api-access-6qcll\") pod \"community-operators-klrtq\" (UID: \"55eea2d6-2cbd-41bf-9041-18b99c88a795\") " pod="openshift-marketplace/community-operators-klrtq" Dec 03 12:49:45 crc kubenswrapper[4702]: I1203 12:49:45.967065 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55eea2d6-2cbd-41bf-9041-18b99c88a795-catalog-content\") pod \"community-operators-klrtq\" (UID: \"55eea2d6-2cbd-41bf-9041-18b99c88a795\") " pod="openshift-marketplace/community-operators-klrtq" Dec 03 12:49:45 crc kubenswrapper[4702]: I1203 12:49:45.968541 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/55eea2d6-2cbd-41bf-9041-18b99c88a795-utilities\") pod \"community-operators-klrtq\" (UID: \"55eea2d6-2cbd-41bf-9041-18b99c88a795\") " pod="openshift-marketplace/community-operators-klrtq" Dec 03 12:49:45 crc kubenswrapper[4702]: I1203 12:49:45.969125 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55eea2d6-2cbd-41bf-9041-18b99c88a795-catalog-content\") pod \"community-operators-klrtq\" (UID: \"55eea2d6-2cbd-41bf-9041-18b99c88a795\") " pod="openshift-marketplace/community-operators-klrtq" Dec 03 12:49:45 crc kubenswrapper[4702]: I1203 12:49:45.992474 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qcll\" (UniqueName: \"kubernetes.io/projected/55eea2d6-2cbd-41bf-9041-18b99c88a795-kube-api-access-6qcll\") pod \"community-operators-klrtq\" (UID: \"55eea2d6-2cbd-41bf-9041-18b99c88a795\") " pod="openshift-marketplace/community-operators-klrtq" Dec 03 12:49:46 crc kubenswrapper[4702]: I1203 12:49:46.046513 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-klrtq" Dec 03 12:49:46 crc kubenswrapper[4702]: W1203 12:49:46.826133 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55eea2d6_2cbd_41bf_9041_18b99c88a795.slice/crio-cdf1ab54c375dabf50125e8d01474e41c7a29282549e38f0581eda2daba01de2 WatchSource:0}: Error finding container cdf1ab54c375dabf50125e8d01474e41c7a29282549e38f0581eda2daba01de2: Status 404 returned error can't find the container with id cdf1ab54c375dabf50125e8d01474e41c7a29282549e38f0581eda2daba01de2 Dec 03 12:49:46 crc kubenswrapper[4702]: I1203 12:49:46.833083 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-klrtq"] Dec 03 12:49:47 crc kubenswrapper[4702]: I1203 12:49:47.365746 4702 generic.go:334] "Generic (PLEG): container finished" podID="55eea2d6-2cbd-41bf-9041-18b99c88a795" containerID="e4bb824e515fff30a72c61506fb569a29c146efcf9592e8502ec863c2a2ea29b" exitCode=0 Dec 03 12:49:47 crc kubenswrapper[4702]: I1203 12:49:47.365798 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klrtq" event={"ID":"55eea2d6-2cbd-41bf-9041-18b99c88a795","Type":"ContainerDied","Data":"e4bb824e515fff30a72c61506fb569a29c146efcf9592e8502ec863c2a2ea29b"} Dec 03 12:49:47 crc kubenswrapper[4702]: I1203 12:49:47.366055 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klrtq" event={"ID":"55eea2d6-2cbd-41bf-9041-18b99c88a795","Type":"ContainerStarted","Data":"cdf1ab54c375dabf50125e8d01474e41c7a29282549e38f0581eda2daba01de2"} Dec 03 12:49:47 crc kubenswrapper[4702]: I1203 12:49:47.370442 4702 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 12:49:50 crc kubenswrapper[4702]: I1203 12:49:50.408159 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klrtq" event={"ID":"55eea2d6-2cbd-41bf-9041-18b99c88a795","Type":"ContainerStarted","Data":"aaecd1e0adf29f586d7d0c6607fd63b20ad603b234f7db9fb68d5ae12ac16770"} Dec 03 12:49:51 crc kubenswrapper[4702]: I1203 12:49:51.422403 4702 generic.go:334] "Generic (PLEG): container finished" podID="55eea2d6-2cbd-41bf-9041-18b99c88a795" containerID="aaecd1e0adf29f586d7d0c6607fd63b20ad603b234f7db9fb68d5ae12ac16770" exitCode=0 
Dec 03 12:49:51 crc kubenswrapper[4702]: I1203 12:49:51.422517 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klrtq" event={"ID":"55eea2d6-2cbd-41bf-9041-18b99c88a795","Type":"ContainerDied","Data":"aaecd1e0adf29f586d7d0c6607fd63b20ad603b234f7db9fb68d5ae12ac16770"} Dec 03 12:49:52 crc kubenswrapper[4702]: I1203 12:49:52.439449 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klrtq" event={"ID":"55eea2d6-2cbd-41bf-9041-18b99c88a795","Type":"ContainerStarted","Data":"680cf32965bd4729ad90f4b323ab40db5e20b2a5745e139664ab9619e7010136"} Dec 03 12:49:52 crc kubenswrapper[4702]: I1203 12:49:52.904167 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-klrtq" podStartSLOduration=3.309528295 podStartE2EDuration="7.904143237s" podCreationTimestamp="2025-12-03 12:49:45 +0000 UTC" firstStartedPulling="2025-12-03 12:49:47.369889879 +0000 UTC m=+6371.205818343" lastFinishedPulling="2025-12-03 12:49:51.964504811 +0000 UTC m=+6375.800433285" observedRunningTime="2025-12-03 12:49:52.474481916 +0000 UTC m=+6376.310410400" watchObservedRunningTime="2025-12-03 12:49:52.904143237 +0000 UTC m=+6376.740071701" Dec 03 12:49:55 crc kubenswrapper[4702]: I1203 12:49:55.485537 4702 generic.go:334] "Generic (PLEG): container finished" podID="2a8ef37d-7f53-449e-b954-d1624312e255" containerID="0e300d9dcd1bbbf006690d16340a699de545d6ccc49da48aafa7aecce3daa88e" exitCode=0 Dec 03 12:49:55 crc kubenswrapper[4702]: I1203 12:49:55.485891 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pbxps/must-gather-64pl7" event={"ID":"2a8ef37d-7f53-449e-b954-d1624312e255","Type":"ContainerDied","Data":"0e300d9dcd1bbbf006690d16340a699de545d6ccc49da48aafa7aecce3daa88e"} Dec 03 12:49:55 crc kubenswrapper[4702]: I1203 12:49:55.488047 4702 scope.go:117] "RemoveContainer" containerID="0e300d9dcd1bbbf006690d16340a699de545d6ccc49da48aafa7aecce3daa88e" Dec 03 12:49:55 crc kubenswrapper[4702]: I1203 12:49:55.646125 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pbxps_must-gather-64pl7_2a8ef37d-7f53-449e-b954-d1624312e255/gather/0.log" Dec 03 12:49:56 crc kubenswrapper[4702]: I1203 12:49:56.046923 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-klrtq" Dec 03 12:49:56 crc kubenswrapper[4702]: I1203 12:49:56.046999 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-klrtq" Dec 03 12:49:57 crc kubenswrapper[4702]: I1203 12:49:57.108957 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-klrtq" podUID="55eea2d6-2cbd-41bf-9041-18b99c88a795" containerName="registry-server" probeResult="failure" output=< Dec 03 12:49:57 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:49:57 crc kubenswrapper[4702]: > Dec 03 12:50:04 crc kubenswrapper[4702]: I1203 12:50:04.164428 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pbxps/must-gather-64pl7"] Dec 03 12:50:04 crc kubenswrapper[4702]: I1203 12:50:04.165391 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-pbxps/must-gather-64pl7" podUID="2a8ef37d-7f53-449e-b954-d1624312e255" containerName="copy" 
containerID="cri-o://6bf5de5bebca6c8c4d22dfdd5777a3f0bae0d5d30ccd36c25a8931b9aefb5fb4" gracePeriod=2 Dec 03 12:50:04 crc kubenswrapper[4702]: I1203 12:50:04.178336 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pbxps/must-gather-64pl7"] Dec 03 12:50:04 crc kubenswrapper[4702]: I1203 12:50:04.774724 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pbxps_must-gather-64pl7_2a8ef37d-7f53-449e-b954-d1624312e255/copy/0.log" Dec 03 12:50:04 crc kubenswrapper[4702]: I1203 12:50:04.776189 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pbxps/must-gather-64pl7" Dec 03 12:50:04 crc kubenswrapper[4702]: I1203 12:50:04.806609 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlf79\" (UniqueName: \"kubernetes.io/projected/2a8ef37d-7f53-449e-b954-d1624312e255-kube-api-access-qlf79\") pod \"2a8ef37d-7f53-449e-b954-d1624312e255\" (UID: \"2a8ef37d-7f53-449e-b954-d1624312e255\") " Dec 03 12:50:04 crc kubenswrapper[4702]: I1203 12:50:04.807013 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2a8ef37d-7f53-449e-b954-d1624312e255-must-gather-output\") pod \"2a8ef37d-7f53-449e-b954-d1624312e255\" (UID: \"2a8ef37d-7f53-449e-b954-d1624312e255\") " Dec 03 12:50:04 crc kubenswrapper[4702]: I1203 12:50:04.817288 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8ef37d-7f53-449e-b954-d1624312e255-kube-api-access-qlf79" (OuterVolumeSpecName: "kube-api-access-qlf79") pod "2a8ef37d-7f53-449e-b954-d1624312e255" (UID: "2a8ef37d-7f53-449e-b954-d1624312e255"). InnerVolumeSpecName "kube-api-access-qlf79". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:50:04 crc kubenswrapper[4702]: I1203 12:50:04.913147 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlf79\" (UniqueName: \"kubernetes.io/projected/2a8ef37d-7f53-449e-b954-d1624312e255-kube-api-access-qlf79\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:04 crc kubenswrapper[4702]: I1203 12:50:04.913788 4702 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pbxps_must-gather-64pl7_2a8ef37d-7f53-449e-b954-d1624312e255/copy/0.log" Dec 03 12:50:04 crc kubenswrapper[4702]: I1203 12:50:04.916373 4702 generic.go:334] "Generic (PLEG): container finished" podID="2a8ef37d-7f53-449e-b954-d1624312e255" containerID="6bf5de5bebca6c8c4d22dfdd5777a3f0bae0d5d30ccd36c25a8931b9aefb5fb4" exitCode=143 Dec 03 12:50:04 crc kubenswrapper[4702]: I1203 12:50:04.916484 4702 scope.go:117] "RemoveContainer" containerID="6bf5de5bebca6c8c4d22dfdd5777a3f0bae0d5d30ccd36c25a8931b9aefb5fb4" Dec 03 12:50:04 crc kubenswrapper[4702]: I1203 12:50:04.916672 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pbxps/must-gather-64pl7" Dec 03 12:50:04 crc kubenswrapper[4702]: I1203 12:50:04.966177 4702 scope.go:117] "RemoveContainer" containerID="0e300d9dcd1bbbf006690d16340a699de545d6ccc49da48aafa7aecce3daa88e" Dec 03 12:50:05 crc kubenswrapper[4702]: I1203 12:50:05.028343 4702 scope.go:117] "RemoveContainer" containerID="6bf5de5bebca6c8c4d22dfdd5777a3f0bae0d5d30ccd36c25a8931b9aefb5fb4" Dec 03 12:50:05 crc kubenswrapper[4702]: E1203 12:50:05.029014 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bf5de5bebca6c8c4d22dfdd5777a3f0bae0d5d30ccd36c25a8931b9aefb5fb4\": container with ID starting with 6bf5de5bebca6c8c4d22dfdd5777a3f0bae0d5d30ccd36c25a8931b9aefb5fb4 not found: ID does not exist" containerID="6bf5de5bebca6c8c4d22dfdd5777a3f0bae0d5d30ccd36c25a8931b9aefb5fb4" Dec 03 12:50:05 crc kubenswrapper[4702]: I1203 12:50:05.029092 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bf5de5bebca6c8c4d22dfdd5777a3f0bae0d5d30ccd36c25a8931b9aefb5fb4"} err="failed to get container status \"6bf5de5bebca6c8c4d22dfdd5777a3f0bae0d5d30ccd36c25a8931b9aefb5fb4\": rpc error: code = NotFound desc = could not find container \"6bf5de5bebca6c8c4d22dfdd5777a3f0bae0d5d30ccd36c25a8931b9aefb5fb4\": container with ID starting with 6bf5de5bebca6c8c4d22dfdd5777a3f0bae0d5d30ccd36c25a8931b9aefb5fb4 not found: ID does not exist" Dec 03 12:50:05 crc kubenswrapper[4702]: I1203 12:50:05.029144 4702 scope.go:117] "RemoveContainer" containerID="0e300d9dcd1bbbf006690d16340a699de545d6ccc49da48aafa7aecce3daa88e" Dec 03 12:50:05 crc kubenswrapper[4702]: E1203 12:50:05.029599 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e300d9dcd1bbbf006690d16340a699de545d6ccc49da48aafa7aecce3daa88e\": container with ID starting with 0e300d9dcd1bbbf006690d16340a699de545d6ccc49da48aafa7aecce3daa88e not found: ID does not exist" containerID="0e300d9dcd1bbbf006690d16340a699de545d6ccc49da48aafa7aecce3daa88e" Dec 03 12:50:05 crc kubenswrapper[4702]: I1203 12:50:05.029652 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e300d9dcd1bbbf006690d16340a699de545d6ccc49da48aafa7aecce3daa88e"} err="failed to get container status \"0e300d9dcd1bbbf006690d16340a699de545d6ccc49da48aafa7aecce3daa88e\": rpc error: code = NotFound desc = could not find container \"0e300d9dcd1bbbf006690d16340a699de545d6ccc49da48aafa7aecce3daa88e\": container with ID starting with 0e300d9dcd1bbbf006690d16340a699de545d6ccc49da48aafa7aecce3daa88e not found: ID does not exist" Dec 03 12:50:05 crc kubenswrapper[4702]: I1203 12:50:05.056047 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a8ef37d-7f53-449e-b954-d1624312e255-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2a8ef37d-7f53-449e-b954-d1624312e255" (UID: "2a8ef37d-7f53-449e-b954-d1624312e255"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:50:05 crc kubenswrapper[4702]: I1203 12:50:05.123262 4702 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2a8ef37d-7f53-449e-b954-d1624312e255-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:06 crc kubenswrapper[4702]: I1203 12:50:06.143012 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-klrtq" Dec 03 12:50:06 crc kubenswrapper[4702]: I1203 12:50:06.213513 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-klrtq" Dec 03 12:50:06 crc kubenswrapper[4702]: I1203 12:50:06.406335 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-klrtq"] Dec 03 12:50:06 crc kubenswrapper[4702]: I1203 12:50:06.946746 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a8ef37d-7f53-449e-b954-d1624312e255" path="/var/lib/kubelet/pods/2a8ef37d-7f53-449e-b954-d1624312e255/volumes" Dec 03 12:50:07 crc kubenswrapper[4702]: I1203 12:50:07.959230 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-klrtq" podUID="55eea2d6-2cbd-41bf-9041-18b99c88a795" containerName="registry-server" containerID="cri-o://680cf32965bd4729ad90f4b323ab40db5e20b2a5745e139664ab9619e7010136" gracePeriod=2 Dec 03 12:50:09 crc kubenswrapper[4702]: I1203 12:50:09.005537 4702 generic.go:334] "Generic (PLEG): container finished" podID="55eea2d6-2cbd-41bf-9041-18b99c88a795" containerID="680cf32965bd4729ad90f4b323ab40db5e20b2a5745e139664ab9619e7010136" exitCode=0 Dec 03 12:50:09 crc kubenswrapper[4702]: I1203 12:50:09.005642 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klrtq" event={"ID":"55eea2d6-2cbd-41bf-9041-18b99c88a795","Type":"ContainerDied","Data":"680cf32965bd4729ad90f4b323ab40db5e20b2a5745e139664ab9619e7010136"} Dec 03 12:50:09 crc kubenswrapper[4702]: I1203 12:50:09.124916 4702 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-klrtq" Dec 03 12:50:09 crc kubenswrapper[4702]: I1203 12:50:09.131099 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55eea2d6-2cbd-41bf-9041-18b99c88a795-catalog-content\") pod \"55eea2d6-2cbd-41bf-9041-18b99c88a795\" (UID: \"55eea2d6-2cbd-41bf-9041-18b99c88a795\") " Dec 03 12:50:09 crc kubenswrapper[4702]: I1203 12:50:09.131221 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55eea2d6-2cbd-41bf-9041-18b99c88a795-utilities\") pod \"55eea2d6-2cbd-41bf-9041-18b99c88a795\" (UID: \"55eea2d6-2cbd-41bf-9041-18b99c88a795\") " Dec 03 12:50:09 crc kubenswrapper[4702]: I1203 12:50:09.131268 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qcll\" (UniqueName: \"kubernetes.io/projected/55eea2d6-2cbd-41bf-9041-18b99c88a795-kube-api-access-6qcll\") pod \"55eea2d6-2cbd-41bf-9041-18b99c88a795\" (UID: \"55eea2d6-2cbd-41bf-9041-18b99c88a795\") " Dec 03 12:50:09 crc kubenswrapper[4702]: I1203 12:50:09.132733 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55eea2d6-2cbd-41bf-9041-18b99c88a795-utilities" (OuterVolumeSpecName: "utilities") pod "55eea2d6-2cbd-41bf-9041-18b99c88a795" (UID: "55eea2d6-2cbd-41bf-9041-18b99c88a795"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:50:09 crc kubenswrapper[4702]: I1203 12:50:09.139198 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55eea2d6-2cbd-41bf-9041-18b99c88a795-kube-api-access-6qcll" (OuterVolumeSpecName: "kube-api-access-6qcll") pod "55eea2d6-2cbd-41bf-9041-18b99c88a795" (UID: "55eea2d6-2cbd-41bf-9041-18b99c88a795"). InnerVolumeSpecName "kube-api-access-6qcll". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:50:09 crc kubenswrapper[4702]: I1203 12:50:09.233053 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55eea2d6-2cbd-41bf-9041-18b99c88a795-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55eea2d6-2cbd-41bf-9041-18b99c88a795" (UID: "55eea2d6-2cbd-41bf-9041-18b99c88a795"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:50:09 crc kubenswrapper[4702]: I1203 12:50:09.273013 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55eea2d6-2cbd-41bf-9041-18b99c88a795-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:09 crc kubenswrapper[4702]: I1203 12:50:09.273060 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55eea2d6-2cbd-41bf-9041-18b99c88a795-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:09 crc kubenswrapper[4702]: I1203 12:50:09.273077 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qcll\" (UniqueName: \"kubernetes.io/projected/55eea2d6-2cbd-41bf-9041-18b99c88a795-kube-api-access-6qcll\") on node \"crc\" DevicePath \"\"" Dec 03 12:50:10 crc kubenswrapper[4702]: I1203 12:50:10.023170 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klrtq" event={"ID":"55eea2d6-2cbd-41bf-9041-18b99c88a795","Type":"ContainerDied","Data":"cdf1ab54c375dabf50125e8d01474e41c7a29282549e38f0581eda2daba01de2"} Dec 03 12:50:10 crc kubenswrapper[4702]: I1203 12:50:10.023467 4702 scope.go:117] "RemoveContainer" containerID="680cf32965bd4729ad90f4b323ab40db5e20b2a5745e139664ab9619e7010136" Dec 03 12:50:10 crc kubenswrapper[4702]: I1203 12:50:10.023409 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-klrtq" Dec 03 12:50:10 crc kubenswrapper[4702]: I1203 12:50:10.058509 4702 scope.go:117] "RemoveContainer" containerID="aaecd1e0adf29f586d7d0c6607fd63b20ad603b234f7db9fb68d5ae12ac16770" Dec 03 12:50:10 crc kubenswrapper[4702]: I1203 12:50:10.077071 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-klrtq"] Dec 03 12:50:10 crc kubenswrapper[4702]: I1203 12:50:10.090443 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-klrtq"] Dec 03 12:50:10 crc kubenswrapper[4702]: I1203 12:50:10.101898 4702 scope.go:117] "RemoveContainer" containerID="e4bb824e515fff30a72c61506fb569a29c146efcf9592e8502ec863c2a2ea29b" Dec 03 12:50:10 crc kubenswrapper[4702]: I1203 12:50:10.943681 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55eea2d6-2cbd-41bf-9041-18b99c88a795" path="/var/lib/kubelet/pods/55eea2d6-2cbd-41bf-9041-18b99c88a795/volumes" Dec 03 12:50:40 crc kubenswrapper[4702]: I1203 12:50:40.418071 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dpm7h"] Dec 03 12:50:40 crc kubenswrapper[4702]: E1203 12:50:40.419260 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8ef37d-7f53-449e-b954-d1624312e255" containerName="copy" Dec 03 12:50:40 crc kubenswrapper[4702]: I1203 12:50:40.419275 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8ef37d-7f53-449e-b954-d1624312e255" containerName="copy" Dec 03 12:50:40 crc kubenswrapper[4702]: E1203 12:50:40.419311 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55eea2d6-2cbd-41bf-9041-18b99c88a795" containerName="extract-utilities" Dec 03 12:50:40 crc kubenswrapper[4702]: I1203 12:50:40.419318 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="55eea2d6-2cbd-41bf-9041-18b99c88a795" containerName="extract-utilities" Dec 03 12:50:40 crc kubenswrapper[4702]: E1203 12:50:40.419356 4702 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="55eea2d6-2cbd-41bf-9041-18b99c88a795" containerName="extract-content" Dec 03 12:50:40 crc kubenswrapper[4702]: I1203 12:50:40.419363 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="55eea2d6-2cbd-41bf-9041-18b99c88a795" containerName="extract-content" Dec 03 12:50:40 crc kubenswrapper[4702]: E1203 12:50:40.419385 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8ef37d-7f53-449e-b954-d1624312e255" containerName="gather" Dec 03 12:50:40 crc kubenswrapper[4702]: I1203 12:50:40.419391 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8ef37d-7f53-449e-b954-d1624312e255" containerName="gather" Dec 03 12:50:40 crc kubenswrapper[4702]: E1203 12:50:40.419403 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55eea2d6-2cbd-41bf-9041-18b99c88a795" containerName="registry-server" Dec 03 12:50:40 crc kubenswrapper[4702]: I1203 12:50:40.419408 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="55eea2d6-2cbd-41bf-9041-18b99c88a795" containerName="registry-server" Dec 03 12:50:40 crc kubenswrapper[4702]: I1203 12:50:40.419647 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="55eea2d6-2cbd-41bf-9041-18b99c88a795" containerName="registry-server" Dec 03 12:50:40 crc kubenswrapper[4702]: I1203 12:50:40.419670 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a8ef37d-7f53-449e-b954-d1624312e255" containerName="copy" Dec 03 12:50:40 crc kubenswrapper[4702]: I1203 12:50:40.419696 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a8ef37d-7f53-449e-b954-d1624312e255" containerName="gather" Dec 03 12:50:40 crc kubenswrapper[4702]: I1203 12:50:40.421897 4702 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dpm7h" Dec 03 12:50:40 crc kubenswrapper[4702]: I1203 12:50:40.435176 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dpm7h"] Dec 03 12:50:40 crc kubenswrapper[4702]: I1203 12:50:40.442221 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98289ec5-f103-435e-90c2-af492a39fa54-utilities\") pod \"redhat-operators-dpm7h\" (UID: \"98289ec5-f103-435e-90c2-af492a39fa54\") " pod="openshift-marketplace/redhat-operators-dpm7h" Dec 03 12:50:40 crc kubenswrapper[4702]: I1203 12:50:40.442311 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98289ec5-f103-435e-90c2-af492a39fa54-catalog-content\") pod \"redhat-operators-dpm7h\" (UID: \"98289ec5-f103-435e-90c2-af492a39fa54\") " pod="openshift-marketplace/redhat-operators-dpm7h" Dec 03 12:50:40 crc kubenswrapper[4702]: I1203 12:50:40.442551 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxfsf\" (UniqueName: \"kubernetes.io/projected/98289ec5-f103-435e-90c2-af492a39fa54-kube-api-access-qxfsf\") pod \"redhat-operators-dpm7h\" (UID: \"98289ec5-f103-435e-90c2-af492a39fa54\") " pod="openshift-marketplace/redhat-operators-dpm7h" Dec 03 12:50:40 crc kubenswrapper[4702]: I1203 12:50:40.546129 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98289ec5-f103-435e-90c2-af492a39fa54-utilities\") pod \"redhat-operators-dpm7h\" (UID: \"98289ec5-f103-435e-90c2-af492a39fa54\") " pod="openshift-marketplace/redhat-operators-dpm7h" Dec 03 12:50:40 crc kubenswrapper[4702]: I1203 12:50:40.546233 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98289ec5-f103-435e-90c2-af492a39fa54-catalog-content\") pod \"redhat-operators-dpm7h\" (UID: \"98289ec5-f103-435e-90c2-af492a39fa54\") " pod="openshift-marketplace/redhat-operators-dpm7h" Dec 03 12:50:40 crc kubenswrapper[4702]: I1203 12:50:40.546574 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxfsf\" (UniqueName: \"kubernetes.io/projected/98289ec5-f103-435e-90c2-af492a39fa54-kube-api-access-qxfsf\") pod \"redhat-operators-dpm7h\" (UID: \"98289ec5-f103-435e-90c2-af492a39fa54\") " pod="openshift-marketplace/redhat-operators-dpm7h" Dec 03 12:50:40 crc kubenswrapper[4702]: I1203 12:50:40.547291 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98289ec5-f103-435e-90c2-af492a39fa54-utilities\") pod \"redhat-operators-dpm7h\" (UID: \"98289ec5-f103-435e-90c2-af492a39fa54\") " pod="openshift-marketplace/redhat-operators-dpm7h" Dec 03 12:50:40 crc kubenswrapper[4702]: I1203 12:50:40.547531 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98289ec5-f103-435e-90c2-af492a39fa54-catalog-content\") pod \"redhat-operators-dpm7h\" (UID: \"98289ec5-f103-435e-90c2-af492a39fa54\") " pod="openshift-marketplace/redhat-operators-dpm7h" Dec 03 12:50:40 crc kubenswrapper[4702]: I1203 12:50:40.578928 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qxfsf\" (UniqueName: \"kubernetes.io/projected/98289ec5-f103-435e-90c2-af492a39fa54-kube-api-access-qxfsf\") pod \"redhat-operators-dpm7h\" (UID: \"98289ec5-f103-435e-90c2-af492a39fa54\") " pod="openshift-marketplace/redhat-operators-dpm7h" Dec 03 12:50:40 crc kubenswrapper[4702]: I1203 12:50:40.765432 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dpm7h" Dec 03 12:50:41 crc kubenswrapper[4702]: I1203 12:50:41.366615 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dpm7h"] Dec 03 12:50:41 crc kubenswrapper[4702]: I1203 12:50:41.459449 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dpm7h" event={"ID":"98289ec5-f103-435e-90c2-af492a39fa54","Type":"ContainerStarted","Data":"f05b4f6f5de6f0dd1ac9d31a52b21c652d2f441673e5c527e2b686dede3f7c46"} Dec 03 12:50:43 crc kubenswrapper[4702]: I1203 12:50:43.500407 4702 generic.go:334] "Generic (PLEG): container finished" podID="98289ec5-f103-435e-90c2-af492a39fa54" containerID="22e3096eeb4090b3122e56bdbde962c20bcc0341af46e1a98f900b08ca3f14e4" exitCode=0 Dec 03 12:50:43 crc kubenswrapper[4702]: I1203 12:50:43.500539 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dpm7h" event={"ID":"98289ec5-f103-435e-90c2-af492a39fa54","Type":"ContainerDied","Data":"22e3096eeb4090b3122e56bdbde962c20bcc0341af46e1a98f900b08ca3f14e4"} Dec 03 12:50:45 crc kubenswrapper[4702]: I1203 12:50:45.538870 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dpm7h" event={"ID":"98289ec5-f103-435e-90c2-af492a39fa54","Type":"ContainerStarted","Data":"6cfdbb1cc6cf71259879844db4fe74878d5810b9cb6b987400a554af5bed8354"} Dec 03 12:50:50 crc kubenswrapper[4702]: I1203 12:50:50.606062 4702 generic.go:334] "Generic (PLEG): container finished" podID="98289ec5-f103-435e-90c2-af492a39fa54" containerID="6cfdbb1cc6cf71259879844db4fe74878d5810b9cb6b987400a554af5bed8354" exitCode=0 Dec 03 12:50:50 crc kubenswrapper[4702]: I1203 12:50:50.606094 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dpm7h" event={"ID":"98289ec5-f103-435e-90c2-af492a39fa54","Type":"ContainerDied","Data":"6cfdbb1cc6cf71259879844db4fe74878d5810b9cb6b987400a554af5bed8354"} Dec 03 12:50:52 crc kubenswrapper[4702]: I1203 12:50:52.646672 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dpm7h" event={"ID":"98289ec5-f103-435e-90c2-af492a39fa54","Type":"ContainerStarted","Data":"e8595a016f816f4dcde002165a0b9fe7419df536d8af564e0cb49525dfa3a74c"} Dec 03 12:51:00 crc kubenswrapper[4702]: I1203 12:51:00.766867 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dpm7h" Dec 03 12:51:00 crc kubenswrapper[4702]: I1203 12:51:00.767878 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dpm7h" Dec 03 12:51:01 crc kubenswrapper[4702]: I1203 12:51:01.623988 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dpm7h" podStartSLOduration=14.120942295 podStartE2EDuration="21.623962799s" podCreationTimestamp="2025-12-03 12:50:40 +0000 UTC" firstStartedPulling="2025-12-03 12:50:43.502858765 +0000 UTC m=+6427.338787229" lastFinishedPulling="2025-12-03 12:50:51.005879269 +0000 UTC 
m=+6434.841807733" observedRunningTime="2025-12-03 12:50:52.667308759 +0000 UTC m=+6436.503237223" watchObservedRunningTime="2025-12-03 12:51:01.623962799 +0000 UTC m=+6445.459891273" Dec 03 12:51:01 crc kubenswrapper[4702]: I1203 12:51:01.628932 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-82h5b"] Dec 03 12:51:01 crc kubenswrapper[4702]: I1203 12:51:01.631987 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-82h5b" Dec 03 12:51:01 crc kubenswrapper[4702]: I1203 12:51:01.663417 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-82h5b"] Dec 03 12:51:01 crc kubenswrapper[4702]: I1203 12:51:01.703121 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f282285-d744-4f17-bdbe-c58123360b92-utilities\") pod \"redhat-marketplace-82h5b\" (UID: \"6f282285-d744-4f17-bdbe-c58123360b92\") " pod="openshift-marketplace/redhat-marketplace-82h5b" Dec 03 12:51:01 crc kubenswrapper[4702]: I1203 12:51:01.703172 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f282285-d744-4f17-bdbe-c58123360b92-catalog-content\") pod \"redhat-marketplace-82h5b\" (UID: \"6f282285-d744-4f17-bdbe-c58123360b92\") " pod="openshift-marketplace/redhat-marketplace-82h5b" Dec 03 12:51:01 crc kubenswrapper[4702]: I1203 12:51:01.703341 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw24q\" (UniqueName: \"kubernetes.io/projected/6f282285-d744-4f17-bdbe-c58123360b92-kube-api-access-bw24q\") pod \"redhat-marketplace-82h5b\" (UID: \"6f282285-d744-4f17-bdbe-c58123360b92\") " pod="openshift-marketplace/redhat-marketplace-82h5b" Dec 03 12:51:01 crc kubenswrapper[4702]: I1203 12:51:01.820422 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f282285-d744-4f17-bdbe-c58123360b92-utilities\") pod \"redhat-marketplace-82h5b\" (UID: \"6f282285-d744-4f17-bdbe-c58123360b92\") " pod="openshift-marketplace/redhat-marketplace-82h5b" Dec 03 12:51:01 crc kubenswrapper[4702]: I1203 12:51:01.820803 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f282285-d744-4f17-bdbe-c58123360b92-catalog-content\") pod \"redhat-marketplace-82h5b\" (UID: \"6f282285-d744-4f17-bdbe-c58123360b92\") " pod="openshift-marketplace/redhat-marketplace-82h5b" Dec 03 12:51:01 crc kubenswrapper[4702]: I1203 12:51:01.820973 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bw24q\" (UniqueName: \"kubernetes.io/projected/6f282285-d744-4f17-bdbe-c58123360b92-kube-api-access-bw24q\") pod \"redhat-marketplace-82h5b\" (UID: \"6f282285-d744-4f17-bdbe-c58123360b92\") " pod="openshift-marketplace/redhat-marketplace-82h5b" Dec 03 12:51:01 crc kubenswrapper[4702]: I1203 12:51:01.822027 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f282285-d744-4f17-bdbe-c58123360b92-utilities\") pod \"redhat-marketplace-82h5b\" (UID: \"6f282285-d744-4f17-bdbe-c58123360b92\") " pod="openshift-marketplace/redhat-marketplace-82h5b" Dec 03 12:51:01 crc 
kubenswrapper[4702]: I1203 12:51:01.823091 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f282285-d744-4f17-bdbe-c58123360b92-catalog-content\") pod \"redhat-marketplace-82h5b\" (UID: \"6f282285-d744-4f17-bdbe-c58123360b92\") " pod="openshift-marketplace/redhat-marketplace-82h5b" Dec 03 12:51:01 crc kubenswrapper[4702]: I1203 12:51:01.834946 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dpm7h" podUID="98289ec5-f103-435e-90c2-af492a39fa54" containerName="registry-server" probeResult="failure" output=< Dec 03 12:51:01 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s Dec 03 12:51:01 crc kubenswrapper[4702]: > Dec 03 12:51:01 crc kubenswrapper[4702]: I1203 12:51:01.851156 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw24q\" (UniqueName: \"kubernetes.io/projected/6f282285-d744-4f17-bdbe-c58123360b92-kube-api-access-bw24q\") pod \"redhat-marketplace-82h5b\" (UID: \"6f282285-d744-4f17-bdbe-c58123360b92\") " pod="openshift-marketplace/redhat-marketplace-82h5b" Dec 03 12:51:01 crc kubenswrapper[4702]: I1203 12:51:01.969333 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-82h5b" Dec 03 12:51:02 crc kubenswrapper[4702]: I1203 12:51:02.581240 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-82h5b"] Dec 03 12:51:02 crc kubenswrapper[4702]: W1203 12:51:02.586322 4702 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f282285_d744_4f17_bdbe_c58123360b92.slice/crio-faaa28c5b794ef0516c11cf19a5137dadc44c6e882458cdb13411b59284f8267 WatchSource:0}: Error finding container faaa28c5b794ef0516c11cf19a5137dadc44c6e882458cdb13411b59284f8267: Status 404 returned error can't find the container with id faaa28c5b794ef0516c11cf19a5137dadc44c6e882458cdb13411b59284f8267 Dec 03 12:51:03 crc kubenswrapper[4702]: I1203 12:51:03.395711 4702 generic.go:334] "Generic (PLEG): container finished" podID="6f282285-d744-4f17-bdbe-c58123360b92" containerID="bff4ae0571b5e09f99041079ea8199efc0458f7ed16852fd23749ac1a4578715" exitCode=0 Dec 03 12:51:03 crc kubenswrapper[4702]: I1203 12:51:03.396357 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82h5b" event={"ID":"6f282285-d744-4f17-bdbe-c58123360b92","Type":"ContainerDied","Data":"bff4ae0571b5e09f99041079ea8199efc0458f7ed16852fd23749ac1a4578715"} Dec 03 12:51:03 crc kubenswrapper[4702]: I1203 12:51:03.396404 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82h5b" event={"ID":"6f282285-d744-4f17-bdbe-c58123360b92","Type":"ContainerStarted","Data":"faaa28c5b794ef0516c11cf19a5137dadc44c6e882458cdb13411b59284f8267"} Dec 03 12:51:04 crc kubenswrapper[4702]: I1203 12:51:04.424977 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82h5b" event={"ID":"6f282285-d744-4f17-bdbe-c58123360b92","Type":"ContainerStarted","Data":"972aae0c3e6b398ff30c450e5bf7a1e69707a4f34c52a1f20911a8db2eb08ead"} Dec 03 12:51:05 crc kubenswrapper[4702]: I1203 12:51:05.442105 4702 generic.go:334] "Generic (PLEG): container finished" podID="6f282285-d744-4f17-bdbe-c58123360b92" containerID="972aae0c3e6b398ff30c450e5bf7a1e69707a4f34c52a1f20911a8db2eb08ead" 
Dec 03 12:51:05 crc kubenswrapper[4702]: I1203 12:51:05.442169 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82h5b" event={"ID":"6f282285-d744-4f17-bdbe-c58123360b92","Type":"ContainerDied","Data":"972aae0c3e6b398ff30c450e5bf7a1e69707a4f34c52a1f20911a8db2eb08ead"}
Dec 03 12:51:06 crc kubenswrapper[4702]: I1203 12:51:06.724988 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82h5b" event={"ID":"6f282285-d744-4f17-bdbe-c58123360b92","Type":"ContainerStarted","Data":"08e199b90a2b8332b622c18ba3d393e3cc614d01e225f022f3b90464a036eaae"}
Dec 03 12:51:06 crc kubenswrapper[4702]: I1203 12:51:06.764349 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-82h5b" podStartSLOduration=3.364193798 podStartE2EDuration="5.764320026s" podCreationTimestamp="2025-12-03 12:51:01 +0000 UTC" firstStartedPulling="2025-12-03 12:51:03.404059368 +0000 UTC m=+6447.239987832" lastFinishedPulling="2025-12-03 12:51:05.804185596 +0000 UTC m=+6449.640114060" observedRunningTime="2025-12-03 12:51:06.760128816 +0000 UTC m=+6450.596057290" watchObservedRunningTime="2025-12-03 12:51:06.764320026 +0000 UTC m=+6450.600248490"
Dec 03 12:51:10 crc kubenswrapper[4702]: I1203 12:51:10.841573 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dpm7h"
Dec 03 12:51:10 crc kubenswrapper[4702]: I1203 12:51:10.911946 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dpm7h"
Dec 03 12:51:11 crc kubenswrapper[4702]: I1203 12:51:11.626688 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dpm7h"]
Dec 03 12:51:11 crc kubenswrapper[4702]: I1203 12:51:11.969826 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-82h5b"
Dec 03 12:51:11 crc kubenswrapper[4702]: I1203 12:51:11.969885 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-82h5b"
Dec 03 12:51:12 crc kubenswrapper[4702]: I1203 12:51:12.032508 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-82h5b"
Dec 03 12:51:12 crc kubenswrapper[4702]: I1203 12:51:12.816104 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dpm7h" podUID="98289ec5-f103-435e-90c2-af492a39fa54" containerName="registry-server" containerID="cri-o://e8595a016f816f4dcde002165a0b9fe7419df536d8af564e0cb49525dfa3a74c" gracePeriod=2
Dec 03 12:51:12 crc kubenswrapper[4702]: I1203 12:51:12.872810 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-82h5b"
Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.643861 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dpm7h"
Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.725182 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxfsf\" (UniqueName: \"kubernetes.io/projected/98289ec5-f103-435e-90c2-af492a39fa54-kube-api-access-qxfsf\") pod \"98289ec5-f103-435e-90c2-af492a39fa54\" (UID: \"98289ec5-f103-435e-90c2-af492a39fa54\") "
Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.725532 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98289ec5-f103-435e-90c2-af492a39fa54-utilities\") pod \"98289ec5-f103-435e-90c2-af492a39fa54\" (UID: \"98289ec5-f103-435e-90c2-af492a39fa54\") "
Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.725726 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98289ec5-f103-435e-90c2-af492a39fa54-catalog-content\") pod \"98289ec5-f103-435e-90c2-af492a39fa54\" (UID: \"98289ec5-f103-435e-90c2-af492a39fa54\") "
Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.728231 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98289ec5-f103-435e-90c2-af492a39fa54-utilities" (OuterVolumeSpecName: "utilities") pod "98289ec5-f103-435e-90c2-af492a39fa54" (UID: "98289ec5-f103-435e-90c2-af492a39fa54"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.733938 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98289ec5-f103-435e-90c2-af492a39fa54-kube-api-access-qxfsf" (OuterVolumeSpecName: "kube-api-access-qxfsf") pod "98289ec5-f103-435e-90c2-af492a39fa54" (UID: "98289ec5-f103-435e-90c2-af492a39fa54"). InnerVolumeSpecName "kube-api-access-qxfsf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.829018 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxfsf\" (UniqueName: \"kubernetes.io/projected/98289ec5-f103-435e-90c2-af492a39fa54-kube-api-access-qxfsf\") on node \"crc\" DevicePath \"\""
Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.829048 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98289ec5-f103-435e-90c2-af492a39fa54-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.831838 4702 generic.go:334] "Generic (PLEG): container finished" podID="98289ec5-f103-435e-90c2-af492a39fa54" containerID="e8595a016f816f4dcde002165a0b9fe7419df536d8af564e0cb49525dfa3a74c" exitCode=0
Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.833911 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dpm7h"
Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.835192 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dpm7h" event={"ID":"98289ec5-f103-435e-90c2-af492a39fa54","Type":"ContainerDied","Data":"e8595a016f816f4dcde002165a0b9fe7419df536d8af564e0cb49525dfa3a74c"}
Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.835545 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dpm7h" event={"ID":"98289ec5-f103-435e-90c2-af492a39fa54","Type":"ContainerDied","Data":"f05b4f6f5de6f0dd1ac9d31a52b21c652d2f441673e5c527e2b686dede3f7c46"}
Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.835920 4702 scope.go:117] "RemoveContainer" containerID="e8595a016f816f4dcde002165a0b9fe7419df536d8af564e0cb49525dfa3a74c"
Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.845687 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98289ec5-f103-435e-90c2-af492a39fa54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98289ec5-f103-435e-90c2-af492a39fa54" (UID: "98289ec5-f103-435e-90c2-af492a39fa54"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.861776 4702 scope.go:117] "RemoveContainer" containerID="6cfdbb1cc6cf71259879844db4fe74878d5810b9cb6b987400a554af5bed8354"
Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.885224 4702 scope.go:117] "RemoveContainer" containerID="22e3096eeb4090b3122e56bdbde962c20bcc0341af46e1a98f900b08ca3f14e4"
Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.931375 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98289ec5-f103-435e-90c2-af492a39fa54-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.948565 4702 scope.go:117] "RemoveContainer" containerID="e8595a016f816f4dcde002165a0b9fe7419df536d8af564e0cb49525dfa3a74c"
Dec 03 12:51:13 crc kubenswrapper[4702]: E1203 12:51:13.954367 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8595a016f816f4dcde002165a0b9fe7419df536d8af564e0cb49525dfa3a74c\": container with ID starting with e8595a016f816f4dcde002165a0b9fe7419df536d8af564e0cb49525dfa3a74c not found: ID does not exist" containerID="e8595a016f816f4dcde002165a0b9fe7419df536d8af564e0cb49525dfa3a74c"
Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.954408 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8595a016f816f4dcde002165a0b9fe7419df536d8af564e0cb49525dfa3a74c"} err="failed to get container status \"e8595a016f816f4dcde002165a0b9fe7419df536d8af564e0cb49525dfa3a74c\": rpc error: code = NotFound desc = could not find container \"e8595a016f816f4dcde002165a0b9fe7419df536d8af564e0cb49525dfa3a74c\": container with ID starting with e8595a016f816f4dcde002165a0b9fe7419df536d8af564e0cb49525dfa3a74c not found: ID does not exist"
Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.954434 4702 scope.go:117] "RemoveContainer" containerID="6cfdbb1cc6cf71259879844db4fe74878d5810b9cb6b987400a554af5bed8354"
Dec 03 12:51:13 crc kubenswrapper[4702]: E1203 12:51:13.955157 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cfdbb1cc6cf71259879844db4fe74878d5810b9cb6b987400a554af5bed8354\": container with ID starting with 6cfdbb1cc6cf71259879844db4fe74878d5810b9cb6b987400a554af5bed8354 not found: ID does not exist" containerID="6cfdbb1cc6cf71259879844db4fe74878d5810b9cb6b987400a554af5bed8354"
find container \"6cfdbb1cc6cf71259879844db4fe74878d5810b9cb6b987400a554af5bed8354\": container with ID starting with 6cfdbb1cc6cf71259879844db4fe74878d5810b9cb6b987400a554af5bed8354 not found: ID does not exist" containerID="6cfdbb1cc6cf71259879844db4fe74878d5810b9cb6b987400a554af5bed8354" Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.955213 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cfdbb1cc6cf71259879844db4fe74878d5810b9cb6b987400a554af5bed8354"} err="failed to get container status \"6cfdbb1cc6cf71259879844db4fe74878d5810b9cb6b987400a554af5bed8354\": rpc error: code = NotFound desc = could not find container \"6cfdbb1cc6cf71259879844db4fe74878d5810b9cb6b987400a554af5bed8354\": container with ID starting with 6cfdbb1cc6cf71259879844db4fe74878d5810b9cb6b987400a554af5bed8354 not found: ID does not exist" Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.955252 4702 scope.go:117] "RemoveContainer" containerID="22e3096eeb4090b3122e56bdbde962c20bcc0341af46e1a98f900b08ca3f14e4" Dec 03 12:51:13 crc kubenswrapper[4702]: E1203 12:51:13.955629 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22e3096eeb4090b3122e56bdbde962c20bcc0341af46e1a98f900b08ca3f14e4\": container with ID starting with 22e3096eeb4090b3122e56bdbde962c20bcc0341af46e1a98f900b08ca3f14e4 not found: ID does not exist" containerID="22e3096eeb4090b3122e56bdbde962c20bcc0341af46e1a98f900b08ca3f14e4" Dec 03 12:51:13 crc kubenswrapper[4702]: I1203 12:51:13.955661 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22e3096eeb4090b3122e56bdbde962c20bcc0341af46e1a98f900b08ca3f14e4"} err="failed to get container status \"22e3096eeb4090b3122e56bdbde962c20bcc0341af46e1a98f900b08ca3f14e4\": rpc error: code = NotFound desc = could not find container \"22e3096eeb4090b3122e56bdbde962c20bcc0341af46e1a98f900b08ca3f14e4\": container with ID starting with 22e3096eeb4090b3122e56bdbde962c20bcc0341af46e1a98f900b08ca3f14e4 not found: ID does not exist" Dec 03 12:51:14 crc kubenswrapper[4702]: I1203 12:51:14.172242 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dpm7h"] Dec 03 12:51:14 crc kubenswrapper[4702]: I1203 12:51:14.183842 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dpm7h"] Dec 03 12:51:14 crc kubenswrapper[4702]: I1203 12:51:14.217629 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-82h5b"] Dec 03 12:51:14 crc kubenswrapper[4702]: I1203 12:51:14.858507 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-82h5b" podUID="6f282285-d744-4f17-bdbe-c58123360b92" containerName="registry-server" containerID="cri-o://08e199b90a2b8332b622c18ba3d393e3cc614d01e225f022f3b90464a036eaae" gracePeriod=2 Dec 03 12:51:14 crc kubenswrapper[4702]: I1203 12:51:14.948986 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98289ec5-f103-435e-90c2-af492a39fa54" path="/var/lib/kubelet/pods/98289ec5-f103-435e-90c2-af492a39fa54/volumes" Dec 03 12:51:15 crc kubenswrapper[4702]: I1203 12:51:15.423450 4702 util.go:48] "No ready sandbox for pod can be found. 
Dec 03 12:51:15 crc kubenswrapper[4702]: I1203 12:51:15.483119 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f282285-d744-4f17-bdbe-c58123360b92-catalog-content\") pod \"6f282285-d744-4f17-bdbe-c58123360b92\" (UID: \"6f282285-d744-4f17-bdbe-c58123360b92\") "
Dec 03 12:51:15 crc kubenswrapper[4702]: I1203 12:51:15.483192 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f282285-d744-4f17-bdbe-c58123360b92-utilities\") pod \"6f282285-d744-4f17-bdbe-c58123360b92\" (UID: \"6f282285-d744-4f17-bdbe-c58123360b92\") "
Dec 03 12:51:15 crc kubenswrapper[4702]: I1203 12:51:15.483217 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw24q\" (UniqueName: \"kubernetes.io/projected/6f282285-d744-4f17-bdbe-c58123360b92-kube-api-access-bw24q\") pod \"6f282285-d744-4f17-bdbe-c58123360b92\" (UID: \"6f282285-d744-4f17-bdbe-c58123360b92\") "
Dec 03 12:51:15 crc kubenswrapper[4702]: I1203 12:51:15.484650 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f282285-d744-4f17-bdbe-c58123360b92-utilities" (OuterVolumeSpecName: "utilities") pod "6f282285-d744-4f17-bdbe-c58123360b92" (UID: "6f282285-d744-4f17-bdbe-c58123360b92"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:51:15 crc kubenswrapper[4702]: I1203 12:51:15.523516 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f282285-d744-4f17-bdbe-c58123360b92-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f282285-d744-4f17-bdbe-c58123360b92" (UID: "6f282285-d744-4f17-bdbe-c58123360b92"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:51:15 crc kubenswrapper[4702]: I1203 12:51:15.589076 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f282285-d744-4f17-bdbe-c58123360b92-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:15 crc kubenswrapper[4702]: I1203 12:51:15.589113 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f282285-d744-4f17-bdbe-c58123360b92-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:15 crc kubenswrapper[4702]: I1203 12:51:15.906636 4702 generic.go:334] "Generic (PLEG): container finished" podID="6f282285-d744-4f17-bdbe-c58123360b92" containerID="08e199b90a2b8332b622c18ba3d393e3cc614d01e225f022f3b90464a036eaae" exitCode=0 Dec 03 12:51:15 crc kubenswrapper[4702]: I1203 12:51:15.906678 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82h5b" event={"ID":"6f282285-d744-4f17-bdbe-c58123360b92","Type":"ContainerDied","Data":"08e199b90a2b8332b622c18ba3d393e3cc614d01e225f022f3b90464a036eaae"} Dec 03 12:51:15 crc kubenswrapper[4702]: I1203 12:51:15.906707 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-82h5b" event={"ID":"6f282285-d744-4f17-bdbe-c58123360b92","Type":"ContainerDied","Data":"faaa28c5b794ef0516c11cf19a5137dadc44c6e882458cdb13411b59284f8267"} Dec 03 12:51:15 crc kubenswrapper[4702]: I1203 12:51:15.906704 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-82h5b" Dec 03 12:51:15 crc kubenswrapper[4702]: I1203 12:51:15.906724 4702 scope.go:117] "RemoveContainer" containerID="08e199b90a2b8332b622c18ba3d393e3cc614d01e225f022f3b90464a036eaae" Dec 03 12:51:15 crc kubenswrapper[4702]: I1203 12:51:15.942069 4702 scope.go:117] "RemoveContainer" containerID="972aae0c3e6b398ff30c450e5bf7a1e69707a4f34c52a1f20911a8db2eb08ead" Dec 03 12:51:16 crc kubenswrapper[4702]: I1203 12:51:16.135274 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f282285-d744-4f17-bdbe-c58123360b92-kube-api-access-bw24q" (OuterVolumeSpecName: "kube-api-access-bw24q") pod "6f282285-d744-4f17-bdbe-c58123360b92" (UID: "6f282285-d744-4f17-bdbe-c58123360b92"). InnerVolumeSpecName "kube-api-access-bw24q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 12:51:16 crc kubenswrapper[4702]: I1203 12:51:16.157567 4702 scope.go:117] "RemoveContainer" containerID="bff4ae0571b5e09f99041079ea8199efc0458f7ed16852fd23749ac1a4578715" Dec 03 12:51:16 crc kubenswrapper[4702]: I1203 12:51:16.206862 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw24q\" (UniqueName: \"kubernetes.io/projected/6f282285-d744-4f17-bdbe-c58123360b92-kube-api-access-bw24q\") on node \"crc\" DevicePath \"\"" Dec 03 12:51:16 crc kubenswrapper[4702]: I1203 12:51:16.276902 4702 scope.go:117] "RemoveContainer" containerID="08e199b90a2b8332b622c18ba3d393e3cc614d01e225f022f3b90464a036eaae" Dec 03 12:51:16 crc kubenswrapper[4702]: E1203 12:51:16.277308 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08e199b90a2b8332b622c18ba3d393e3cc614d01e225f022f3b90464a036eaae\": container with ID starting with 08e199b90a2b8332b622c18ba3d393e3cc614d01e225f022f3b90464a036eaae not found: ID does not exist" containerID="08e199b90a2b8332b622c18ba3d393e3cc614d01e225f022f3b90464a036eaae" Dec 03 12:51:16 crc kubenswrapper[4702]: I1203 12:51:16.277344 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08e199b90a2b8332b622c18ba3d393e3cc614d01e225f022f3b90464a036eaae"} err="failed to get container status \"08e199b90a2b8332b622c18ba3d393e3cc614d01e225f022f3b90464a036eaae\": rpc error: code = NotFound desc = could not find container \"08e199b90a2b8332b622c18ba3d393e3cc614d01e225f022f3b90464a036eaae\": container with ID starting with 08e199b90a2b8332b622c18ba3d393e3cc614d01e225f022f3b90464a036eaae not found: ID does not exist" Dec 03 12:51:16 crc kubenswrapper[4702]: I1203 12:51:16.277374 4702 scope.go:117] "RemoveContainer" containerID="972aae0c3e6b398ff30c450e5bf7a1e69707a4f34c52a1f20911a8db2eb08ead" Dec 03 12:51:16 crc kubenswrapper[4702]: E1203 12:51:16.277656 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"972aae0c3e6b398ff30c450e5bf7a1e69707a4f34c52a1f20911a8db2eb08ead\": container with ID starting with 972aae0c3e6b398ff30c450e5bf7a1e69707a4f34c52a1f20911a8db2eb08ead not found: ID does not exist" containerID="972aae0c3e6b398ff30c450e5bf7a1e69707a4f34c52a1f20911a8db2eb08ead" Dec 03 12:51:16 crc kubenswrapper[4702]: I1203 12:51:16.277684 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"972aae0c3e6b398ff30c450e5bf7a1e69707a4f34c52a1f20911a8db2eb08ead"} err="failed to get container status \"972aae0c3e6b398ff30c450e5bf7a1e69707a4f34c52a1f20911a8db2eb08ead\": rpc error: code = NotFound desc = could not find container \"972aae0c3e6b398ff30c450e5bf7a1e69707a4f34c52a1f20911a8db2eb08ead\": container with ID starting with 972aae0c3e6b398ff30c450e5bf7a1e69707a4f34c52a1f20911a8db2eb08ead not found: ID does not exist" Dec 03 12:51:16 crc kubenswrapper[4702]: I1203 12:51:16.277700 4702 scope.go:117] "RemoveContainer" containerID="bff4ae0571b5e09f99041079ea8199efc0458f7ed16852fd23749ac1a4578715" Dec 03 12:51:16 crc kubenswrapper[4702]: E1203 12:51:16.278226 4702 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bff4ae0571b5e09f99041079ea8199efc0458f7ed16852fd23749ac1a4578715\": container with ID starting with bff4ae0571b5e09f99041079ea8199efc0458f7ed16852fd23749ac1a4578715 not found: ID does not 
exist" containerID="bff4ae0571b5e09f99041079ea8199efc0458f7ed16852fd23749ac1a4578715" Dec 03 12:51:16 crc kubenswrapper[4702]: I1203 12:51:16.278259 4702 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bff4ae0571b5e09f99041079ea8199efc0458f7ed16852fd23749ac1a4578715"} err="failed to get container status \"bff4ae0571b5e09f99041079ea8199efc0458f7ed16852fd23749ac1a4578715\": rpc error: code = NotFound desc = could not find container \"bff4ae0571b5e09f99041079ea8199efc0458f7ed16852fd23749ac1a4578715\": container with ID starting with bff4ae0571b5e09f99041079ea8199efc0458f7ed16852fd23749ac1a4578715 not found: ID does not exist" Dec 03 12:51:16 crc kubenswrapper[4702]: I1203 12:51:16.344589 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-82h5b"] Dec 03 12:51:16 crc kubenswrapper[4702]: I1203 12:51:16.356740 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-82h5b"] Dec 03 12:51:16 crc kubenswrapper[4702]: I1203 12:51:16.952069 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f282285-d744-4f17-bdbe-c58123360b92" path="/var/lib/kubelet/pods/6f282285-d744-4f17-bdbe-c58123360b92/volumes" Dec 03 12:51:55 crc kubenswrapper[4702]: I1203 12:51:55.908236 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:51:55 crc kubenswrapper[4702]: I1203 12:51:55.910665 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 12:52:19.101409 4702 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d8gkk"] Dec 03 12:52:19 crc kubenswrapper[4702]: E1203 12:52:19.102627 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98289ec5-f103-435e-90c2-af492a39fa54" containerName="extract-utilities" Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 12:52:19.102648 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="98289ec5-f103-435e-90c2-af492a39fa54" containerName="extract-utilities" Dec 03 12:52:19 crc kubenswrapper[4702]: E1203 12:52:19.102684 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98289ec5-f103-435e-90c2-af492a39fa54" containerName="registry-server" Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 12:52:19.102692 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="98289ec5-f103-435e-90c2-af492a39fa54" containerName="registry-server" Dec 03 12:52:19 crc kubenswrapper[4702]: E1203 12:52:19.102706 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f282285-d744-4f17-bdbe-c58123360b92" containerName="registry-server" Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 12:52:19.102714 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f282285-d744-4f17-bdbe-c58123360b92" containerName="registry-server" Dec 03 12:52:19 crc kubenswrapper[4702]: E1203 12:52:19.102724 4702 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="98289ec5-f103-435e-90c2-af492a39fa54" containerName="extract-content" Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 12:52:19.102731 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="98289ec5-f103-435e-90c2-af492a39fa54" containerName="extract-content" Dec 03 12:52:19 crc kubenswrapper[4702]: E1203 12:52:19.102773 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f282285-d744-4f17-bdbe-c58123360b92" containerName="extract-utilities" Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 12:52:19.102783 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f282285-d744-4f17-bdbe-c58123360b92" containerName="extract-utilities" Dec 03 12:52:19 crc kubenswrapper[4702]: E1203 12:52:19.102796 4702 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f282285-d744-4f17-bdbe-c58123360b92" containerName="extract-content" Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 12:52:19.102805 4702 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f282285-d744-4f17-bdbe-c58123360b92" containerName="extract-content" Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 12:52:19.103149 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f282285-d744-4f17-bdbe-c58123360b92" containerName="registry-server" Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 12:52:19.103179 4702 memory_manager.go:354] "RemoveStaleState removing state" podUID="98289ec5-f103-435e-90c2-af492a39fa54" containerName="registry-server" Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 12:52:19.105425 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d8gkk" Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 12:52:19.125691 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d8gkk"] Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 12:52:19.280851 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd326348-0589-45e7-a792-0b4ad81b3e4e-catalog-content\") pod \"certified-operators-d8gkk\" (UID: \"dd326348-0589-45e7-a792-0b4ad81b3e4e\") " pod="openshift-marketplace/certified-operators-d8gkk" Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 12:52:19.281102 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd326348-0589-45e7-a792-0b4ad81b3e4e-utilities\") pod \"certified-operators-d8gkk\" (UID: \"dd326348-0589-45e7-a792-0b4ad81b3e4e\") " pod="openshift-marketplace/certified-operators-d8gkk" Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 12:52:19.281267 4702 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8bpb\" (UniqueName: \"kubernetes.io/projected/dd326348-0589-45e7-a792-0b4ad81b3e4e-kube-api-access-t8bpb\") pod \"certified-operators-d8gkk\" (UID: \"dd326348-0589-45e7-a792-0b4ad81b3e4e\") " pod="openshift-marketplace/certified-operators-d8gkk" Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 12:52:19.383959 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8bpb\" (UniqueName: \"kubernetes.io/projected/dd326348-0589-45e7-a792-0b4ad81b3e4e-kube-api-access-t8bpb\") pod \"certified-operators-d8gkk\" (UID: \"dd326348-0589-45e7-a792-0b4ad81b3e4e\") " pod="openshift-marketplace/certified-operators-d8gkk" Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 
Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 12:52:19.384464 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd326348-0589-45e7-a792-0b4ad81b3e4e-catalog-content\") pod \"certified-operators-d8gkk\" (UID: \"dd326348-0589-45e7-a792-0b4ad81b3e4e\") " pod="openshift-marketplace/certified-operators-d8gkk"
Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 12:52:19.384719 4702 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd326348-0589-45e7-a792-0b4ad81b3e4e-utilities\") pod \"certified-operators-d8gkk\" (UID: \"dd326348-0589-45e7-a792-0b4ad81b3e4e\") " pod="openshift-marketplace/certified-operators-d8gkk"
Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 12:52:19.385119 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd326348-0589-45e7-a792-0b4ad81b3e4e-catalog-content\") pod \"certified-operators-d8gkk\" (UID: \"dd326348-0589-45e7-a792-0b4ad81b3e4e\") " pod="openshift-marketplace/certified-operators-d8gkk"
Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 12:52:19.385222 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd326348-0589-45e7-a792-0b4ad81b3e4e-utilities\") pod \"certified-operators-d8gkk\" (UID: \"dd326348-0589-45e7-a792-0b4ad81b3e4e\") " pod="openshift-marketplace/certified-operators-d8gkk"
Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 12:52:19.406657 4702 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8bpb\" (UniqueName: \"kubernetes.io/projected/dd326348-0589-45e7-a792-0b4ad81b3e4e-kube-api-access-t8bpb\") pod \"certified-operators-d8gkk\" (UID: \"dd326348-0589-45e7-a792-0b4ad81b3e4e\") " pod="openshift-marketplace/certified-operators-d8gkk"
Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 12:52:19.441245 4702 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d8gkk"
Dec 03 12:52:19 crc kubenswrapper[4702]: I1203 12:52:19.977552 4702 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d8gkk"]
Dec 03 12:52:20 crc kubenswrapper[4702]: I1203 12:52:20.985218 4702 generic.go:334] "Generic (PLEG): container finished" podID="dd326348-0589-45e7-a792-0b4ad81b3e4e" containerID="a2de3afff94196accf1b3a44d2dc08a2ad349baa12403cce28ec74443a470ec5" exitCode=0
Dec 03 12:52:20 crc kubenswrapper[4702]: I1203 12:52:20.985317 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8gkk" event={"ID":"dd326348-0589-45e7-a792-0b4ad81b3e4e","Type":"ContainerDied","Data":"a2de3afff94196accf1b3a44d2dc08a2ad349baa12403cce28ec74443a470ec5"}
Dec 03 12:52:20 crc kubenswrapper[4702]: I1203 12:52:20.986738 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8gkk" event={"ID":"dd326348-0589-45e7-a792-0b4ad81b3e4e","Type":"ContainerStarted","Data":"7672b8fbffa4243684bd7f2039bf99d5a721e022e329afbf4896eb2892452ad3"}
Dec 03 12:52:23 crc kubenswrapper[4702]: I1203 12:52:23.043698 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8gkk" event={"ID":"dd326348-0589-45e7-a792-0b4ad81b3e4e","Type":"ContainerStarted","Data":"e93ed65c491c32484debaeb8b1d671a1d95b56c7eda9eeebbd02611f946c5547"}
Dec 03 12:52:25 crc kubenswrapper[4702]: I1203 12:52:25.923704 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 12:52:25 crc kubenswrapper[4702]: I1203 12:52:25.924134 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 12:52:26 crc kubenswrapper[4702]: I1203 12:52:26.093398 4702 generic.go:334] "Generic (PLEG): container finished" podID="dd326348-0589-45e7-a792-0b4ad81b3e4e" containerID="e93ed65c491c32484debaeb8b1d671a1d95b56c7eda9eeebbd02611f946c5547" exitCode=0
Dec 03 12:52:26 crc kubenswrapper[4702]: I1203 12:52:26.093483 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8gkk" event={"ID":"dd326348-0589-45e7-a792-0b4ad81b3e4e","Type":"ContainerDied","Data":"e93ed65c491c32484debaeb8b1d671a1d95b56c7eda9eeebbd02611f946c5547"}
Dec 03 12:52:28 crc kubenswrapper[4702]: I1203 12:52:28.154316 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8gkk" event={"ID":"dd326348-0589-45e7-a792-0b4ad81b3e4e","Type":"ContainerStarted","Data":"9ac5de14036ae46119e67b7362dbab1200fd449cedc7bed9d5c3d0afca3b3d4a"}
Dec 03 12:52:28 crc kubenswrapper[4702]: I1203 12:52:28.191425 4702 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d8gkk" podStartSLOduration=2.7472797350000002 podStartE2EDuration="9.191393418s" podCreationTimestamp="2025-12-03 12:52:19 +0000 UTC" firstStartedPulling="2025-12-03 12:52:20.989640337 +0000 UTC m=+6524.825568841" lastFinishedPulling="2025-12-03 12:52:27.43375402 +0000 UTC m=+6531.269682524" observedRunningTime="2025-12-03 12:52:28.177000247 +0000 UTC m=+6532.012928711" watchObservedRunningTime="2025-12-03 12:52:28.191393418 +0000 UTC m=+6532.027321882"
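The pod_startup_latency_tracker entry directly above decomposes as: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (12:52:28.191393418 - 12:52:19 = 9.191393418s), and podStartSLOduration subtracts the image-pull window (lastFinishedPulling - firstStartedPulling) from that, giving 2.747279735s. A sketch reproducing the arithmetic with the timestamps copied from the log:

package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-12-03 12:52:19 +0000 UTC")
	firstPull := mustParse("2025-12-03 12:52:20.989640337 +0000 UTC")
	lastPull := mustParse("2025-12-03 12:52:27.43375402 +0000 UTC")
	observed := mustParse("2025-12-03 12:52:28.191393418 +0000 UTC")

	e2e := observed.Sub(created)         // podStartE2EDuration: 9.191393418s
	slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration: 2.747279735s
	fmt.Println(e2e, slo)
}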
Dec 03 12:52:29 crc kubenswrapper[4702]: I1203 12:52:29.442935 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d8gkk"
Dec 03 12:52:29 crc kubenswrapper[4702]: I1203 12:52:29.443352 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d8gkk"
Dec 03 12:52:30 crc kubenswrapper[4702]: I1203 12:52:30.502648 4702 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-d8gkk" podUID="dd326348-0589-45e7-a792-0b4ad81b3e4e" containerName="registry-server" probeResult="failure" output=<
Dec 03 12:52:30 crc kubenswrapper[4702]: timeout: failed to connect service ":50051" within 1s
Dec 03 12:52:30 crc kubenswrapper[4702]: >
Dec 03 12:52:39 crc kubenswrapper[4702]: I1203 12:52:39.518679 4702 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d8gkk"
Dec 03 12:52:39 crc kubenswrapper[4702]: I1203 12:52:39.586500 4702 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d8gkk"
Dec 03 12:52:39 crc kubenswrapper[4702]: I1203 12:52:39.772090 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d8gkk"]
Dec 03 12:52:41 crc kubenswrapper[4702]: I1203 12:52:41.383553 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d8gkk" podUID="dd326348-0589-45e7-a792-0b4ad81b3e4e" containerName="registry-server" containerID="cri-o://9ac5de14036ae46119e67b7362dbab1200fd449cedc7bed9d5c3d0afca3b3d4a" gracePeriod=2
Dec 03 12:52:41 crc kubenswrapper[4702]: E1203 12:52:41.568150 4702 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd326348_0589_45e7_a792_0b4ad81b3e4e.slice/crio-9ac5de14036ae46119e67b7362dbab1200fd449cedc7bed9d5c3d0afca3b3d4a.scope\": RecentStats: unable to find data in memory cache]"
Dec 03 12:52:42 crc kubenswrapper[4702]: I1203 12:52:42.401207 4702 generic.go:334] "Generic (PLEG): container finished" podID="dd326348-0589-45e7-a792-0b4ad81b3e4e" containerID="9ac5de14036ae46119e67b7362dbab1200fd449cedc7bed9d5c3d0afca3b3d4a" exitCode=0
Dec 03 12:52:42 crc kubenswrapper[4702]: I1203 12:52:42.401259 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8gkk" event={"ID":"dd326348-0589-45e7-a792-0b4ad81b3e4e","Type":"ContainerDied","Data":"9ac5de14036ae46119e67b7362dbab1200fd449cedc7bed9d5c3d0afca3b3d4a"}
Dec 03 12:52:42 crc kubenswrapper[4702]: I1203 12:52:42.665597 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d8gkk"
Dec 03 12:52:42 crc kubenswrapper[4702]: I1203 12:52:42.763704 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd326348-0589-45e7-a792-0b4ad81b3e4e-catalog-content\") pod \"dd326348-0589-45e7-a792-0b4ad81b3e4e\" (UID: \"dd326348-0589-45e7-a792-0b4ad81b3e4e\") "
Dec 03 12:52:42 crc kubenswrapper[4702]: I1203 12:52:42.764036 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd326348-0589-45e7-a792-0b4ad81b3e4e-utilities\") pod \"dd326348-0589-45e7-a792-0b4ad81b3e4e\" (UID: \"dd326348-0589-45e7-a792-0b4ad81b3e4e\") "
Dec 03 12:52:42 crc kubenswrapper[4702]: I1203 12:52:42.764454 4702 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8bpb\" (UniqueName: \"kubernetes.io/projected/dd326348-0589-45e7-a792-0b4ad81b3e4e-kube-api-access-t8bpb\") pod \"dd326348-0589-45e7-a792-0b4ad81b3e4e\" (UID: \"dd326348-0589-45e7-a792-0b4ad81b3e4e\") "
Dec 03 12:52:42 crc kubenswrapper[4702]: I1203 12:52:42.764720 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd326348-0589-45e7-a792-0b4ad81b3e4e-utilities" (OuterVolumeSpecName: "utilities") pod "dd326348-0589-45e7-a792-0b4ad81b3e4e" (UID: "dd326348-0589-45e7-a792-0b4ad81b3e4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 12:52:42 crc kubenswrapper[4702]: I1203 12:52:42.765451 4702 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd326348-0589-45e7-a792-0b4ad81b3e4e-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 12:52:42 crc kubenswrapper[4702]: I1203 12:52:42.774289 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd326348-0589-45e7-a792-0b4ad81b3e4e-kube-api-access-t8bpb" (OuterVolumeSpecName: "kube-api-access-t8bpb") pod "dd326348-0589-45e7-a792-0b4ad81b3e4e" (UID: "dd326348-0589-45e7-a792-0b4ad81b3e4e"). InnerVolumeSpecName "kube-api-access-t8bpb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 12:52:42 crc kubenswrapper[4702]: I1203 12:52:42.809080 4702 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd326348-0589-45e7-a792-0b4ad81b3e4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd326348-0589-45e7-a792-0b4ad81b3e4e" (UID: "dd326348-0589-45e7-a792-0b4ad81b3e4e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 12:52:42 crc kubenswrapper[4702]: I1203 12:52:42.870056 4702 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8bpb\" (UniqueName: \"kubernetes.io/projected/dd326348-0589-45e7-a792-0b4ad81b3e4e-kube-api-access-t8bpb\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:42 crc kubenswrapper[4702]: I1203 12:52:42.870121 4702 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd326348-0589-45e7-a792-0b4ad81b3e4e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 12:52:43 crc kubenswrapper[4702]: I1203 12:52:43.433469 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8gkk" event={"ID":"dd326348-0589-45e7-a792-0b4ad81b3e4e","Type":"ContainerDied","Data":"7672b8fbffa4243684bd7f2039bf99d5a721e022e329afbf4896eb2892452ad3"} Dec 03 12:52:43 crc kubenswrapper[4702]: I1203 12:52:43.433675 4702 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d8gkk" Dec 03 12:52:43 crc kubenswrapper[4702]: I1203 12:52:43.433871 4702 scope.go:117] "RemoveContainer" containerID="9ac5de14036ae46119e67b7362dbab1200fd449cedc7bed9d5c3d0afca3b3d4a" Dec 03 12:52:43 crc kubenswrapper[4702]: I1203 12:52:43.472298 4702 scope.go:117] "RemoveContainer" containerID="e93ed65c491c32484debaeb8b1d671a1d95b56c7eda9eeebbd02611f946c5547" Dec 03 12:52:43 crc kubenswrapper[4702]: I1203 12:52:43.489661 4702 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d8gkk"] Dec 03 12:52:43 crc kubenswrapper[4702]: I1203 12:52:43.510637 4702 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d8gkk"] Dec 03 12:52:43 crc kubenswrapper[4702]: I1203 12:52:43.519703 4702 scope.go:117] "RemoveContainer" containerID="a2de3afff94196accf1b3a44d2dc08a2ad349baa12403cce28ec74443a470ec5" Dec 03 12:52:44 crc kubenswrapper[4702]: I1203 12:52:44.940775 4702 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd326348-0589-45e7-a792-0b4ad81b3e4e" path="/var/lib/kubelet/pods/dd326348-0589-45e7-a792-0b4ad81b3e4e/volumes" Dec 03 12:52:55 crc kubenswrapper[4702]: I1203 12:52:55.908119 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:52:55 crc kubenswrapper[4702]: I1203 12:52:55.908659 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:52:55 crc kubenswrapper[4702]: I1203 12:52:55.909362 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" Dec 03 12:52:55 crc kubenswrapper[4702]: I1203 12:52:55.910661 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d55ed227ca79a527f938cb9f1414810c0b9a98eb3aa70ee1eb863c1b8617d357"} 
pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 12:52:55 crc kubenswrapper[4702]: I1203 12:52:55.910811 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" containerID="cri-o://d55ed227ca79a527f938cb9f1414810c0b9a98eb3aa70ee1eb863c1b8617d357" gracePeriod=600 Dec 03 12:52:56 crc kubenswrapper[4702]: I1203 12:52:56.640287 4702 generic.go:334] "Generic (PLEG): container finished" podID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerID="d55ed227ca79a527f938cb9f1414810c0b9a98eb3aa70ee1eb863c1b8617d357" exitCode=0 Dec 03 12:52:56 crc kubenswrapper[4702]: I1203 12:52:56.640445 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerDied","Data":"d55ed227ca79a527f938cb9f1414810c0b9a98eb3aa70ee1eb863c1b8617d357"} Dec 03 12:52:56 crc kubenswrapper[4702]: I1203 12:52:56.641054 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerStarted","Data":"13822e35eb6c93228566806d0098101d98250fec047aa20956d779b57bf988dd"} Dec 03 12:52:56 crc kubenswrapper[4702]: I1203 12:52:56.641095 4702 scope.go:117] "RemoveContainer" containerID="25e53f05b3b872bdd47d99566f8218358e392f0bcff0ab1ce8d682da3ac5241a" Dec 03 12:55:25 crc kubenswrapper[4702]: I1203 12:55:25.907815 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:55:25 crc kubenswrapper[4702]: I1203 12:55:25.908445 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:55:55 crc kubenswrapper[4702]: I1203 12:55:55.908287 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:55:55 crc kubenswrapper[4702]: I1203 12:55:55.908903 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 12:56:25 crc kubenswrapper[4702]: I1203 12:56:25.907700 4702 patch_prober.go:28] interesting pod/machine-config-daemon-qf5sd container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 12:56:25 crc kubenswrapper[4702]: 
Dec 03 12:56:25 crc kubenswrapper[4702]: I1203 12:56:25.908272 4702 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 12:56:25 crc kubenswrapper[4702]: I1203 12:56:25.908327 4702 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd"
Dec 03 12:56:25 crc kubenswrapper[4702]: I1203 12:56:25.909480 4702 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"13822e35eb6c93228566806d0098101d98250fec047aa20956d779b57bf988dd"} pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 03 12:56:25 crc kubenswrapper[4702]: I1203 12:56:25.909578 4702 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerName="machine-config-daemon" containerID="cri-o://13822e35eb6c93228566806d0098101d98250fec047aa20956d779b57bf988dd" gracePeriod=600
Dec 03 12:56:26 crc kubenswrapper[4702]: E1203 12:56:26.586792 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
Dec 03 12:56:26 crc kubenswrapper[4702]: I1203 12:56:26.768215 4702 generic.go:334] "Generic (PLEG): container finished" podID="d2e03cb6-21dc-460c-a68e-17aafd79e258" containerID="13822e35eb6c93228566806d0098101d98250fec047aa20956d779b57bf988dd" exitCode=0
Dec 03 12:56:26 crc kubenswrapper[4702]: I1203 12:56:26.768262 4702 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" event={"ID":"d2e03cb6-21dc-460c-a68e-17aafd79e258","Type":"ContainerDied","Data":"13822e35eb6c93228566806d0098101d98250fec047aa20956d779b57bf988dd"}
Dec 03 12:56:26 crc kubenswrapper[4702]: I1203 12:56:26.768298 4702 scope.go:117] "RemoveContainer" containerID="d55ed227ca79a527f938cb9f1414810c0b9a98eb3aa70ee1eb863c1b8617d357"
Dec 03 12:56:26 crc kubenswrapper[4702]: I1203 12:56:26.769240 4702 scope.go:117] "RemoveContainer" containerID="13822e35eb6c93228566806d0098101d98250fec047aa20956d779b57bf988dd"
Dec 03 12:56:26 crc kubenswrapper[4702]: E1203 12:56:26.773222 4702 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qf5sd_openshift-machine-config-operator(d2e03cb6-21dc-460c-a68e-17aafd79e258)\"" pod="openshift-machine-config-operator/machine-config-daemon-qf5sd" podUID="d2e03cb6-21dc-460c-a68e-17aafd79e258"
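The closing CrashLoopBackOff errors show the kubelet refusing to restart machine-config-daemon immediately after repeated liveness failures. A sketch of the restart back-off implied by the "back-off 5m0s" text in the message, assuming the kubelet's documented policy of doubling the delay from 10s per crash up to a 5m cap (an approximation of the behavior, not kubelet code):

package main

import (
	"fmt"
	"time"
)

// backoff returns the wait before the next restart attempt after the
// given number of consecutive crashes: 10s, 20s, 40s, ... capped at 5m0s.
func backoff(restarts int) time.Duration {
	d := 10 * time.Second
	for i := 0; i < restarts; i++ {
		d *= 2
		if d > 5*time.Minute {
			return 5 * time.Minute // the cap quoted in the log message
		}
	}
	return d
}

func main() {
	for n := 0; n <= 6; n++ {
		fmt.Printf("restart %d -> wait %v\n", n, backoff(n))
	}
}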